Joe, when you try to make really tiny measurements (less than one ohm) with any kind of meter, you run into problems: meter accuracy, impedance effects, the resistance of the leads themselves, and the resistance of any finger contact you have with them.
For instance, a "good" meter typically has a DC resistance range with an accuracy of 1/2%, and the lowest ohms scale may be 200 ohms (199.9 on the display). That works out to an error of about 1 ohm, and you're trying to measure less than half of that. Ain't gonna happen! And that can be a $200 meter.
Even if you move up to a meter with a high-precision resistance scale, most meters have a "float" of 2-3 counts on the least significant (rightmost) digit. So a reading of 0.5 can actually be anything from 0.2 to 0.8 and still be "correct" for the meter.
To accurately test for a 0.5 ohm resistance, you'd probably need a meter with a 20 ohm scale (i.e. 19.99 ohms), and it would also have to be a calibrated meter, which again means more money.
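To put rough numbers on it, here's a quick back-of-the-envelope sketch in Python. The spec it assumes, +/-(0.5% of range + 2 counts) on a roughly 2000-count (3.5-digit) display, is made up for illustration, not taken from any particular meter's datasheet:

# Worst-case error for an assumed spec of +/-(0.5% of range + 2 counts)
# on a ~2000-count display. Illustrative numbers only.
def worst_case_error(range_ohms, counts=2000, pct_of_range=0.005, count_float=2):
    resolution = range_ohms / counts           # ohms per count, e.g. 0.1 on the 200 ohm range
    return pct_of_range * range_ohms + count_float * resolution

for rng in (200.0, 20.0):
    err = worst_case_error(rng)
    print(f"{rng:>5} ohm range: +/-{err:.2f} ohm worst case -> "
          f"a true 0.5 ohm could read {0.5 - err:.2f} to {0.5 + err:.2f}")

With those assumed numbers, a true 0.5 ohm could legitimately read anywhere from about -0.7 to 1.7 ohms on the 200 ohm range, but stays within roughly 0.38 to 0.62 ohms on the 20 ohm range, which is why the lower range (plus calibration) matters so much.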
So while a consumer-grade meter may tell you that the circuit is neither zero ohms nor twenty ohms, you can't really rely on it to tell 0.3 ohms apart from 0.8 ohms. Sometimes it will, but often it just can't. It's very much like a GPS: you need to know that it's never, ever going to be dead on to the final inch.