Hi,
Thanks for the fast replies.
I am watching a video which shows how to make 4-wire resistance measurements with just a normal multimeter, and I attempted it to better understand the concept. From the video, the steps are as follows:
1. Use a current-limiting power supply set to a low voltage, connect a multimeter in current mode across its output, and adjust the current limit to the desired value. This turns the supply into a constant-current source.
2. With the setup from step 1, connect the resistor under test (e.g. 1 kOhm) across the power supply output.
3. Measure the voltage across the resistor using another multimeter.
4. Calculate the resistance as R = V / I (see the short sketch after this list).
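To be sure I have step 4 right, here is the calculation as a few lines of Python (the example values are made up, just for illustration):

```python
# Ohm's law: R = V / I (example values are hypothetical)
V = 1.0      # voltage measured across the resistor, in volts
I = 0.001    # constant current forced through it, in amperes (1 mA)
R = V / I    # resistance in ohms
print(R)     # 1000.0, i.e. a 1 kOhm resistor
```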
However, when I tried it I observed some very strange results. With R = 1 kOhm and the power supply current at 0.7 A, as indicated both on the power supply's current display and on the multimeter in the ampere range, I measure 70 mV across the resistor, which gives a resistance of 70 mV / 0.7 A = 0.1 Ohm! When I switched the multimeter to the mA range, it read 7 mA, which gives 70 mV / 7 mA = 10 Ohm. Neither is correct. I am using a Fluke 17B to measure current, a Lodestar 8103 power supply, and a cheap Victor VC921 to measure voltage.
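To rule out an arithmetic mistake on my part, this is exactly what I computed from the readings above:

```python
# My actual readings, converted to base units
V = 0.070              # 70 mV across the 1 kOhm resistor
I_A_range = 0.7        # current reading in the A range, in amperes
I_mA_range = 0.007     # current reading in the mA range (7 mA), in amperes

print(V / I_A_range)   # 0.1 ohm
print(V / I_mA_range)  # 10.0 ohm
# Both are nowhere near the nominal 1000 ohm
```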
I get the same behaviour with a different set of multimeters, and the power supply's current indicator shows 0.7 A in both cases.
Can anyone advise me on what I am doing wrong here? Why do the current readings differ between ranges, and why does the calculated resistance come out wrong?