I have a board that is missing some components, and I have to repopulate it by comparing against a known-good board. One resistor in a 2512 package is marked R200. Reading it as a plain three-digit code would give 20 ohm, but a multimeter measures 1.1 ohm and an LCR meter measures 0.2 ohm. I would guess it really is 0.2 ohm, because it runs from a MOSFET source to ground.
What do you think: does the R imply some multiplier, or does it simply mean 0.xxx ohm?
The LCR meter can either zero out its leads or make a 4-terminal (Kelvin) measurement, so I'd believe the LCR meter.
0.2 is a standard value.
R is typically used as a decimal point, e.g. 10R2 for 10.2 ohms.
The multimeter also reads its own lead resistance. Short the leads together and measure: you will probably see roughly 1.1 − 0.2 = 0.9 ohm, which accounts for the difference.
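The subtraction above can be sketched in a few lines; the numbers are the illustrative readings from this thread, not a general rule:

```python
# Compensating for lead resistance when measuring very small resistors.
# Values below are the example readings discussed in this thread.
shorted_leads = 0.9   # ohms shown with the probes touching each other
measured = 1.1        # ohms shown across the resistor

# Subtract the shorted-lead offset to estimate the true resistance.
actual = measured - shorted_leads
print(f"estimated resistance: {actual:.1f} ohm")
```

On a meter with a relative/zero function, pressing it with the leads shorted performs this subtraction for you.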
This is European-style device marking, becoming more common on US parts, especially in SMT where there is so little room. The R stands for Resistance and acts as the decimal point for low values.
5R6 = 5.6 ohms
56K2 = 56.2 kilohms
1M2 = 1.2 megohms
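The decoding rule above (letter marks the decimal point, and also sets the multiplier) can be sketched as a small helper; `decode_rkm` is a hypothetical name, not a library function:

```python
def decode_rkm(code: str) -> float:
    """Decode an RKM-style resistor marking into ohms.

    The letter acts as the decimal point and sets the multiplier:
    R = 1, K = 1e3, M = 1e6.  So "R200" -> 0.2, "5R6" -> 5.6,
    "56K2" -> 56200.0, "1M2" -> 1.2e6.
    """
    multipliers = {"R": 1.0, "K": 1e3, "M": 1e6}
    text = code.upper()
    for letter, mult in multipliers.items():
        if letter in text:
            # Split on the letter and reassemble with a real decimal point.
            whole, _, frac = text.partition(letter)
            return float((whole or "0") + "." + (frac or "0")) * mult
    raise ValueError(f"no RKM multiplier letter in {code!r}")

print(decode_rkm("R200"))  # 0.2 ohm, the marking from the question
```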
Are you measuring resistance with the resistor out of the circuit? If not you are not only measuring the resistance of the resistor, but also of all connected circuitry, which will give you an incorrect value.
At 0.200 ohms, the lead resistance could definitely factor in. Make sure you short the leads together and zero the meter before measuring the resistor.