Trying to get a measurement from a 20kV negative ion generator that is adjustable from 15kV to 20kV. My digital meter is 600V max and my analog meter is 10kV max. I want to set the generator to 17kV-18kV.
I was thinking the only solution was a voltage divider. With that said, I need some advice on resistor type: 1/2-watt carbon or film? Ceramic power resistors? (I would imagine 1/4-watt parts are out.) The goal is to get the voltage down to a measurable range.
Or should I just not make the attempt? If this is SAFELY possible, I was thinking of building a divider bank similar to a selectable resistor bank.
The only specs I have for the unit are:
Input voltage: DC 12V
Output voltage: 15000V-20000V adjustable
Rated power: 5W
Hi there,
Using a voltage divider works in theory because it scales the voltage at the test point down by a constant factor. If the divider ratio turns out to be, say, 10, then when you measure 100 volts with the meter you actually have 1000 volts at the test point. That much is simple to understand.
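Just as a quick sanity check, here is that ideal (unloaded) divider arithmetic in a few lines of Python; the 9k/1k values are only placeholders:

# Ideal (unloaded) voltage divider: Vout = Vin * R2 / (R1 + R2)
R1 = 9000.0   # top resistor (ohms), placeholder value
R2 = 1000.0   # bottom resistor (ohms), placeholder value

factor = (R1 + R2) / R2      # divider scaling factor, here 10
Vmeter = 100.0               # what the meter reads (volts)
Vsource = Vmeter * factor    # actual voltage at the test point
print(factor, Vsource)       # 10.0, 1000.0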
What is not so straightforward is that the impedance (or in this case, resistance) of the meter also plays a part in the calculation of the divider resistor values. For a straight 10-to-1 ratio with a pure voltage divider, we might use resistors of 9k and 1k, but the meter's own resistance changes that. If the meter resistance happened to be 1k (not likely, but useful for illustration) then our measurement would be way off.
With the raw voltage divider, a 100-volt reading would tell us we had 1000 volts at the test point. But once the meter's 1k loads the bottom leg (1k in parallel with 1k gives 500 ohms), we would only read about 52.6 volts, and since we would still be multiplying by 10, it would look like we only had about 526 volts at the test point. That is a large error.
So how do we correct that? If we had a second meter we could use to calibrate the divider plus the original meter, we would find we should use a factor of 19 instead of 10. Multiplying 52.6 by 19 gives us close to 1000 volts, which correctly tells us we have about 1000 volts at the test point.
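Here is that same worked example with the meter loading included, so you can see where the 52.6 volts and the factor of 19 come from:

# Same divider, but now the meter's own resistance loads the bottom leg.
R1 = 9000.0       # top resistor (ohms)
R2 = 1000.0       # bottom resistor (ohms)
Rmeter = 1000.0   # assumed meter resistance (ohms), deliberately low to exaggerate the effect

R2_loaded = (R2 * Rmeter) / (R2 + Rmeter)         # R2 in parallel with the meter = 500 ohms
Vsource = 1000.0                                  # actual voltage at the test point
Vmeter = Vsource * R2_loaded / (R1 + R2_loaded)   # about 52.6 V shown on the meter

true_factor = Vsource / Vmeter                    # about 19, not 10
print(Vmeter, true_factor)                        # 52.63..., 19.0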
But since the meter is 1k, why bother with that second 1k resistor for the voltage divider? In fact, if we use a single 9k resistor in series with the meter, we already have a voltage divider, with one of the resistors (the 1k) being the meter itself. Now when we measure that 1000 volts we see 100 volts on the meter, and the factor of 10 applies just like with the raw voltage divider.
That's the traditional way of increasing the range of a voltmeter. You add a series resistor (a multiplier) to form a voltage divider with the meter. You do have to know the meter's resistance, though.
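In code form, that multiplier-resistor arithmetic looks like this; the meter resistance and the two ranges are placeholder numbers matching the example above, not your actual meters:

# Multiplier (series) resistor to extend a voltmeter's range:
# Rseries = Rmeter * (desired_range / meter_range - 1)
Rmeter = 1000.0          # meter's own resistance (ohms), placeholder
meter_range = 100.0      # voltage the meter reads full scale (volts)
desired_range = 1000.0   # voltage you want full scale to represent (volts)

Rseries = Rmeter * (desired_range / meter_range - 1.0)
print(Rseries)           # 9000.0 ohms, matching the example above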
This approach is best when you cannot load the voltage source much. If you want to draw the minimum current from the source, that's the way to do it, unless you use an amplifier plus voltage divider, which is even better.
Using an amplifier and a resistor divider, you can make the resistance values much higher and therefore load the voltage source even less. Sometimes you do need to load the source to get a loaded-voltage measurement, and in that case you load it with a separate resistor.
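To put rough numbers on that for your generator (the 100 megohm divider resistance here is only an assumed figure for illustration):

# Rough check of how much a divider loads the ion generator.
Vout = 20000.0        # generator output (volts)
Prated = 5.0          # generator rated power (watts)
Imax = Prated / Vout  # roughly 0.25 mA available at full voltage

Rdivider_total = 100e6                 # assumed total divider resistance (ohms)
Idivider = Vout / Rdivider_total       # 0.2 mA drawn by the divider alone
Pdivider = Vout**2 / Rdivider_total    # 4 W dissipated in the divider string

print(Imax * 1e3, Idivider * 1e3, Pdivider)   # 0.25 mA, 0.2 mA, 4.0 W

So even a 100 megohm string would eat most of that 0.25 mA and dissipate several watts, which is why high-voltage divider strings are usually up in the hundreds of megohms or more.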
The larger resistance values are usually built up by connecting a bunch of smaller-value resistors in series. That's partly because each individual resistor can only take so much voltage across its two leads.
You also have to be careful about how you mount these resistors. If you lay the string out in a zig-zag pattern, make sure no node in the series string comes too close to another node, or it could arc over between those two points.
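To give a rough idea of how many resistors such a string might need for your 20kV case: the 350 volt per-resistor rating and the 500 megohm total below are only assumed values, so check the datasheets for the parts you actually buy.

# How many resistors a series HV string needs, assuming a per-resistor
# voltage rating (350 V is only an assumption; check your actual parts).
import math

Vmax = 20000.0            # highest voltage the string must stand off
V_per_resistor = 350.0    # assumed maximum working voltage of one resistor
n_by_voltage = math.ceil(Vmax / V_per_resistor)    # 58 resistors minimum

R_total = 500e6                     # assumed total string resistance (ohms)
P_total = Vmax**2 / R_total         # 0.8 W dissipated in the whole string
P_each = P_total / n_by_voltage     # about 14 mW per resistor

print(n_by_voltage, P_total, P_each)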