Hello again. I have a voltmeter on a piece of equipment that I am having a problem with. There are three 1M 1% 3/4-watt resistors in series measuring the DC voltage off a transformer. What I think is happening is that the resistors are changing value when they heat up. The DC input voltage to the resistors is 2230 volts.
My question: when the meter is reading correctly, what would the voltage drop across the resistors be, and what voltage would the meter be seeing in order for the meter to be correct?
Now maybe I should answer the question you asked, instead of the question I read.
You would need to know the resistance of the meter to calculate the voltage divider that's created with the 3 resistors. If the meter resistance was 100k...
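To make the loading effect concrete, here is a quick sketch of that divider using the 100k figure above as a stand-in for the (unknown) meter resistance, with the OP's 2230 V input:

```python
# Sketch of the voltage divider formed by the 3 x 1M multiplier string
# and the meter itself. The 100 k meter resistance is the hypothetical
# value mentioned above, not a measured figure.
R_string = 3 * 1e6      # three 1M multiplier resistors in series
R_meter = 100e3         # assumed meter resistance
V_in = 2230.0           # DC input voltage from the OP

# The meter sees the fraction R_meter / (R_string + R_meter) of the input.
V_meter = V_in * R_meter / (R_string + R_meter)
print(round(V_meter, 1))  # ~71.9 V across the meter
```

The point is that the reading scales directly with the meter resistance, so you can't answer the OP's question without it.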
Yes, I assume that the meter scale is 3KV full scale. With 3meg in series with the meter movement, the current that flows (ignoring the meter resistance) if 3kV were applied would be 3e3/3e6 = 1e-3 A = 1mA. Lots of DC panel meters are 1mA full-scale (1000Ω/V), so this is reasonable.
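That full-scale current check, with the assumed 3 kV full-scale range:

```python
# Full-scale current through the multiplier string, assuming a 3 kV
# full-scale reading and ignoring the (relatively small) meter resistance.
V_full_scale = 3e3        # assumed full-scale voltage
R_multiplier = 3 * 1e6    # three 1M resistors in series
I_full_scale = V_full_scale / R_multiplier
print(I_full_scale)  # 0.001 A = 1 mA, consistent with a 1000 ohm/V movement
```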
Each 1megΩ resistor has 1kV across it. That sounds too high for standard resistors. Look up the data sheets for some standard resistors on DigiKey, and they list the max. allowed voltage.
I would convert the multiplier resistor string into ten 300K 1/4W metal-film resistors housed inside a chunk of Teflon tubing. That drops the voltage across each resistor to <300V.
I just went and reread the OP. Each resistor has 2230/3 = 743V across it. That means that it is dissipating 743^2/1e6 = 0.55W, which means the 3/4W-rated resistor will run quite hot. I would still be inclined to use more resistors in series...
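Comparing the per-resistor stress of the original 3 x 1M string against the suggested 10 x 300K string at the OP's 2230 V input:

```python
# Per-resistor voltage and dissipation for a string of n equal resistors.
V_in = 2230.0  # DC input voltage from the OP

def per_resistor(v_total, n, r_each):
    v = v_total / n       # equal resistors share the voltage equally
    p = v**2 / r_each     # dissipation in each resistor
    return v, p

v3, p3 = per_resistor(V_in, 3, 1e6)       # original 3 x 1M string
v10, p10 = per_resistor(V_in, 10, 300e3)  # suggested 10 x 300K string

print(round(v3), round(p3, 2))    # 743 V, 0.55 W (near the 3/4 W rating)
print(round(v10), round(p10, 2))  # 223 V, 0.17 W (easy for 1/4 W parts)
```

The longer string roughly triples both the voltage and power margin per part.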
I believe it to be internal arcing. The meter will be reading low; if I take a DVM and take a reading at the last resistor where the meter ties in, a small spark will happen and the meter will be working again for a while. As long as I keep the cover off, it appears to be OK. This was an existing meter that was working.