Hi everyone. Is the meter in the attached photo also a 0-160 V DC voltmeter? Can I connect this RPM meter to my DC PWM motor controller, which has an output range of 0-175 V DC? And what does the 1 mA mean?
I think the meter should draw 1 mA, which would correspond to 160 V full scale, therefore the coil should have 160 kΩ resistance, which you can easily verify.
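For anyone who wants the arithmetic spelled out, here is a minimal sketch of that check; the 160 V and 1 mA figures come from the meter face, and the rest is just Ohm's law:

# Ohm's law check for the meter above (figures taken from the meter face).
full_scale_voltage = 160.0   # volts, as printed on the face
full_scale_current = 1e-3    # amps (1 mA F.S., also printed on the face)

# If the multiplier resistor is built in, the terminals should measure roughly:
expected_resistance = full_scale_voltage / full_scale_current
print(f"Expected terminal resistance: {expected_resistance / 1e3:.0f} kohm")   # ~160 kohm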
I suggest you start testing it with a low voltage and work up, just to be sure that someone, for example, didn't change the internal resistor to alter the range.
I agree. Meters come in two varieties. It might work just fine at 160 volts, BUT it might simply have "160" painted on the face while needing an external resistor to read correctly. Start out with a 1.5 volt battery, like an AA. Either the needle will move the smallest amount, or it will swing beyond full scale.
Read the small print at the bottom right of the meter face: F.S. = full scale, 1 mA DC, 160 V. 175 V is more than you need for full scale. To calibrate the meter against the motor, use a series variable (trimmer) resistor.
What that means is that a 1 mA current will drive the needle to the 100% position, or "1800" on the scale. Unless you know the total resistance of the meter assembly, the maximum voltage that can be applied is unknown.
Please provide a pic of the back of the meter.
You can start by taking a VOM (volt-ohm meter), setting it to the lowest ohms range, and briefly testing your meter across its two connections on the back. Also note how that affects the meter's needle movement. Move up a range on the VOM if need be.
Let us know what the resistance is so we can better suggest ideas for what the meter might be used for.
As gary350 pointed out, it's clearly labelled as 1 mA 160 V, so I would expect it to have an internal resistor.
However, it's so trivial to check that it's not worth taking any risks: simply measure its resistance with a multimeter. About 160 kΩ means it's a 160 V voltmeter; less than 1 kΩ means it needs an external resistor.
It's a milliammeter. 1 mA full scale means it takes 1 mA to put the needle on the 1800 mark. I think the 160 V means that's the amount of isolation for the meter movement between the armature and the magnet. Keep in mind that metering circuits often had very high voltages on them, where you might be reading milliamps at 1 kV or more, which would require a meter that wouldn't leak high voltage to the front panel. The scale is often made of aluminum and has mounting screws behind it. If you wanted to measure plate current on a transmitter tube, you couldn't use any run-of-the-mill meter; it had to be a meter that could operate at high voltages without arcing or leakage.

An interesting thing I noticed while I was a calibrator in the Army: we would often calibrate meters that were used for measuring voltages in excess of 1 kV. Some of those meters had a lot of voltage on the movement and even the needle. You could pass your hand in front of the meter and the electrostatic attraction (even through the glass) would move the needle.
Yes, I agree the resistance check should settle it, but I would go for the careful low-voltage test first as well.
Please note:
I have seen meters that are labeled with a voltage but are really current meters; to get the voltage shown on the face, you must connect the proper resistor in series with the meter. This one is different, however, because it has both RPM and a voltage marking on the front.
Some careful testing is strongly recommended.
Everybody seems to be tripping over their feet here. Most DC meter movements are essentially current meters, and can be turned into a voltage meter with an appropriate series resistor. This resistor may be external to the meter (connected in series to the meter terminal) or it may be internal to the meter. You can check if there is an internal resistor by disassembling the meter and removing the meter face, which is usually held in with two tiny screws.
By selecting an appropriate compensating resistor, the meter may be scaled for any desired full-scale voltage (a rough worked example follows after this post). If the meter already has an internal compensating resistor, a little more surgery is involved in replacing this resistor as needed for the desired scaling.
The ARRL Handbook used to have very thorough instructions on how to determine the necessary resistor (based on the meter movement sensitivity and internal resistance) and how to draw a new meter face for the new range (often, the back of the existing meter faceplate is blank, so you can just flip it over and draw your own scale). I don't know if the Handbook still includes this section, but this looks like a good reference.
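Here is a rough sketch of that series-resistor calculation, just to make the arithmetic concrete. The 1 mA sensitivity is taken from this meter's face; the coil resistance is a hypothetical example value you would measure on your own movement:

# Sketch: sizing the series (multiplier) resistor that turns a bare current
# movement into a voltmeter. Example values only -- measure your own movement.
full_scale_current = 1e-3    # 1 mA movement, as marked on this meter's face
coil_resistance    = 100.0   # ohms, hypothetical resistance of the bare movement
desired_full_scale = 160.0   # volts wanted at full deflection

# The desired full-scale voltage must push exactly 1 mA through meter + resistor.
total_resistance = desired_full_scale / full_scale_current
series_resistor  = total_resistance - coil_resistance
print(f"Series multiplier needed: about {series_resistor / 1e3:.1f} kohm")   # ~159.9 kohm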
If I recall correctly, the reason they went to all this effort was to prevent damaging a sensitive meter movement with too much current from an ohmmeter. I remember the ARRL (American Radio Relay League = ham radio Bible) article from the dark ages before DVMs were the norm - perhaps the current used by DVMs to measure resistance is so low it's not a concern any more.
It's not, and for that matter it wouldn't be a concern with pretty well any remotely 'modern' analogue meter either - at least not for testing a 1mA meter.
Having just written that, I thought, why not try it? A year or so ago I delivered a washing machine to a lady, and she gave me her late husband's old multimeter. It was an old Altai HC-213, a 1970s 2,000 ohms-per-volt meter, of a very small size and a small price back then - and it's like brand new, presumably hardly ever (or never) used.
Anyway, I know exactly where it is - so I set it on the ohms range (there's only one) and put a digital meter on mA across the probes. It read 280 µA, so no threat to a 1 mA meter. A more typical 20,000 ohms-per-volt analogue meter would presumably pass even less current?
Fairly obviously, the ARRL book's concern was testing a more sensitive meter with a less sensitive one: putting 280 µA from my little Altai through a 50 µA movement would obviously not be good for it. But as this thread is about a 1 mA meter, it should be fine with pretty well any meter.
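As a side note on those ohms-per-volt figures: the DC sensitivity rating is simply the reciprocal of the movement's full-scale current, though the current on the ohms range also depends on the internal battery and range resistors, so treat this as a rough sketch rather than a prediction of what any particular meter will read:

# Sketch: what an ohms-per-volt rating implies about the movement's sensitivity.
for sensitivity_ohms_per_volt in (2_000, 20_000):
    full_scale_current = 1.0 / sensitivity_ohms_per_volt   # amps
    print(f"{sensitivity_ohms_per_volt:>6} ohm/V -> {full_scale_current * 1e6:.0f} uA full-scale movement")
# 2,000 ohm/V  -> 500 uA movement (like the little Altai mentioned above)
# 20,000 ohm/V ->  50 uA movement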
My comments were in part because nobody had really explained very clearly that analog meters measure current, that they are made to measure voltage with the addition of a series resistor, and that the meter can be scaled to read any desired voltage by adjusting the value of that series resistor.
To complete the discussion, the meter can also be scaled to read any greater current value by adding a shunt resistor across the terminals (assuming it doesn't have an internal resistor making it a voltmeter).
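And a similarly hedged sketch of the shunt arithmetic; the 1 mA sensitivity is from this meter's face, while the coil resistance and the target range are made-up example values:

# Sketch: sizing a shunt to extend a bare 1 mA movement to a larger current range.
# Example values only -- the coil resistance and target range are assumptions.
full_scale_current = 1e-3     # 1 mA movement, as marked on the face
coil_resistance    = 100.0    # ohms, hypothetical resistance of the bare movement
desired_range      = 100e-3   # amps, say we want 100 mA full scale

# At full deflection the movement drops I_fs * R_m; the shunt carries the rest
# of the current while seeing that same voltage across it.
shunt_current  = desired_range - full_scale_current
shunt_resistor = (full_scale_current * coil_resistance) / shunt_current
print(f"Shunt needed: about {shunt_resistor:.2f} ohm")   # ~1.01 ohm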