I have constructed a metering circuit (pic attached) using one half of an LM358 as a ×100 DC amplifier. The DC output (pin 1) connects directly to a 3-digit voltmeter module (1/3-inch digits, self-powered type). The voltage to be measured comes from the cathode of a power-pentode audio-output tube.
The cathode connects to chassis ground via a 1-ohm resistor, so, for example, 40 milliamps of tube current produces 40 millivolts across the resistor; the IC amplifies this to 4.00 V (0.04 × 100 = 4.00 V), which is read as 40 milliamps on the display (the third digit and decimal point are hidden from view).
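The scaling chain above can be sketched as follows (a minimal sketch using the 1-ohm shunt and ×100 gain from my description; the helper name is just for illustration):

```python
# Sketch of the intended scaling chain: cathode current -> shunt
# voltage -> amplified op-amp output -> displayed reading.
R_SHUNT = 1.0  # ohms, cathode-to-ground resistor
GAIN = 100.0   # gain of the LM358 stage

def displayed_ma(cathode_ma):
    """Cathode current (mA) -> displayed value on the 3-digit meter."""
    shunt_v = (cathode_ma / 1000.0) * R_SHUNT  # e.g. 40 mA -> 0.040 V
    out_v = shunt_v * GAIN                     # 0.040 V -> 4.00 V
    # The meter reads volts; with the third digit and decimal point
    # masked, 4.00 V appears as "40", i.e. milliamps.
    return out_v * 10.0

print(displayed_ma(40))  # -> 40.0
```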
The circuit works well except that the display is not very linear, reading inaccurately at the low and high ends of the range. The intended range is 30 mA to 99 mA, but in practice the readings are only accurate between about 40 and 70. Outside that, a displayed 30 corresponds to an actual current nearer 20 mA, and a displayed 80 to more like 95 mA.
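Tabulating those two mis-reading points (my own rough figures) shows how large the implied error is at each end:

```python
# The two error points quoted above: displayed mA -> approximate actual mA.
# These are rough observations, not calibrated measurements.
points = {30: 20, 80: 95}

for displayed, actual in points.items():
    err_pct = 100.0 * (displayed - actual) / actual
    print(f"display {displayed} -> actual ~{actual} mA ({err_pct:+.0f}% error)")
```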
To ensure sufficient output swing I increased the IC's Vcc from +12 V to +20 V, which improved the linearity, but only partially. The LM358 is capable of sinking output current, but I'm wondering if the connected self-powered voltmeter is drawing too much current, since it has to light its three 7-segment LED digits from the measured signal alone.
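A rough loading check along those lines might look like this (all figures hypothetical; the real meter draw would come from its datasheet, and the LM358's usable source current is well below its short-circuit limit):

```python
# Rough loading check -- hypothetical numbers, substitute real ones.
meter_ma_guess = 25.0    # assumed supply draw of the LED voltmeter, mA
lm358_source_ma = 30.0   # rough usable LM358 source capability, mA

headroom = lm358_source_ma - meter_ma_guess
note = " -- marginal!" if headroom < 10 else ""
print(f"source headroom: {headroom:.0f} mA{note}")
```

If the headroom really is that small, gain accuracy would suffer long before the output current-limits outright, which could look exactly like the end-of-range errors described above.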
I am going to try wiring in the unused half of the IC as a voltage follower to buffer the output, but I would appreciate any comments on what other issues may be affecting this design.
Note that the meter, being the type that obtains its operating power from the potential being measured, does not illuminate below 3.0 V, but that is not an issue here: the required range is 35 to 99, i.e. 3.5 V to 9.9 V true. (Sorry if this conversion from mV to volts, then read back as mA, is confusing; remember only the first two digits are needed for the display.)
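One headroom check worth doing: the LM358's output typically swings only to about Vcc − 1.5 V (a datasheet-typical figure that worsens under load), so the usable top of range depends on the supply. A quick sketch:

```python
# Approximate top-of-range check for the LM358 output stage.
# The ~1.5 V high-side swing limit is a typical datasheet figure
# and gets worse as output load current increases.
def max_reading_ma(vcc):
    v_out_max = vcc - 1.5   # approximate maximum LM358 output voltage
    return v_out_max * 10.0  # 1 V of output = 10 mA displayed

for vcc in (12.0, 20.0):
    print(f"Vcc={vcc:.0f} V -> top of range ~{max_reading_ma(vcc):.0f} mA displayed")
```

With a 12 V supply the unloaded ceiling is only ~105 mA displayed against a required 99, so any loading from the meter eats straight into the top of the range; that may be why raising Vcc to 20 V helped but didn't fully cure it.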
Many thanks in advance.
Gaz