I have a temperature gauge that reads a little low and would like to make it more accurate. I believe a voltage-to-current converter circuit would work, but I need one that increases its output current as the input voltage decreases. The sender is a negative-temperature-coefficient thermistor whose resistance drops as temperature rises, allowing more current to flow through the gauge.

My plan is to connect the sender in series with a resistor matching the gauge resistance (about 79 ohms) to 12 V. The voltage across the sender would then be the input to the converter, which would have an adjustable gain to compensate for the inaccuracy of the sender.

The sender resistance is nominally 240 ohms down to 30 ohms; at the gauge midscale of 150 degrees it is about 173 ohms. The current through the gauge is 21 mA at 100 degrees, 60 mA at 150 degrees, and 110 mA at 240 degrees (the gauge currents were measured on the bench with a variable pot). The voltage at the sender ranges from approximately 9.5 V at 100 degrees to 7 V at 150 degrees and 3.5 V at 240 degrees. (A quick sanity check of these numbers is at the end of this post.)

All the converter circuits I have been able to find increase output current as input voltage increases, so I am asking whether the op-amp circuits typically used can be reconfigured to increase the output current as the input voltage decreases, or whether I would need to place an inverting voltage amplifier ahead of the converter. The attachment is the general circuit I have in mind, but it needs to be modified as described above. Your advice is appreciated.
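For reference, here is the rough calculation I did of what the proposed 79-ohm divider would give as a converter input, and what slope the converter would need. It is only a Python sketch, and it assumes the sender sits at roughly 240 ohms at 100 degrees, 173 ohms at 150 degrees, and 30 ohms at 240 degrees; the real sender curve may differ.

```python
# Rough check of the proposed divider and of the gain the converter would need.
# Assumed sender resistances (240 / 173 / 30 ohms at 100 / 150 / 240 degrees)
# are my reading of the nominal figures above -- adjust to the real sender curve.
import numpy as np

V_SUPPLY = 12.0   # supply voltage, volts
R_REF = 79.0      # proposed series resistor, ohms (matches the gauge)

temps_deg = [100, 150, 240]
r_sender = [240.0, 173.0, 30.0]      # ohms (NTC: resistance falls as temp rises)
i_gauge_ma = [21.0, 60.0, 110.0]     # gauge current needed at each temp, mA

# Voltage across the sender with the proposed 79-ohm divider from 12 V
v_in = [V_SUPPLY * r / (r + R_REF) for r in r_sender]

for t, v, i in zip(temps_deg, v_in, i_gauge_ma):
    print(f"{t:3d} deg: divider input {v:5.2f} V -> gauge needs {i:5.1f} mA")

# Straight-line fit of required gauge current vs. divider input voltage.
# The slope comes out negative, i.e. the converter has to deliver more
# current as its input voltage falls.
slope, intercept = np.polyfit(v_in, i_gauge_ma, 1)
print(f"needed transfer: I(mA) = {slope:.1f} * Vin + {intercept:.1f}   (slope < 0)")
```

The fitted slope is negative, which is exactly the inverted transfer I am asking how to get from the usual op-amp voltage-to-current topologies.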