Hello electronics experts! I need some help designing an accurate, easy-to-calibrate voltage and current sensor. It will be used for measuring the power output of a solar panel. The ADC I'm using measures 0 to 3.3 volts. The max output voltage of the solar panel is 14 volts, and I will use a high-precision resistor network (voltage divider) to condition the signal for the ADC.

Here is the problem: I want to measure the current of the panel without draining too much power from the system by using a big resistor. The two approaches I'm working on are:

1) Placing a very low-resistance instrument (shunt) resistor of 0.005 ohms in series with the panel, measuring the small voltage drop (0 to 3.4 mV), and amplifying the signal with an op amp to obtain the required 0-3.3 volt level.

2) Using a current clamp circuit.

I have tested a non-inverting op amp with a gain of about 880, but I am wondering how to design the circuit for maximum accuracy and stability. Do I use high or low resistance values for the feedback loop (yes, 0.1% is a given)? How do I design a good offset circuit to calibrate the op amp so it produces 0 volts out when 0 volts are input? What caps, if any, should I use?

I'm basically looking for practical tips for amp design, not just theory. If anyone has actual tested circuits, that would be a great help. I'll take any hints or tricks you all have. Thanks for the help!
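For reference, here is a quick sanity check of the numbers above as a short Python sketch. The 10 k divider bottom leg and the Rf/Rg pair in the comment are illustrative assumptions on my part, not part of a tested design:

```python
# Sanity-check the sense/scaling numbers from the post.
# Component values below are illustrative assumptions, not a tested design.

V_PANEL_MAX = 14.0   # max panel voltage (V)
V_ADC_MAX = 3.3      # ADC full-scale input (V)

# Voltage divider: pick an assumed bottom-leg value, solve for the ideal top leg.
R_BOTTOM = 10_000.0  # assumed 0.1% resistor (ohms)
r_top = R_BOTTOM * (V_PANEL_MAX / V_ADC_MAX - 1.0)
print(f"Ideal R_top for a 10k bottom leg: {r_top:.0f} ohms")  # ~32424 ohms

# Shunt sensing: 0.005 ohm shunt with a 3.4 mV full-scale drop.
R_SHUNT = 0.005        # shunt resistance (ohms)
V_SHUNT_MAX = 3.4e-3   # full-scale shunt drop (V)
i_max = V_SHUNT_MAX / R_SHUNT
gain_needed = V_ADC_MAX / V_SHUNT_MAX
print(f"Full-scale current: {i_max:.2f} A")     # 0.68 A
print(f"Gain to reach ADC full scale: {gain_needed:.0f}")  # ~971

# Non-inverting gain = 1 + Rf/Rg; e.g. an assumed Rf = 97k with Rg = 100
# gives a gain of 971, slightly above the ~880 already tested.
```

Note the computed gain (~971) is a bit higher than the ~880 mentioned, so a gain of 880 leaves the 3.4 mV full-scale drop at about 3.0 V rather than 3.3 V at the ADC.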
Frank