Hi
I am using a dsPIC33F micro to measure the energy drawn from a battery in a solar energy system. The output of an MPPT controller charges the battery during the day, and at night the battery feeds a sine wave inverter that drives AC loads. The battery current is measured with a bidirectional current sensor (LEM CAS 6, www.lem.com/docs/products/casr%20series.pdf), which gives a DC output of 2.5V at zero current. This voltage is fed to an ADC channel on the dsPIC33 to measure the current.
The sensor output swings from 2.5V down to 0V for current in one direction and from 2.5V up to 5V in the other, with a sensitivity of 100mV/A. When the inverter is turned OFF, the sensor output is a steady voltage proportional to the current.
The problem is that when the inverter (230V AC, 50Hz output) is ON, the current sensor output is modulated by a 100Hz sine wave. It looks like an AC signal clamped to a DC offset rather than a true AC signal with zero crossings; the voltage does not swing around 2.5V DC. Both the DC offset and the amplitude of the AC portion vary in proportion to the magnitude of the current.
I am unable to work out how to get the real current value from this signal. I am also calculating the energy drawn from the battery by multiplying the battery voltage by the current, so the accuracy of the current measurement directly affects the energy figure.
Any ideas? Does the RMS value of the signal in this case give the actual current?