Hi,
I am not sure if this is the right forum for my question; please let me know if it is not.
I have a sensor that is supposed to be clamped around a live wire, and this produces some form of signal. The sensor has no power supply; it is essentially a circular magnet with a copper coil wrapped around it, and it has two lead wires.
Checking the sensor with my digital multimeter, it reads 49.2 ohms with the meter set to the 200 ohm range.
With the meter set to its 200k range, the voltage (sensor clamped around a live cable) fluctuates between -1.5 and 1.5. The swing changes to -1.8 to 1.8, -1 to 1, or -0.5 to 0.5 depending on the load on the live cable.
The maximum current this sensor supports is 70 A, and the minimum load it can detect is 15 watts / 0.065 A.
I need to convert this signal into a voltage between 0 and 3 volts so that I can make accurate readings of the actual power load on the live cable. In other words, the minimum and maximum output of the sensor should map onto the 0 to 3 volt range.
I need to interface this sensor with an ADC that only accepts up to 3 volts and is 10 bits, i.e. a maximum reading of 1023.
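For what it's worth, here is a minimal sketch of the counts-to-voltage arithmetic I have in mind. It assumes the bipolar +/-1.5 V signal is shifted up by a 1.5 V mid-rail bias so it fits the ADC's 0 to 3 V input range (the bias value is my assumption, not something the sensor provides):

```python
# Convert a raw 10-bit ADC reading back to the sensor's instantaneous
# voltage, assuming the +/-1.5 V AC signal has been level-shifted by a
# 1.5 V mid-rail bias to fit the ADC's 0-3 V input range.

VREF = 3.0        # ADC full-scale voltage (volts)
ADC_MAX = 1023    # 10-bit ADC: readings span 0..1023
BIAS = 1.5        # assumed mid-rail offset added to the AC signal

def counts_to_volts(counts):
    """ADC counts (0..1023) -> voltage at the ADC pin (0..3 V)."""
    return counts * VREF / ADC_MAX

def counts_to_sensor_volts(counts):
    """ADC counts -> original bipolar sensor voltage (-1.5..+1.5 V)."""
    return counts_to_volts(counts) - BIAS

# A mid-scale reading (~512) corresponds to roughly 0 V from the sensor:
print(round(counts_to_sensor_volts(512), 3))
```

With this mapping, readings near 512 mean the sensor is at its zero crossing, while 0 and 1023 correspond to the negative and positive peaks.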
I have worked out the table below:
Bit        9      8     7     6     5      4       3       2       1      0
Watts    15400  7700  3850  1925  962.50  481.25  240.63  120.31  60.16  30.08
mV        3000  1500   750   375  187.50   93.75   46.88   23.44  11.72   5.86
Digital    512   256   128    64   32      16       8       4       2      1
Can anyone advise on what I need to do in order to make sense of the readings? Currently, when I read my ADC (which is sampling the raw sensor signal), I get values fluctuating between 0 and 1023. What I really need is to filter or condition the signal so that it is a true representation of the load on the live cable.
Many thanks in advance
Regards
Charbel