Hi all!
Attached is a circuit I have designed for measuring temperature with a P1K0.232.6W.A.010 RTD sensor, connected in the 4-wire configuration.
U2, U3 and R3 form a constant current source for the RTD. Theoretically, U2 produces 1.25V, so the constant current is about 1.25/3.16k = ~396uA; in reality, the current I measure is 401uA. The voltage developed across the RTD is first amplified by U5 (theoretical gain of (R_gain/100k) + 1 = 2; in reality R_gain measures 99.5k, so the gain is about 1.995) and then measured with the 12-bit ADC integrated in the EM250 ZigBee SoC. The API that Ember offers with the EM250 gives me the measured value directly in mV.
To convert the ADC result into temperature, I do the following:
1) The measured voltage is divided by 1.995 (the actual gain) to find the real voltage across the RTD.
2) I then calculate the resistance of the RTD by dividing that voltage by the measured current.
3) To find the temperature, I solve the α equation for T: α = (R_T - R_0)/(R_0*T) = 0.00385, which gives T = (R_T - R_0)/(R_0*α), with R_0 = 1000 ohm for this PT1000 sensor.
The problem I face is that the measured temperature is quite different from the real one: the error is around 6 degrees Celsius, and I cannot find the reason for it.
Can anyone please help me with this?