But that was my question, why does it draw internally too much current?
and what if i defined it as tristate?
When you set a port pin as an input it is tristated (not driving the line).
I did not find documentation supporting the statement that input pins in the undefined voltage region draw more current. Microchip makes many different processors, and it could possibly vary from one to another. I looked at the 16F and 18F only. Without specifying a particular microcontroller we can only speak about what we generally do.
Suppose the claim is not true and you connect a voltage source that varies from 0 to +5V to the input.
Microchip says this about input voltage levels:
The Input Low Voltage (VIL) is the maximum voltage level that will be read as a logic ’0’.
The Input High Voltage (VIH) is the minimum voltage level that will be read as a logic ’1’.
I could not find info on what the input will read if the voltage is in the undefined region between VIL and VIH. But the input is digital: when you read the bit it must return a 1 or a 0.
A 0 may be read anywhere between 0V and VIH.
A 1 may be read anywhere between VIL and +5V.
The problem is that over the majority of the band between 0 and +5V we do not know whether it will read 0 or 1 (possibly because operating there is a no-no). Even if there is no harm, the value read on the input is meaningless, or nearly so.
Digital inputs are expected to change rapidly between high and low states. By design, the inputs should see voltages in the undefined region only during those rapid transitions.
By placing an analog signal on a digital pin you are asking the input to operate outside its design parameters. Perhaps it does no harm, or maybe it is like driving a car as fast as it will go for hours on end.
As I said in an earlier post, I am a digital guy, a software engineer by training. The EEs here know better than I do.
3v0