Suppose we have a 10-bit unipolar ADC with a 2V reference. An analogue input voltage of 2V should give an output of 1023. What happens if the input is, say, 2.5V?
From what I've read, most ADCs plateau out and would give an output of 1023 for all input voltages in the range 2V-2.5V. Is that true of all ADCs? Or do some 'wrap-around' and give spurious lower outputs, e.g. 255, for an input in this range?
If the ADC is a successive-approximation (SAR) type, which most ADCs built into MCUs are, then it will output 1023 for every input at or above the reference voltage. Each bit trial compares the input against an internal DAC level derived from Vref, so an over-range input wins every comparison and the result saturates at full scale.
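A quick way to see the difference is to simulate both behaviours. The sketch below is purely illustrative (the function names and the wrap-around variant are my own, not any real device's behaviour): it models an ideal 10-bit converter with a 2V reference and contrasts what a saturating ADC returns with what a hypothetical wrap-around one would.

```c
#include <stdio.h>
#include <stdint.h>

#define ADC_BITS 10
#define ADC_MAX  ((1u << ADC_BITS) - 1u)   /* 1023 */
#define VREF     2.0                        /* reference voltage, volts */

/* Saturating conversion: over-range inputs clamp at full scale,
 * which is what a SAR ADC actually does. */
static uint16_t adc_saturate(double vin)
{
    if (vin <= 0.0)  return 0;
    if (vin >= VREF) return ADC_MAX;        /* anything >= Vref reads 1023 */
    return (uint16_t)(vin / VREF * (ADC_MAX + 1u));
}

/* Hypothetical wrap-around: the raw code is reduced modulo 1024.
 * Real SAR converters do NOT do this; shown only for contrast. */
static uint16_t adc_wrap(double vin)
{
    long code = (long)(vin / VREF * (ADC_MAX + 1u));
    return (uint16_t)(code & ADC_MAX);      /* keep only the low 10 bits */
}

int main(void)
{
    const double inputs[] = { 1.0, 2.0, 2.5 };
    for (size_t i = 0; i < sizeof inputs / sizeof inputs[0]; i++) {
        printf("Vin = %.2f V -> saturating: %4u, wrap-around: %4u\n",
               inputs[i], adc_saturate(inputs[i]), adc_wrap(inputs[i]));
    }
    return 0;
}
```

For 2.5V this prints 1023 in the saturating case but 256 in the wrap-around case (2.5/2 x 1024 = 1280, and 1280 mod 1024 = 256), which shows why wrap-around on over-range inputs would be so misleading if a converter behaved that way.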