How is voltage converted/measured from an analogue value to a digital value?

Given a circuit with roughly 8 volts between two terminals, how can you measure this voltage electronically and convert it into a group of bits, i.e. a number, to send on to the processor? I understand the sampling and the filtering, but not the actual measuring and converting part. Can you explain it in simple terms? If you had to send this information to an integrated circuit that processes the voltage and turns it into a signal, how would you do it?
 
Look up Analog to Digital converter (ADC) as that's rather a large subject to discuss in a few words, and there are several methods used for that conversion (successive-approximation and sigma-delta are two of the most common).
Get back with us after you have read a bit about that, if you have further questions.
There are many ICs that can do that from numerous manufacturers.
They vary greatly in conversion speed and in how many bits of accuracy/resolution they offer.
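
To give a feel for how the successive-approximation method works, here is a minimal software sketch of an 8-bit SAR conversion. In real hardware the trial voltage comes from an internal DAC and an analog comparator does the comparison; the input voltage, 5V reference and 8-bit width here are just assumed example values, not taken from any particular chip.

```c
#include <stdio.h>
#include <stdint.h>

#define ADC_BITS  8      /* assumed resolution */
#define VREF      5.0    /* assumed reference voltage in volts */

/* Successive approximation: try each bit from MSB to LSB, keep it if the
 * trial (DAC) voltage stays at or below the input, otherwise clear it. */
static uint8_t sar_convert(double vin)
{
    uint8_t code = 0;
    for (int bit = ADC_BITS - 1; bit >= 0; bit--) {
        code |= (uint8_t)(1u << bit);                   /* try setting this bit */
        double trial = VREF * code / (1 << ADC_BITS);   /* DAC output for trial code */
        if (trial > vin)                                /* comparator says "too high" */
            code &= (uint8_t)~(1u << bit);              /* clear the bit again */
    }
    return code;
}

int main(void)
{
    double vin = 2.7;   /* example input voltage */
    printf("Vin = %.2f V -> code = %u\n", vin, sar_convert(vin));
    return 0;
}
```

After 8 comparisons the code converges on the closest 8-bit value below the input, which is exactly the binary-search behaviour a SAR ADC performs in hardware.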
 
Most microcontrollers (MCUs) already have ADCs built in; the only thing you need to do externally is divide the voltage down with two resistors, so the maximum into the MCU pin is less than its supply voltage, or the ADC reference voltage (depending on the type).

e.g. For a 5V MCU, just use two equal resistors in series (say 2 x 1K) from the 8V to 0V; with the MCU ground connected to 0V, the junction of the two resistors sits at half the input voltage, so it is fine for connecting to an ADC pin on the MCU.

For a 3.3V one, the upper resistor should be around twice the lower (say 1.8K and 1K), so the ADC sees roughly 1/3 of the original voltage.
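
As a sketch of how the firmware side usually looks, the snippet below reads a raw ADC count and scales it back up through the divider ratio to recover the original 8V-range voltage. The function adc_read_raw() is a hypothetical placeholder for whatever ADC driver your MCU provides, and the 10-bit resolution, 5V reference and 1K/1K divider are assumed values matching the 5V example above.

```c
#include <stdint.h>
#include <stdio.h>

#define ADC_MAX   1023u    /* 10-bit ADC full-scale count (assumed) */
#define VREF      5.0      /* ADC reference voltage in volts (assumed) */
#define R_UPPER   1000.0   /* resistor from the 8V source to the ADC pin, ohms */
#define R_LOWER   1000.0   /* resistor from the ADC pin to 0V, ohms */

/* Hypothetical stand-in for the MCU's ADC driver; replace with the real call.
 * Here it just returns a fixed count corresponding to about 4V at the pin. */
static uint16_t adc_read_raw(void)
{
    return 819;
}

/* Convert the raw count to the voltage at the divider junction, then
 * multiply back up by the divider ratio to get the source voltage. */
static double read_source_voltage(void)
{
    uint16_t raw = adc_read_raw();
    double v_pin = (double)raw * VREF / ADC_MAX;
    return v_pin * (R_UPPER + R_LOWER) / R_LOWER;
}

int main(void)
{
    printf("Source voltage: %.2f V\n", read_source_voltage());
    return 0;
}
```

For the 3.3V/1.8K+1K case you would only change VREF, R_UPPER and R_LOWER; the scaling arithmetic stays the same.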
 