How to accurately sample 0-6V with 0-3V ADC?

eyAyXGhF

Member
Hi all,
I have a 0-6V signal that I want to accurately sample using my ARM Cortex-M4, which has onboard 16-bit ADCs.

I've replaced the onboard reference with an accurate 3.00V reference. Now I'm just wondering what the best way is to scale my input voltage by 50% so I can read it properly with the ADC.

The voltage source that I want to sample is sensitive to loading, so it's very important that this circuit have a very high input impedance.

My first idea was to use a simple non-inverting op-amp amplifier with a gain of exactly 1/2. But it seems you can't have a non-inverting amplifier with a gain of less than 1; is that correct? Also, according to Wikipedia (https://en.wikipedia.org/wiki/Operational_amplifier_applications#Non-inverting_amplifier), there may be a problem with input bias current affecting the accuracy.

So the only thing I can think of is a non-inverting buffer to meet the input impedance requirement, followed by an inverting amplifier with an adjustable (±5%) gain of 1/2, and then a second inverting amplifier with a gain of 1 to correct the inversion. That just seems complicated, and three stages seem like a lot of potential added noise.

So I'm wondering: what's the better way to do this?

I'm OK with using any type of op-amp; I just happen to have TL07x available right now. Available supply voltages are +15V, -15V, +5V, and the +3.00V reference going to the ADC.
 
It is trivial to invert the data after the ADC in software. Why are you hung up on a non-inverting amp? Anyway, what is wrong with a voltage follower (hi-Z input to the op amp) followed by a 2:1 resistive divider?
 

Thanks for the reply Mike,
The reason I wanted to invert a second time is that the voltage output is 0 to -3V before that. I'm a noob when it comes to this stuff, but my reasoning for not using the non-inverting amp is that I need a gain < 1, and a non-inverting amp is more prone to error from input bias current. I could be totally wrong about that, though.

Anyway, I like your idea of the voltage follower into a voltage divider. Does the impedance of the ADC input ever change, or change with the applied voltage, or anything like that? I really need this to stay linear, so I want to make sure I'm not opening a can of worms by assuming anything about the ADC input pin that I shouldn't. Thoughts?
 
This is not as straightforward as you might first think. Measuring 6V to 16-bit accuracy (65536 steps) means you are measuring to a resolution of approximately 100uV (91.5uV, in fact). The input offset voltage of your op amp will affect this accuracy, as will its drift. Therefore, to measure to half an LSB (say 50uV), your input offset voltage needs to be less than 50uV.

You then have to worry about the accuracy (and drift) of the resistive divider. Even a 1% resistor can put you out by about 655 LSBs (1% divided by one part in 65536). So you need a precision resistive divider. Have a look at this:

https://www.linear.com/product/LT5400

This will get you much closer in accuracy, but your system will still need calibrating.
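
To put numbers on that error budget, here's a quick check in Python (using the figures from this thread: 6V full scale, 16 bits, a 1% divider resistor):

Code:
# Quick error-budget check for the figures above.
FULL_SCALE_V = 6.0
BITS = 16
lsb = FULL_SCALE_V / 2**BITS              # one LSB in volts
print(f"1 LSB    = {lsb * 1e6:.2f} uV")   # ~91.55 uV
print(f"1/2 LSB  = {lsb * 1e6 / 2:.2f} uV")
resistor_tol = 0.01                       # 1% resistor tolerance
print(f"1% error = {resistor_tol * 2**BITS:.0f} LSBs")  # ~655 LSBs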

So... feed the 6V into a buffer (op amp with a 0Ω feedback resistor, non-inverting, for unity gain). This will give you the high input impedance. Then connect the LT5400 to the output of the op amp and take the centre tap of the LT5400 to give you a divide-by-2.

Feed this into your ADC. I am assuming (!) the input of the ADC will not load the resistive divider (hence pull your signal down). If it does, then you will need another op amp to buffer the output of the resistive divider.

The lower you can make the input offset voltage of the op amps, the more accurate your system will be; indeed, I think the input offset voltage will be the dominant source of error in your system.

To get true 16-bit performance, you will then need to feed a 16-bit-accurate voltage into the input and calibrate your system to give the correct output.
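
The calibration itself is simple in firmware once you can apply two known voltages. A minimal two-point (gain and offset) sketch, in Python for illustration, with made-up example values:

Code:
# Minimal two-point calibration sketch. Python for illustration only;
# the arithmetic ports directly to C on the Cortex-M4. The reference
# voltages and raw ADC codes below are made-up example values.
V_LO, V_HI = 0.500, 5.500          # known voltages applied at the 0-6V input
CODE_LO, CODE_HI = 5450, 60080     # raw 16-bit ADC codes read at those points

gain = (V_HI - V_LO) / (CODE_HI - CODE_LO)   # volts per code
offset = V_LO - gain * CODE_LO               # input voltage at code 0

def code_to_volts(code):
    """Convert a raw ADC code to a corrected input voltage."""
    return gain * code + offset

print(code_to_volts(32768))   # mid-scale reading, corrected (~3.0V)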

Then there is the drift of the reference, but this is another issue...

It can be done quite easily, but getting true 16-bit resolution will be hard. Hopefully I have outlined the things to watch out for...

(It is interesting to note that I tried downloading the TL074 datasheet from TI's website before writing this reply, and it is still trying to download... Does anyone else have problems with TI's hopeless website?)
 

Hi Simon,

No problem; I just downloaded it a second ago at the usual speed.

Is it possible that your browser is blocking the download? I use Firefox, and it brings up a blocking message which I have to accept before it will allow the download of the PDF from TI.

E.
 
Before we all get hung up on the 16-bit ADC and its resolution, let me ask: what is the 0V to 6V signal coming from?
 
My mistake for not mentioning this in my original post, but I'm not expecting 16-bit accuracy from this. I'm hoping to get 13-14 bits of usable information, and the most important thing is that it remains linear. Drift over time and temperature is less important than linearity.

I guess I'm looking for the "best possible within reason" solution to this.
 
Does the impedance of the ADC input ever change, or change with the applied voltage, or anything like that? I really need this to stay linear, so I want to make sure I'm not opening a can of worms by assuming anything about the ADC input pin that I shouldn't. Thoughts?
To answer that question, we need to see a datasheet that has the ADC specs.
 
Hi Ron, I've found the datasheet here: **broken link removed**

It's the largest datasheet I've ever looked at, but the only values I could find were RADIN (input resistance), typical 2kΩ, max 5kΩ, and CADIN (input capacitance), typical 8pF, max 10pF, on page 36.
 
Test your link. It doesn't work for me.
What is the highest frequency of your input signal?
 
Well, the ADC input resistance is 2kΩ typical, 5kΩ max, which is too low and too ill-defined to hang an attenuator directly on the input. It seems to me that you need a voltage follower, an attenuator, and another voltage follower. Keep in mind that, even with two equal-value 0.1% tolerance resistors, you could have an error of up to 0.1%, which is about 8 LSBs at 13 bits. The maximum input offset voltage will be 1.5 times that of an individual op amp, since the first buffer's offset is halved by the divider while the second buffer's is not.
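
For what it's worth, the 8-LSB figure checks out (assuming 13 usable bits):

Code:
# Worst-case divider error from two equal 0.1% resistors, in LSBs at
# 13 bits. With one resistor high and the other low, the ratio error
# is roughly equal to the tolerance itself.
tol = 0.001            # 0.1% resistor tolerance
bits = 13
print(tol * 2**bits)   # ~8.2 LSBs of full scale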
 
For the resistor matching, can I also put a trimpot in line so I can calibrate the scale accurately? Say R1 = 10k, R2 = 9.6k plus a 1k 25-turn trimpot. Thanks for this advice; I'll breadboard it tomorrow and see how it performs.
 
You can do that if you have a calibration standard. Do you have a high quality DVM?
I would use two equal-value resistors. Also, you want to get as much resolution as you can from the pot. You want it to be the smallest value possible while still covering the worst case resistor values, because the pot will most likely have a worse tempco than the resistors. You want that tempco to be a small contributor to the change in output voltage with temperature.
Use the pot as a true 3-terminal potentiometer, not as a rheostat.
Let's say the smallest value 25 turn pot you can get is 100Ω. If you use 0.1% resistors, get two 50k (or 49.9k), so that your pot just barely covers the worst case values of the resistors, which is when one is 50 ohms high and the other is 50 ohms low. The 100 ohm pot covers that, but just barely, which is good. Similarly, if you use 1% resistors, get two 4.99k units.
Pay attention to resistor tempcos, including that of the pot. They can cause big errors over temperature. If linearity is your biggest concern, maybe that's not a big deal.
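
To sanity-check the pot sizing above:

Code:
# Check that the trim pot span covers the worst-case mismatch of the two
# divider resistors (example values from above: 0.1% 50k resistors and a
# 100-ohm 25-turn pot wired as a 3-terminal potentiometer).
R = 50_000
tol = 0.001
pot_span = 100
worst_case_spread = 2 * R * tol   # one resistor 50 ohms high, the other 50 low
print(worst_case_spread, "ohms; pot covers it:", pot_span >= worst_case_spread)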
 

Thanks, Ron. The resistors and trimpots I've used previously both have a ±100ppm/°C tempco. I do have a way to calibrate this scale adjustment, so thankfully that's not an issue. I'll try the 100Ω trimpot and 4.99k resistors this weekend and see how it goes.
 
If you have a good inamp, programmable for a gain of 1, you can connect its output to its (-) input and the gain becomes 1/2. The precision resistors, high linearity, and low drift are built in.


[Attached schematic: divby2.png — inamp wired as a divide-by-2]
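
The divide-by-two falls out of the inamp transfer function. A one-line check, assuming an ideal inamp with Vout = G*(V+ - V-) and the output tied back to the (-) input:

Code:
# Ideal inamp: Vout = G * (Vp - Vn). With the output fed back to the (-)
# input, Vn = Vout, so Vout = Vp * G / (1 + G). For G = 1 that is Vp / 2.
G = 1.0
Vp = 6.0
print(Vp * G / (1 + G))   # 3.0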
 
Hi ccurtis, wow, very nice! I wasn't expecting that to work, but I just tested it using the standard three-op-amp instrumentation amplifier and it worked. I don't have any dedicated instrumentation amp chips, but I noticed Digi-Key carries the Burr Brown AD621 for $7. The datasheet says nonlinearity error: 0.001% max; input offset drift: ±2μV/°C; low input offset voltage: ±200μV.
 
You have something mixed up here. AD621 is made by Analog Devices, and it has a minimum gain of 10.
 
You could try the AD620, but LT claims their LT1167 is better (and it's a third of the price). Truly, though, either of those two is pretty darn good. If I'm figuring it right, with the nonlinearity of either of those inamps spec'd in the tens of parts per million, that should be good enough for a 13-bit ADC just from the bit count, never mind whether your ADC can actually achieve full 13-bit performance: 1/2^13 = 0.000122 (122ppm), while the worst-case nonlinearity for the AD620 is 95ppm = 0.000095.
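
As a quick check of those numbers:

Code:
# One LSB at 13 bits versus the AD620's worst-case nonlinearity
# (95 ppm, per the datasheet figure quoted above).
lsb_fraction = 1 / 2**13     # 0.000122 of full scale (122 ppm)
nonlinearity = 95e-6         # 95 ppm
print(nonlinearity < lsb_fraction)   # True: nonlinearity is under one LSB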
 