Yep, been there, done that (except I used a DAC). The prototype I alluded to a few posts back is currently in the trash. I could not get under 10mV of error at each 1 volt step; all of the readings were off by close to 50mV. The error was not linear but absolute per step (i.e., it varied from step to step, but the reading was always below the actual value).
While we're talking ADC/DAC: any ideas on how to keep the 1mV resolution from a 0-5V ADC that has to read a 0-15V input signal? I'm thinking I need to subtract the "excess" voltage, because a resistor divider will squash the resolution. A rough sketch of the arithmetic is below.
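Here's a minimal sketch of why the divider hurts and why subtraction doesn't, assuming a ~5000-count ADC (1mV per count over 0-5V); the offset values and the `reconstruct_input` helper are hypothetical, just to show the idea:

```python
# Assumed setup: 0-5V ADC with 1mV per count, measuring a 0-15V signal.
ADC_RESOLUTION_V = 0.001    # 1 mV per count at the ADC pin
ADC_FULL_SCALE_V = 5.0
INPUT_FULL_SCALE_V = 15.0

# Option 1: plain 3:1 resistor divider.
# Each ADC count now represents 3x the voltage at the input,
# so input-referred resolution degrades from 1 mV to 3 mV.
divider_ratio = INPUT_FULL_SCALE_V / ADC_FULL_SCALE_V
divider_resolution_mv = ADC_RESOLUTION_V * divider_ratio * 1000
print(f"Divider resolution: {divider_resolution_mv:.1f} mV")  # 3.0 mV

# Option 2: subtract a known offset (say 0 V, 5 V, or 10 V from a DAC
# or precision reference through a difference amp) so the ADC always
# sees a 0-5 V window at unity gain. Input-referred resolution stays
# at 1 mV; firmware just adds the offset back.
def reconstruct_input(adc_volts: float, offset_volts: float) -> float:
    """Recover the original input from the windowed ADC reading."""
    return adc_volts + offset_volts

print(reconstruct_input(2.500, 10.0))  # 12.5 V, still in 1 mV steps
```

The catch with the subtraction approach is that the accuracy of the offset source and the difference amp's resistor matching now land directly in your error budget.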
(Yes, I don't have all the details; I'm not in the office and I'm working over the weekend.)