You will have several problems, starting with quantization errors,
which are explained well here. Beyond that, you will lack the resolution to see 0 to 10 amps with 0.005 amp (5 mA) resolution. This goes back to what KISS covered.
If I wanted to measure 0 to 10 AAC accurately (well, pretty accurately), I would start by getting a CT
similar to one of these. Chosen right, that affords a nice linear output: 0 to 10 amps maps to 0 to 5 volts, which is 0.5 mV per mA of AC current. So to accurately measure 5 mA you need to resolve 5 * 0.5 = 2.5 mV.
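To make that scaling concrete, here is a minimal sketch of the arithmetic in C, assuming a hypothetical CT whose burden resistor gives 0 to 5 volts out for 0 to 10 amps in (not any particular part):

```c
/* Minimal sketch of the CT scaling math, assuming a hypothetical CT
 * whose output is 0-5 V for 0-10 A of primary current. */
#include <stdio.h>

int main(void)
{
    const double full_scale_amps  = 10.0; /* CT measurement range   */
    const double full_scale_volts = 5.0;  /* voltage at the ADC pin */

    /* 5 V / 10 A = 0.5 V per A, which is 0.5 mV per mA */
    double mv_per_ma = full_scale_volts / full_scale_amps;

    /* To see a 5 mA change, the ADC must resolve this many mV: */
    printf("%.1f mV\n", 5.0 * mv_per_ma); /* prints 2.5 */
    return 0;
}
```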
Most uC ADCs offer at best 10 bit resolution, which gives 1024 quantization levels. So the math works out like this:
Full scale range = 0 to 5 volts.
ADC Resolution = 10 bits (2^10) = 1024
ADC Voltage Resolution = 5 / 1024 = 0.00488 volts, or 4.88 mV per step.
With a 10 bit A/D you will never see or resolve 2.5 mV (your 5 mA); one step is already nearly twice that.
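If you want to see the whole conversion as code, here is a minimal sketch under those same assumptions (5 V reference, 10 A full scale from the CT). read_adc() is a hypothetical stub, not any particular part's driver:

```c
/* Minimal sketch: converting a raw 10-bit ADC reading back to amps,
 * assuming the 0-5 V / 0-10 A CT scaling above. */
#include <stdint.h>
#include <stdio.h>

#define ADC_BITS      10
#define ADC_COUNTS    (1UL << ADC_BITS)   /* 1024 quantization levels */
#define VREF_VOLTS    5.0                 /* ADC full-scale reference */
#define AMPS_PER_VOLT 2.0                 /* 10 A / 5 V from the CT   */

static uint16_t read_adc(void)            /* hypothetical stub        */
{
    return 512;                           /* pretend mid-scale, ~5 A  */
}

int main(void)
{
    uint16_t raw   = read_adc();
    double   volts = (raw * VREF_VOLTS) / ADC_COUNTS;
    double   amps  = volts * AMPS_PER_VOLT;

    /* One LSB = 5 V / 1024 = 4.88 mV, about 9.8 mA of current, so a
     * 2.5 mV (5 mA) change is smaller than one step and is lost. */
    printf("raw=%u -> %.3f V -> %.3f A\n", (unsigned)raw, volts, amps);
    return 0;
}
```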
If you went to a 12 bit A/D with this sort of setup you would now have a resolution of:
5 / 4096 = 0.00122 volts or 1.22 mV, which would work.
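In the sketch above, changing ADC_BITS to 12 makes ADC_COUNTS 4096, so one LSB drops to about 1.22 mV, or roughly 2.4 mA of current, just under the 2.5 mV needed to see 5 mA.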
Given a choice for a commercial application, I would find a 12 bit or greater ADC (14 to 16 bit preferred) designed for the application.
Ron