OK, maybe a stupid question, but I need to understand this issue. Suppose we have an ADC with 2 kΩ input impedance and 25 pF input capacitance at every input channel, and suppose we want to connect an analogue voltage source with output impedance Rout to the ADC.
Generally, it is known that when connecting two analogue stages, in order to avoid signal loss, the output impedance of the first stage should be kept to at most about one tenth of the input impedance of the second stage. However, is this true for the ADC case too? I mean, if the output impedance of the analogue voltage source is higher than (or even about the same as) the input impedance of the ADC, is that a problem? Can this mismatch be worked around by setting the sampling time of the ADC to a suitably long value, or will the accuracy of the measurements be significantly degraded?
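For reference, here is the back-of-envelope settling model I have in mind. It assumes the input is a simple single-pole RC charge of the sample capacitor, with the source's Rout in series with the ADC's internal resistance; the 12-bit resolution and the Rout = 10 kΩ figure below are just example numbers I picked, not actual specs. To settle to within half an LSB of an n-bit converter, the sampling time would need to satisfy

$$ t_s \ge (R_{out} + R_{in})\, C \,\ln\!\left(2^{\,n+1}\right) $$

So with n = 12, Rout = 10 kΩ, Rin = 2 kΩ, and C = 25 pF, the time constant is 12 kΩ × 25 pF = 300 ns, ln(2^13) ≈ 9, and t_s ≈ 2.7 µs. Is this the right way to think about it, i.e. a large Rout only costs sampling time rather than accuracy, as long as t_s can be made long enough?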