Hi all,
I am designing an application that measures the outputs from ultrasound transducers. The data will be processed by a low-power MSP430 microcontroller. Ahead of the MCU, a high-speed 12-bit ADC will be used; its maximum sampling rate is 1 MSPS. I need some help with the design of the anti-aliasing filter.
I am aware that the optimum filter characteristic would be one whose response is down by more than 20·log10(1/4096) ≈ -72 dB at the Nyquist frequency (500 kHz). However, this would reduce the effective bandwidth of the signal from the sensors to less than 200 kHz, and I'm not yet sure whether that is acceptable at the application level. But suppose the filter's attenuation is worse than -72 dB: what would be an acceptable level? For example, would around -40 dB be acceptable? Or, put another way, how much error would be introduced in the digitization if the loss at the Nyquist frequency is only -40 dB? How can I calculate that?
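To make my numbers concrete, here is the back-of-the-envelope calculation I'm using (a quick Python sketch; it pessimistically assumes a full-scale out-of-band tone sitting right at Nyquist, which may not match my actual sensors):

```python
import math

bits = 12
lsb = 1 / 2**bits                 # one LSB relative to full scale
print(20 * math.log10(lsb))       # ~ -72.2 dB: alias at/below the quantization floor

# If the filter is only -40 dB down at Nyquist, a full-scale out-of-band
# tone would alias back in at 1% of full scale:
alias = 10 ** (-40 / 20)          # = 0.01 of full scale
print(alias * 2**bits)            # ~ 41 LSB of worst-case alias error
```

So with only -40 dB I'd expect a worst-case alias of roughly 41 LSB, but only if a full-scale interferer actually exists up there; whether it does is exactly what I can't confirm yet.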
Finally, what filter type is the best choice for an anti-aliasing filter? The most standard solution would be the Butterworth, but what about the Chebyshev or the elliptic? I could achieve better attenuation beyond the passband with those filters, but would the passband ripple and the larger group delay cause problems?
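For what it's worth, here is how I've been comparing the candidate types (a scipy sketch, assuming a 200 kHz passband edge, the 500 kHz stopband edge from above, 1 dB of passband ripple, and the -72 dB stopband target; all of these numbers are placeholders for my real spec):

```python
import numpy as np
from scipy import signal

f_pass = 200e3   # passband edge, Hz (placeholder for my real signal bandwidth)
f_stop = 500e3   # stopband edge = Nyquist frequency, Hz
a_pass = 1.0     # max passband ripple, dB (assumed)
a_stop = 72.0    # min stopband attenuation, dB (the 12-bit target)

# Minimum analog filter order each prototype needs to meet the same spec
for name, order_fn in [("Butterworth", signal.buttord),
                       ("Chebyshev I", signal.cheb1ord),
                       ("Elliptic",    signal.ellipord)]:
    n, _ = order_fn(2 * np.pi * f_pass, 2 * np.pi * f_stop,
                    a_pass, a_stop, analog=True)
    print(f"{name}: order {n}")
```

On these assumptions the elliptic needs the lowest order, which is why it's tempting, but its ripple and group-delay behavior are exactly what I'm unsure about.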
I appreciate any help that you can give me with these issues.
Nick