Okay. Let's say I am receiving unaliased data samples from a filtered ADC with a bandwidth of 100Hz at a rate of 1600 samples per second.
This data needs to be fed into a control loop, but the control loop only has a bandwidth (or reaction speed) of 50Hz, so I want to filter the data down to 50Hz before feeding it into the control loop. Note that *NO DOWNSAMPLING* is occurring: the control loop will still be processing the samples at 1600 samples per second.
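To make the setup concrete, here's a minimal sketch of what I mean. The one-pole IIR filter and the test tones are just illustrative assumptions (a real design would likely use a steeper filter); the point is that every one of the 1600 samples per second goes in, and one comes out at the same rate:

```python
import numpy as np

fs = 1600.0                   # ADC output rate (sps)
fc = 50.0                     # target bandwidth for the control loop (Hz)

# One-pole IIR low-pass with a 50Hz corner -- a deliberately simple
# stand-in for whatever filter would actually be used
dt = 1.0 / fs
rc = 1.0 / (2.0 * np.pi * fc)
alpha = dt / (rc + dt)

t = np.arange(0, 1.0, dt)
# hypothetical test signal: 10Hz tone (inside the control band)
# plus a 400Hz tone (well outside it)
x = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 400 * t)

y = np.empty_like(x)
acc = 0.0
for n, sample in enumerate(x):
    acc += alpha * (sample - acc)   # y[n] = y[n-1] + alpha*(x[n] - y[n-1])
    y[n] = acc

# No downsampling: the output rate equals the input rate
assert y.size == x.size
```

The 400Hz component comes out attenuated while the 10Hz component passes nearly unchanged, but the sample rate into the control loop is still 1600sps.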
My question is: is there any advantage to keeping the ADC output data rate at 1600sps as opposed to simply lowering it to 200sps (and having the control loop process the samples at a rate of 200sps)?
(Are you saying that the 1600sps data rate will preserve the phase information better than the 200sps data rate?)
Another scenario: suppose the same 100Hz-bandwidth, 1600sps ADC only had its output read at 200Hz, so that only 1 out of every 8 samples was used and the rest were thrown away. Apparently this would cause aliasing, and is why decimation filters exist... but I don't understand why throwing away the other 7 samples would cause aliasing in this case, since the effective sample rate is 200sps and the bandwidth of the signal was only 100Hz to begin with.
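Here's a sketch of this pick-1-of-8 scenario as I understand it. The 150Hz tone is a hypothetical of mine: I'm assuming the ADC's 100Hz filter isn't a brick wall, so some energy above 100Hz could survive into the 1600sps stream, and the sketch shows where such a component would land after naive decimation:

```python
import numpy as np

fs = 1600.0
t = np.arange(0, 1.0, 1.0 / fs)

# Hypothetical residual out-of-band component at 150Hz -- assumes the
# anti-alias filter rolls off gradually rather than cutting off hard
x = np.sin(2 * np.pi * 150 * t)

# Keep 1 of every 8 samples: effective rate 200sps, Nyquist 100Hz
x_dec = x[::8]
fs_dec = fs / 8

# Find where the 150Hz tone shows up in the decimated stream
spectrum = np.abs(np.fft.rfft(x_dec))
freqs = np.fft.rfftfreq(x_dec.size, 1.0 / fs_dec)
peak = freqs[np.argmax(spectrum)]
print(peak)   # the tone aliases to |200 - 150| = 50Hz, squarely in-band
```

So a component that was above 100Hz in the 1600sps stream reappears at 50Hz after decimation, indistinguishable from a real in-band signal.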