zappedagain
Electrical
- Jul 19, 2005
- 1,074
I have a 400 kHz analog channel that I am digitizing with a 16-bit Linear Technologies LTC2202, which can sample up to 10 MSPS and has a full-power bandwidth of 380 MHz (it is designed for undersampling). Will I get a better SNR if I sample at 10 MSPS and average the ten samples than if I sample at 1 MSPS?
This seems like a no-brainer, because I can sample for 1000 ns or take the average of ten 100 ns samples; either way I'm looking at the same noise*time product, so there should be no difference. Or is that the wrong way to look at it?
What makes me wonder is that on a single sample this A/D only has an SNR of about 80 dB in my frequency range; that's 10,000:1 (about 13 bits) instead of the full 65,536:1 for 16 bits. I've seen that by averaging I can easily push this part past 16-bit performance.
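For what it's worth, the usual sqrt(N) rule for averaging can be sanity-checked numerically. Here's a minimal NumPy sketch (my own, not from the LTC2202 datasheet) that block-averages simulated ADC noise by 10, assuming the noise is white and uncorrelated between samples; correlated noise, harmonics, or spurs won't average down this way:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10  # averaging factor: 10 MSPS block-averaged down to 1 MSPS

# Simulated uncorrelated ADC noise, unit RMS (white-noise assumption)
noise = rng.normal(0.0, 1.0, size=1_000_000)

# Average every N consecutive samples (boxcar average + decimate)
averaged = noise.reshape(-1, N).mean(axis=1)

raw_rms = noise.std()
avg_rms = averaged.std()
snr_gain_db = 20 * np.log10(raw_rms / avg_rms)

# For white noise, expect the RMS to drop by ~sqrt(10) = 3.16x, i.e. ~10 dB
print(f"noise reduced {raw_rms / avg_rms:.2f}x -> {snr_gain_db:.1f} dB gain")
```

So in the ideal white-noise case, averaging ten samples should buy about 10 dB (a bit over 1.5 bits) on top of the single-sample 80 dB figure.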
Z