nyquist vs averaging
(OP)
Suppose the following:
1. You have an infinite amount of periodic data available for post-processing.
2. The information of interest is at a frequency higher than your data-acq card can sample.
3. You are able to cut the data into an infinite number of single-cycle data sets.
4. You interpolate the data using a high-order scheme and double the number of points.
5. You then average an "infinite" number of resampled single-cycle data sets into one "super-sample".
Question:
Does this "super-sample" contain information beyond what the original sampling rate would normally allow?
Does the averaging of interpolated data get you a little more?
Does this have a name? Is this ever done?
Thanks in advance.
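The procedure in the question can be sketched in numpy with illustrative numbers (32 samples per cycle, 5000 records instead of "infinite"); for simplicity this sketch uses linear interpolation where the post assumes a high-order scheme:

```python
import numpy as np

rng = np.random.default_rng(0)
n_pts = 32          # samples per cycle from the data-acq card
n_cycles = 5000     # stand-in for an "infinite" number of records
t = np.arange(n_pts) / n_pts
clean = np.sin(2 * np.pi * t)   # one cycle of the underlying signal

# Each single-cycle record is the same waveform plus independent noise.
records = clean + 0.5 * rng.standard_normal((n_cycles, n_pts))

# Interpolate each record onto a grid with twice the points,
# then average all resampled records into one "super-sample".
t2 = np.arange(2 * n_pts) / (2 * n_pts)
resampled = np.array([np.interp(t2, t, r, period=1.0) for r in records])
super_sample = resampled.mean(axis=0)

# Averaging N records beats the noise down by roughly sqrt(N), so the
# super-sample tracks the band-limited signal closely; interpolation
# alone, however, adds no content above the original Nyquist limit.
err = np.abs(super_sample - np.sin(2 * np.pi * t2)).max()
print(err)
```

Note the caveat in the last comment: if every record is sampled at the same phases of the cycle, the averaged result is cleaner but not wider-band than the original sampling allows.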
RE: nyquist vs averaging
Many sampling scopes and so-called "box-car" averagers use this technique. There used to be sampling scopes that sampled at 40 MHz on periodic signals while the scope had 1 GHz of analog bandwidth; the scope could therefore sample its way into 1-GHz-bandwidth periodic signals. However, if you tried that with only 40 MHz of analog bandwidth, you'd get pretty much bupkis.
Micro-dither and Super-resolution (http://en.wikipedia.org/wiki/Super-resolution) are similar applications.
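The sampling-scope (equivalent-time) idea can be sketched as follows. The numbers are illustrative; the 39.9 MHz rate is deliberately not an exact submultiple of the signal frequency, so successive samples land at different phases of the waveform:

```python
import numpy as np

f_sig = 1.0e9     # 1 GHz periodic signal, within the analog bandwidth
f_samp = 39.9e6   # ~40 MHz sample rate, far below Nyquist for f_sig
n = 4000

k = np.arange(n)
x = np.sin(2 * np.pi * f_sig * k / f_samp)  # what the slow ADC records

# Fold each sample time back into one signal period and sort by phase.
# Because f_samp is not an exact submultiple of f_sig, the samples
# sweep through the whole cycle.
phase = (k * f_sig / f_samp) % 1.0
order = np.argsort(phase)
eq_time, eq_wave = phase[order], x[order]

# eq_wave vs eq_time now traces one cycle of the 1 GHz waveform,
# even though no individual sweep came close to sampling it directly.
```

This only works because the analog front end passes the 1 GHz signal intact to the sampler; with a 40 MHz front end there would be nothing left to fold back.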
TTFN
FAQ731-376: Eng-Tips.com Forum Policies
RE: nyquist vs averaging
"Does this "super-sample" contain information beyond what the original sampling rate would normally allow?"
Also check out zoom FFT
Cheers
Greg Locock
Please see FAQ731-376: Eng-Tips.com Forum Policies for tips on how to make the best use of Eng-Tips.
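A zoom FFT can be sketched like this (assumed parameters; a real implementation would use a properly designed FIR low-pass instead of the crude moving average here): mix the narrow band of interest down to 0 Hz, low-pass filter, decimate, then FFT the much shorter record.

```python
import numpy as np

fs = 1000.0
n = 8192
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 301.3 * t)   # tone inside the band of interest

f_center = 300.0    # center of the narrow band to "zoom" into
decim = 16          # zoom / decimation factor

# Mix the band down to baseband: the 301.3 Hz tone becomes 1.3 Hz.
baseband = x * np.exp(-2j * np.pi * f_center * t)

# Crude low-pass (moving average) to limit aliasing, then decimate.
kernel = np.ones(decim) / decim
zoomed = np.convolve(baseband, kernel, mode="same")[::decim]

spec = np.abs(np.fft.fft(zoomed))
freqs = np.fft.fftfreq(zoomed.size, decim / fs) + f_center
peak = freqs[np.argmax(spec)]
# Same ~0.12 Hz bin width as an 8192-point FFT of the full record,
# but obtained from a 512-point transform covering only 300 +/- 31 Hz.
print(peak)
```

The resolution is still set by the total record length; the payoff is getting that resolution over a narrow band from a much smaller transform.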
RE: nyquist vs averaging
Under those conditions, the reconstructed signal would accurately represent the signal (though not the noise), just as if it had been sampled in real time at the higher effective sampling rate.
Eventually, those scopes faded away as memories got fast enough to store the data in real time.
TTFN
FAQ731-376: Eng-Tips.com Forum Policies
RE: nyquist vs averaging
Cheers
Greg Locock
Please see FAQ731-376: Eng-Tips.com Forum Policies for tips on how to make the best use of Eng-Tips.
RE: nyquist vs averaging
Thank you all for your input.
Bandwidth: I'll look into that; there's a lowpass filter upstream of the data-acq card, which might make this a non-starter.
And this is not a homework problem (though it could be, for some nay-sayer). I just needed a countervailing opinion to tip the balance-of-merit towards science.
RE: nyquist vs averaging
For example, if your analog bandwidth is 500 MHz and your sample rate is 100 MSPS, you will have ten Nyquist zones within your analog bandwidth. You can look at data from any one of these zones by using the proper filter. For instance, an IF of 70 MHz (above Nyquist) will fold down to 30 MHz. However, without a bandpass filter there will be ambiguity, since signals at 30 MHz will also show up at 30 MHz, as will signals at 130 MHz, 170 MHz, etc. Therefore you will need a bandpass filter with reasonable attenuation at all the frequencies that could alias into your desired output.
Peter
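The folding arithmetic Peter describes can be checked with a tiny helper (function name is hypothetical):

```python
def folded_frequency(f, fs):
    """Apparent frequency of a tone at f after sampling at rate fs."""
    f = f % fs                       # fold into one sample-rate span
    return fs - f if f > fs / 2 else f   # reflect into 0..fs/2

fs = 100.0  # 100 MSPS, working in MHz
# 70 MHz folds down to 30 MHz -- and so do 130 MHz and 170 MHz,
# which is exactly the ambiguity a bandpass filter must remove.
for f in (30.0, 70.0, 130.0, 170.0):
    print(f, "->", folded_frequency(f, fs))
```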