I think the discussion on noise has gotten off track. I may be wrong, but it seems we are discussing several types of noise:
1. White noise from the source.
2. Deterministic noise from the source.
3. White noise from the ADC.
4. Deterministic noise from the ADC.
5. Deterministic noise from the microcontroller.
I think the ADC's deterministic noise is captured in the DNL and INL specifications (along with the inherent quantization error). And I think there is very little white noise coming from the microcontroller itself.
Given all of the above, oversampling (which I think is really just ensemble averaging) only increases the ADC resolution in the presence of white noise, whether that noise comes from the source, the ADC, or even the microcontroller, if such noise exists at all.
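For what it's worth, here is roughly what I mean by ensemble averaging in firmware. This is just a minimal sketch in C, assuming a hypothetical adc_read() that returns one raw 12-bit conversion; the usual rule of thumb (Atmel's AVR121 app note covers the math) is that each extra bit of resolution costs 4x the samples: sum 4^n readings, then right-shift the sum by n.

```c
#include <stdint.h>

/* Hypothetical HAL call: returns one raw 12-bit conversion (0..4095). */
extern uint16_t adc_read(void);

/*
 * Oversample-and-decimate: gain `extra_bits` of resolution by summing
 * 4^extra_bits samples and right-shifting the sum by extra_bits.
 * This only works if the input carries enough white noise (on the
 * order of 1 LSB or more) to dither the code transitions.
 * Keep extra_bits <= 4 here so the result fits in 16 bits.
 */
uint16_t adc_read_oversampled(uint8_t extra_bits)
{
    uint32_t n = 1UL << (2u * extra_bits);  /* 4^extra_bits samples */
    uint32_t sum = 0;

    for (uint32_t i = 0; i < n; i++)
        sum += adc_read();

    return (uint16_t)(sum >> extra_bits);   /* 12 + extra_bits bits */
}
```

Note this only buys anything up to the SAR limit I get to below; past a few extra bits the DNL/INL floor takes over.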
Actually, SAR converters have a problem with oversampling. You don't get an equal density of 1s and 0s right at the bit transitions, which is a requirement for averaging to work. This is not an issue for a delta-sigma ADC, which is why oversampling is a term readily applied to delta-sigma converters, and why delta-sigma converters achieve higher bit resolution at lower conversion rates. So there is an upper limit to "bit extension" if you ensemble average with a SAR.
There is a very interesting paper I read a while back that discusses the above in great detail. If anyone is interested I can go back and pull up the title.
Nevertheless, oversampling does not remove deterministic noise, at least not deterministic noise that falls below the averaging bandwidth (i.e. inside the passband of the averaging filter).
This brings us to the last post about using a filter. In effect, a filter is simply a "better" way to remove the deterministic noise than averaging, because the filter can have a much higher order of rolloff for the deterministic element.
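To put numbers on that: a plain moving average rolls off following a sinc envelope at roughly 20 dB/decade, while even a second-order IIR section gives 40 dB/decade. Here is a sketch of a standard biquad low-pass in C; the coefficients are placeholders you would compute offline (e.g. a Butterworth design for your sample rate and cutoff), so treat this as an illustration, not a drop-in design.

```c
/*
 * Second-order IIR low-pass (Direct Form I biquad).
 * Coefficients are placeholders: compute them offline for your
 * sample rate and cutoff frequency (e.g. a Butterworth design).
 * Rolls off at ~40 dB/decade, versus ~20 dB/decade for the sinc
 * envelope of a simple moving average.
 */
typedef struct {
    float b0, b1, b2;   /* feed-forward coefficients             */
    float a1, a2;       /* feedback coefficients (a0 normalized) */
    float x1, x2;       /* previous two inputs                   */
    float y1, y2;       /* previous two outputs                  */
} biquad_t;

float biquad_step(biquad_t *f, float x)
{
    float y = f->b0 * x + f->b1 * f->x1 + f->b2 * f->x2
            - f->a1 * f->y1 - f->a2 * f->y2;
    f->x2 = f->x1;  f->x1 = x;
    f->y2 = f->y1;  f->y1 = y;
    return y;
}
```

Cascade two of those sections and you are at 80 dB/decade, which no amount of simple averaging gets you.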
Given all of the above, can you buy a 16-bit ADC/microcontroller whose deterministic noise is below the LSB size?
I don't expect the ADC to solve the source noise issues, or even the ADC/microcontroller white noise; that can be handled to a limited extent by ensemble averaging. But if running the microcontroller simultaneously with the 16-bit ADC creates deterministic noise within the ADC, I think I am screwed unless I know the noise spectrum of the code. If I could "deterministically" (is that a word?) account for the noise, then I could use a high-order filter, provided the noise doesn't come in at a very low frequency (e.g. a 1 Hz clock in software).
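One trick worth adding for the case where you do know the interferer: not a filter exactly, but if you average over an exact integer number of periods of the deterministic noise, it integrates out to (nearly) zero while the DC signal survives. Mains hum rejection is the classic example. A sketch, where the 50 Hz noise frequency and 10 kHz sample rate are assumed placeholders, and samples are assumed to arrive at a steady rate:

```c
#include <stdint.h>

extern uint16_t adc_read(void);     /* hypothetical HAL call, paced at */
                                    /* SAMPLE_RATE_HZ by the caller    */

#define SAMPLE_RATE_HZ  10000u      /* assumed ADC sample rate         */
#define NOISE_FREQ_HZ   50u         /* known deterministic interferer  */

/*
 * Average over exactly one period of the known noise frequency.
 * A sinusoid at NOISE_FREQ_HZ (and its harmonics) integrates to
 * (nearly) zero over the window, while the DC signal is preserved.
 */
uint16_t adc_read_sync_avg(void)
{
    const uint32_t n = SAMPLE_RATE_HZ / NOISE_FREQ_HZ;  /* 200 samples */
    uint32_t sum = 0;

    for (uint32_t i = 0; i < n; i++)
        sum += adc_read();

    return (uint16_t)(sum / n);
}
```

But as I said, if the deterministic noise sits at 1 Hz-ish frequencies, the window gets impractically long, and I'm back to being screwed.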
Boy! That's a lot of writing! But, buried in there is a question.