
I was reading about AVR's ADC, and I came across this sentence

Now, the major question is… which frequency to select? Out of the 50kHz-200kHz range of frequencies, which one do we need? Well, the answer lies in your need. There is a trade-off between frequency and accuracy. Greater the frequency, lesser the accuracy and vice-versa. So, if your application is not sophisticated and doesn’t require much accuracy, you could go for higher frequencies.

Shouldn't accuracy increase when I'm sampling faster since I'm taking more samples from the signal? What am I missing here?

Thanks.

Ahmad Anwar
  • Include a link for this citation for context. Is it talking about ADC clock frequency or _sample rate_? Typically the sample rate is dictated by bandwidth requirements and the cut-off of your anti-aliasing filter rather than "accuracy". Moreover, improving "accuracy" for a single voltage measurement is typically done by oversampling and averaging multiple samples at a higher sample rate, or by using a box-car filter (moving average) to get a result at the sample frequency (but somewhat delayed by filter-width / 2). – Clifford Jan 11 '20 at 10:39

2 Answers


Shouldn't accuracy increase when I'm sampling faster since I'm taking more samples from the signal?

No, because a higher-accuracy conversion takes more ADC clock cycles. If the ADC clock is too fast, the sample-and-hold capacitor in the ADC won't have enough time to charge fully, so you'll get a less reliable conversion result.

So if you want to sample as fast as possible, you have to accept lower resolution. Conversely, when you want as much precision as possible, you have to lower the ADC clock frequency to give the converter enough time.

Swedgin

If you use a higher sample rate, you can eliminate some noise or increase precision by averaging. Take a look at:
https://www.nxp.com/docs/en/application-note/AN5250.pdf

https://www.avrfreaks.net/forum/adc-accuracy-sampling-rate-question