
Best ADC accuracy that can be obtained with a Nucleo board or similar.

Johi
Senior III

I am using an STM32F407 and an STM32F767ZI on Nucleo and other boards. My configuration is a simple potentiometer as a voltage divider (200 kΩ or 1 kΩ) to generate the analog input voltage.

Even with a stable voltage on the ADC input (verified with my Fluke 117 and PicoScope 5000 at 2.000 V, delta < 5 mV), the values read by the ADC vary over a range of about 100 mV (0.117 V).

My questions:

1) What is a typical accuracy one can expect from such a simple configuration?

2) What is the major contributor to this inaccuracy, and what can be done about it?

Note:

Detailed analysis added as an attachment.

1 ACCEPTED SOLUTION

Johi
Senior III

Changing from PA0 to PA3 on the Nucleo board solved my problem. This confirms the noise hypothesis as indicated in the previous posts. The standard deviation dropped to 0.0025 (10 times better than before). This makes sense: PA3 (A0 on CN9) is used as an Arduino analog input on the Nucleo board and is routed appropriately, while PA0 (CN11 pin 28) probably is not.


19 REPLIES
magene
Senior II

If I understand your detailed analysis, it looks like you tried digitizing the reference voltage and got pretty decent results: 5 mV of noise. You may want to try disconnecting your input signal and just grounding the input right on the board to see what you get. If that gives a low-noise result, you may want to think about how your input signal is referenced to the ADC input.

Do you have your voltage divider connected across the VDD of the Nucleo board? If so, where does VDD come from? A USB connector on your PC? That's probably a pretty noisy supply. A quick and easy thing to try is to connect the ground side of your voltage divider to whatever the ADC is using for its ground reference, just in case the Nucleo board designer didn't. You'll have to check the schematic.

Here are some basic things you might want to check.

  1. How fast are you digitizing? Does the noise decrease if you digitize slower? It looks like your code just loops as fast as the processor allows. Try putting a 100 millisecond delay in the loop just to see what happens (see the sketch after this list).
  2. Are you sure your scope is set up for high-speed sampling (100-1000x whatever rate you're digitizing at) with no filtering, so you can really tell if there's high-frequency noise on your signal? I'm guessing the Fluke is making a DC measurement and won't respond to any high-frequency noise that might be on your signal and that the ADC might be seeing. You might also want to try setting your scope to AC coupling.
  3. Is your ADC set up for unipolar or bipolar input? If it's bipolar, your theoretical LSB resolution is half what it would be in unipolar mode. If it's unipolar and the ADC has a 3.3 V reference and 12 bits of resolution, your LSB resolution should be about 3.3/2^12 = 3.3/4096 ≈ 0.81 mV. So the 5 mV of noise you see when you digitize Vref already spans 2 or 3 bits. Regardless, 100 mV of noise definitely seems like too much.
  4. Is your ADC input set up for differential or single-ended? If it's differential, you really have to have a way to make sure your signal is inside the common-mode range of the ADC. There are lots of internet articles on this issue.
  5. Try setting the voltage divider for an output just above 0.0 V, just below the voltage that saturates the ADC, and somewhere in between, and compare the noise.
  6. BTW, I took a quick eyeball look at the data in your analysis. It isn't really all that bad. Try calculating the standard deviation around the mean value (the sketch below does this). If you really want to get fancy, plot a histogram of the data. An occasional high or low value on a bench without decent grounding and shielding isn't totally unexpected.
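
To make items 1 and 6 concrete, here's a rough sketch (not tested on your board): it assumes a CubeMX project where hadc1 is already configured, and that printf is retargeted to a UART; the function name and NUM_SAMPLES are illustrative.

    #include <math.h>    /* sqrt */
    #include <stdio.h>   /* printf, assuming it is retargeted to a UART */
    #include "main.h"    /* typical CubeMX header: pulls in the HAL */

    extern ADC_HandleTypeDef hadc1;  /* configured elsewhere by CubeMX */

    #define NUM_SAMPLES 100

    void sample_and_report(void)
    {
        uint32_t samples[NUM_SAMPLES];
        double sum = 0.0;

        for (int i = 0; i < NUM_SAMPLES; i++)
        {
            HAL_ADC_Start(&hadc1);
            HAL_ADC_PollForConversion(&hadc1, HAL_MAX_DELAY);
            samples[i] = HAL_ADC_GetValue(&hadc1);
            sum += (double)samples[i];
            HAL_Delay(100);  /* item 1: digitize slower */
        }

        double mean = sum / NUM_SAMPLES;
        double var = 0.0;
        for (int i = 0; i < NUM_SAMPLES; i++)
        {
            double d = (double)samples[i] - mean;
            var += d * d;
        }
        double stddev = sqrt(var / (NUM_SAMPLES - 1));  /* item 6 */

        /* 12-bit counts to millivolts, assuming a 3.3 V reference */
        printf("mean %.1f counts, stddev %.2f counts (%.2f mV)\r\n",
               mean, stddev, stddev * 3300.0 / 4096.0);
    }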

Good luck.

raptorhal2
Lead

What is the single sample time in clock cycles?

Johi
Senior III

@raptorhal2 The board's clock frequency is 16 MHz; my sampling code waits until the ADC has completed its conversion:

		  HAL_ADC_Start(&hadc1);                             // start a single conversion
		  HAL_ADC_PollForConversion(&hadc1, HAL_MAX_DELAY);  // block until it completes
		  int raw = HAL_ADC_GetValue(&hadc1);                // read the 12-bit result

The APB clock sits at 8 MHz, well below 14 MHz.

So I do not know the exact sample time in clock cycles, but I think it should be sufficient.

Johi
Senior III

@magene,

Much appreciation for your reply to my post. I am working on a detailed answer with a statistical analysis of the measurements. I am also in the process of purchasing a lab DMM so I can measure the signals and the stability of the power supply. Once I have the data, I will get back to you.

On top of this, I have been reading about true 12-bit performance. One of the topics in the article is that one needs to separate the digital part from the analog part in order to get a real 12 bits of accuracy from a SAR ADC.

So I wonder: what is the effect of integrating the 12-bit ADC into the microcontroller?

Since the signal source is high impedance, best accuracy requires adequate time for the sample-and-hold capacitor in the ADC to charge up after switching to that channel and before conversion starts. For each channel rank configuration there is a sample time setting in clock cycles; ST's examples usually set it to 1.5 cycles, too low for a pot signal. If your setting is low, try substantially increasing it.
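
As a rough illustration, the channel configuration would look like the sketch below (assuming an F4-series HAL project where hadc1 is configured and the pot is on channel 0; the exact enum names vary per family). For scale: on the F4, a 12-bit conversion takes the configured sampling cycles plus 12 ADC clock cycles, so even the maximum 480-cycle setting at an 8 MHz ADC clock is only (480 + 12) / 8 MHz ≈ 61.5 µs per conversion.

    ADC_ChannelConfTypeDef sConfig = {0};
    sConfig.Channel      = ADC_CHANNEL_0;            /* PA0 on the F407 */
    sConfig.Rank         = 1;                        /* first in the scan sequence */
    sConfig.SamplingTime = ADC_SAMPLETIME_480CYCLES; /* longest option on the F4 */
    if (HAL_ADC_ConfigChannel(&hadc1, &sConfig) != HAL_OK)
    {
        Error_Handler();  /* CubeMX-generated error hook */
    }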

AScha.3
Chief III

For reading a high-impedance source like a pot with this kind of ADC, you simply need a big enough low-impedance capacitor (a ceramic cap) directly from the input to GND. Then you will see how good these ADCs can be. And set the sampling time to 7.5 cycles or so, not the absolute minimum.

So solder a 100 nF ceramic cap (X7R) from the input to GND and then tell us...

If you feel a post has answered your question, please click "Accept as Solution".

Ok, thanks for the tip about the cycle time; I will take it along in my tests. It is in line with my observation that a smaller pot gives better results. Connecting the input to ground is also interesting along the same lines (as you proposed). For the rest I will use my lab power supply rather than a potentiometer. I checked the voltage dip during sampling at the input with my scope, but indeed, it could be that the sample-and-hold capacitor of the ADC does not charge fully, and that this cannot be observed by measuring the input. Nice!

Johi
Senior III

@magene @AScha.3

I took many of your suggestions into account and wrote up a detailed case note, attached to this post.

Overall conclusion:

1) Used caps at the supply side of the voltage divider and between GND and the analog input => no solution.

2) Grounded the analog input => still > 25 mV of standard deviation?

3) Changed digitizing speeds (ADC_SAMPLETIME_480CYCLES vs ADC_SAMPLETIME_3CYCLES) => no difference.

4) Used an Arduino Mega2560 to check the hardware: 2.5 mV std deviation vs. 37 mV for the STM32F767ZI Nucleo (the F407VET6 gives the same or very similar results).

5) I did not find an option to set my ADC to bipolar; possibly it is not supported by my MCU. Differential support I also did not come across.

The difference between the STM32 (12-bit) and the ATMEL (10-bit) measurements is still very large?

I do believe I missed something, but what?

AScha.3
Chief III

Just for fun, I made some simple ADC read tests on an H743: without oversampling I get about 0...20 counts of error at 16 bits, single-ended input, 3.3 V reference, and 40 cm of wire at the input -> about 0.5 mV of noise/error. Not that bad. :)
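
If you want to approximate the effect of oversampling without touching the ADC setup, simple software averaging works on any of these parts. A rough sketch, reusing the polling calls shown earlier in the thread (hadc already configured; the function name is illustrative):

    /* Average n polled conversions: for uncorrelated noise, n = 16 cuts
       the RMS noise by sqrt(16) = 4x, i.e. roughly two extra usable bits. */
    uint32_t adc_read_averaged(ADC_HandleTypeDef *hadc, uint32_t n)
    {
        uint32_t acc = 0;
        for (uint32_t i = 0; i < n; i++)
        {
            HAL_ADC_Start(hadc);
            HAL_ADC_PollForConversion(hadc, HAL_MAX_DELAY);
            acc += HAL_ADC_GetValue(hadc);
        }
        return acc / n;  /* e.g. adc_read_averaged(&hadc1, 16) */
    }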

Next: try with a symmetric (differential) input and short, soldered connections.

If you feel a post has answered your question, please click "Accept as Solution".