12-bit ADC precision issue

Karem
Associate

Hello,

We are working on an STM32F429.
The application uses ADC1 to sample data on channels 0-7.

ADC1 and channel characteristics are:

- Mode: Independent
- Sampling clock: around 2.8 MHz
- No DMA access
- Sampling delay: 20 clock periods (max value)
- Resolution: 12 bits
- No scan mode
- No continuous conversion
- Single conversion mode
- Channel sampling time: 480 clock cycles (max value)
- Vref+ = Vdda = 3.3 V
- Vss = 0 V
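
For context, the equivalent HAL initialization would look roughly like this (a sketch under the settings listed above; field names per the STM32F4 HAL, and the prescaler value is an assumption that depends on the PCLK2 frequency):

```c
#include "stm32f4xx_hal.h"

ADC_HandleTypeDef hadc1;

/* ADC1: 12-bit, single conversion, no scan, no continuous mode, no DMA,
 * software-triggered, matching the list above. */
void adc1_init(void)
{
    hadc1.Instance                   = ADC1;
    hadc1.Init.ClockPrescaler        = ADC_CLOCK_SYNC_PCLK_DIV4; /* assumption: whatever yields ~2.8 MHz from PCLK2 */
    hadc1.Init.Resolution            = ADC_RESOLUTION_12B;
    hadc1.Init.ScanConvMode          = DISABLE;
    hadc1.Init.ContinuousConvMode    = DISABLE;
    hadc1.Init.DiscontinuousConvMode = DISABLE;
    hadc1.Init.ExternalTrigConv      = ADC_SOFTWARE_START;
    hadc1.Init.DataAlign             = ADC_DATAALIGN_RIGHT;
    hadc1.Init.NbrOfConversion       = 1;
    hadc1.Init.DMAContinuousRequests = DISABLE;
    hadc1.Init.EOCSelection          = ADC_EOC_SINGLE_CONV;
    HAL_ADC_Init(&hadc1);
}
```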


Each channel is connected to a sensor with a voltage range of 0 to 3.3 V.

The signal on each channel is low frequency.

The sampling period for all channels is 1 second.
Channels are sampled sequentially in the order 0, 1, 2, ..., 7.

This works well, with very good precision for our application.
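
For reference, each per-channel conversion looks roughly like this (a sketch continuing the init above; read_channel is a hypothetical helper name, not our actual code):

```c
/* Select one channel with the maximum sampling time, run one software-
 * triggered conversion, and return the 12-bit result. */
uint16_t read_channel(uint32_t channel)
{
    ADC_ChannelConfTypeDef cfg = {0};
    cfg.Channel      = channel;                   /* ADC_CHANNEL_0 .. ADC_CHANNEL_7 */
    cfg.Rank         = 1;                         /* single-rank sequence */
    cfg.SamplingTime = ADC_SAMPLETIME_480CYCLES;  /* max sampling time */
    HAL_ADC_ConfigChannel(&hadc1, &cfg);

    HAL_ADC_Start(&hadc1);
    HAL_ADC_PollForConversion(&hadc1, HAL_MAX_DELAY);
    uint16_t value = (uint16_t)HAL_ADC_GetValue(&hadc1);
    HAL_ADC_Stop(&hadc1);
    return value;
}
```

This is called once per second for each of channels 0 through 7 in turn.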

 

Now the specific issue we have is:

- When the sensor voltage on channel N is >= 2 V, the voltage measured by the ADC on channel N+1 is lower than its nominal value by 0.3 V.
- Only channel N+1 is affected; all the other channels are not, and the offset is always 0.3 V.

Each channel has its own conditioning chain, independent from the others.


We have ruled out the conditioning chain as the source of the problem, because the input signal on channel N+1 is stable while the voltage measured by the ADC is reduced by 0.3 V.

Our Hypothesis:

We know that during each sampling phase (channel N, for example), the ADC
sampling capacitor is charged up to the channel voltage, 3.3 V for example.
At the end of the conversion, the capacitor is not discharged.

At the next sampling phase (channel N+1), the new channel voltage is, say, 1 V.
When the ADC sampling switch closes, a charge transfer begins between the ADC
capacitor at 3.3 V (the channel N value) and channel N+1 at 1 V, until either
equilibrium is reached or the sampling time elapses.

If the sampling time elapses before equilibrium, the ADC capacitor holds
a voltage different from the real channel voltage, which produces an erroneous value.
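
Put in equation form (a sketch of this hypothesis; R_AIN is the source resistance of the conditioning chain, R_ADC and C_ADC are the ADC's internal input resistance and sampling capacitance from the datasheet):

```latex
V_C(t) = V_{N+1} + \left(V_N - V_{N+1}\right) e^{-t/\tau},
\qquad \tau = \left(R_{\mathrm{AIN}} + R_{\mathrm{ADC}}\right) C_{\mathrm{ADC}}
```

The residual error at the end of the sampling time t_S is then

```latex
\Delta V = \left(V_N - V_{N+1}\right) e^{-t_S/\tau},
\qquad \left|\Delta V\right| < \tfrac{1}{2}\,\mathrm{LSB}
\;\Rightarrow\;
t_S \gtrsim \tau \ln\!\left(2^{13}\right) \approx 9\,\tau
```

(the last bound is for a full-scale step at 12-bit resolution).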

To correct the problem, we have tested two solutions:

Solution 1: Using a dummy channel
We set up an unused ADC channel tied to GND, a "dummy" channel.
We then sample this dummy channel before sampling any real channel.
This approach produces valid data on each channel and solves the problem.

 

Note: when the dummy channel is tied to 3.3 V instead of GND, the problem remains.
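
In code, the workaround amounts to something like the following (building on the hypothetical read_channel helper sketched above; ADC_CHANNEL_8 is an assumption, any unused channel tied to GND would do):

```c
#define DUMMY_CHANNEL ADC_CHANNEL_8  /* assumption: unused input tied to GND */

/* Convert the grounded dummy channel first so the sampling capacitor starts
 * from a known (discharged) state, then take the real measurement. */
uint16_t read_channel_with_dummy(uint32_t channel)
{
    (void)read_channel(DUMMY_CHANNEL);  /* result discarded on purpose */
    return read_channel(channel);
}
```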

 

Solution 2: Reducing the ADC clock frequency
Reducing the ADC clock from 2.8 MHz to 1.4 MHz also solves the problem.
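
With the STM32F4 HAL this is a one-field change (a sketch; the DIV4/DIV8 values are only illustrative, as the actual prescaler depends on your PCLK2 frequency):

```c
/* Halve the ADC clock by doubling the common prescaler, e.g. ~2.8 MHz -> ~1.4 MHz. */
hadc1.Init.ClockPrescaler = ADC_CLOCK_SYNC_PCLK_DIV8;  /* was ADC_CLOCK_SYNC_PCLK_DIV4 */
HAL_ADC_Init(&hadc1);
```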


We don't have a deep understanding of the problem at stake here, so we have two questions:
1 - Has anyone ever had this problem?
2 - Does our hypothesis point to the root cause of the problem?

 

1 REPLY
TDK
Super User

This is an issue with high-impedance sources. It takes a while for the sampling capacitor to charge up.

See section 3.2.7, "Effect of the analog signal source resistance," here:

How to optimize the ADC accuracy in the STM32 MCUs - Application note

There might be some secondary effects happening as well, like charge injection.
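
Rearranging the settling requirement from that section gives a bound on the tolerable source resistance (a sketch; N is the resolution in bits, and the exact constant plus the datasheet values for R_ADC and C_ADC are in the application note):

```latex
R_{\mathrm{AIN}} \lesssim \frac{t_S}{C_{\mathrm{ADC}}\,\ln\!\left(2^{N+1}\right)} - R_{\mathrm{ADC}}
```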

If you feel a post has answered your question, please click "Accept as Solution".