ADC with DMA first time shows correct value, second time a lot lower. STM32L412 running at 1.8V

Linas L
Senior II

Hello,

I have an ADC that reads the battery voltage (1024 samples via DMA; after the transfer-complete interrupt I calculate the average, and it is stable to the bit!).
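For illustration, the averaging step described above can be sketched like this (buffer size is from the post; the function name is my own, not from the original code):

```c
#include <stdint.h>

#define ADC_SAMPLES 1024U

/* Rounded integer mean of n raw ADC samples. With 12-bit samples and
 * n = 1024 the 32-bit sum cannot overflow (1024 * 4095 < 2^32). */
uint16_t adc_average(const uint16_t *buf, uint32_t n)
{
    uint32_t sum = 0;
    for (uint32_t i = 0; i < n; i++)
        sum += buf[i];
    return (uint16_t)((sum + n / 2U) / n);  /* +n/2 rounds to nearest */
}
```

Calling this from the DMA transfer-complete interrupt on the filled buffer gives the bit-stable average the post describes.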

The first time I trigger the ADC I get 4.11 V (the correct voltage). After 2 s I trigger the ADC again and get 4.05 V, and from then on it stays at 4.05 V. The supply and reference are stable, I checked (stepping with the debugger).

DMA is in normal mode, so I get 1024 samples per trigger.

What is strange: if I use circular DMA, the value holds, but if I stop anywhere with the debugger, the voltage drops to 4.05 V.

Any idea why this can be happening?

1 ACCEPTED SOLUTION

Accepted Solutions

OK, it looks like I found the problem.

I need to run the ADC calibration just before I trigger the ADC, and the ADC must then run without a delay after that single calibration. If I stop, I need to re-calibrate the ADC.

My ADC init code had an if condition checking whether calibration had already been performed, but it ran only once at start-up. So now I re-calibrate every time I start a burst ADC acquisition.
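The fix described above could look roughly like this (a sketch, assuming STM32L4 LL drivers, single-ended channels, and that the ADC is disabled with its regulator on before calibrating; not the poster's actual code):

```c
/* Re-calibrate immediately before every burst acquisition. */
void adc_recalibrate_and_start(void)
{
  LL_ADC_StartCalibration(ADC1, LL_ADC_SINGLE_ENDED);
  while (LL_ADC_IsCalibrationOnGoing(ADC1))
    ;                                   /* wait for ADCAL to clear */

  LL_ADC_Enable(ADC1);
  while (!LL_ADC_IsActiveFlag_ADRDY(ADC1))
    ;                                   /* wait until ADC is ready */

  LL_ADC_REG_StartConversion(ADC1);     /* kick off the DMA burst */
}
```

The key point is that calibration happens on every burst, not just once at init, so a debugger stop (or any deactivation) can no longer leave a stale calibration in place.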



In some STM32s the ADC has a flaw where, after a 1 ms or longer delay, the first ADC conversion is invalid. Check the errata for your STM32 model.

JW

Hello,

Indeed, there is such a problem in the errata. But it only applies to single measurements. I am making a burst of 3×1024 (1024 samples of temperature, 1024 of battery voltage, and 1024 of the sensor, to get an ultra-stable result).

If my first measurement were incorrect, it would not be a big problem, since all the others would pull it to the correct value. (Attached screenshot of the readings.) It's not much, around 50 counts. (Rank 1 is the sensor, rank 2 is the battery (1.8 V reference, and the battery goes through a divide-by-3 divider, so 4.2 V = 1.4 V at the ADC), and rank 3 is temperature.)
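For reference, the divider math above works out as follows (a host-side sketch; the function name is illustrative): with a 1.8 V reference, 12-bit resolution, and a divide-by-3 divider, a 4.2 V battery reads about 3185 counts, and the 50-count error mentioned corresponds to about 65 mV, which matches the observed 4.11 V → 4.05 V drop.

```c
#include <stdint.h>

/* Convert a raw 12-bit ADC reading to battery millivolts, assuming a
 * 1.8 V reference and a divide-by-3 input divider (values from the post).
 * Worst case 4095 * 5400 = 22,113,000 fits comfortably in 32 bits. */
uint32_t adc_to_battery_mv(uint16_t counts)
{
    return ((uint32_t)counts * 1800U * 3U) / 4095U;
}
```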

As I stated, when I use DMA in circular mode, conversions happen all the time. If I just press Run in the debugger, I get the higher, correct voltage; but if I stop at some point and then let it run again, it drops to the lower voltage and stays there. Very strange. :\


Humm.

What's the content of ADC_CALFACT just before and after the bursts? Do you switch the clocks, or the ADC itself, off and on?

JW
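(For anyone following along: the calibration factor JW asks about can be read back with the LL driver, e.g. `LL_ADC_GetCalibrationFactor(ADC1, LL_ADC_SINGLE_ENDED)` on an enabled STM32L4 ADC. This is a sketch of the inspection, not code from the thread:)

```c
/* Log CALFACT around a burst to see whether calibration data survives
   (assumes STM32L4 LL drivers and an enabled ADC when reading). */
uint32_t calfact_before = LL_ADC_GetCalibrationFactor(ADC1, LL_ADC_SINGLE_ENDED);
/* ... run burst acquisition ... */
uint32_t calfact_after  = LL_ADC_GetCalibrationFactor(ADC1, LL_ADC_SINGLE_ENDED);
```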

Yes, I do a complete ADC disable.

void ADC_DEACTIVATE(void)
{
  uint32_t timeout = 0xFFFFF;

  ADC_MOSFET_STATE(0); // disable pull-down used for resistance measurement

  /* Disable the ADC and wait, with a timeout, until the disable completes */
  if (LL_ADC_IsEnabled(ADC1) != 0)
  {
    LL_ADC_Disable(ADC1);
    while (LL_ADC_IsDisableOngoing(ADC1) != 0)
    {
      timeout--;
      if (timeout == 0)
      {
        break;
      }
    }
  }

  /* Power down: regulator off, deep power-down on.
     Note: entering deep power-down loses the calibration content,
     so the ADC must be re-calibrated after the next wake-up. */
  LL_ADC_DisableInternalRegulator(ADC1);
  LL_ADC_EnableDeepPowerDown(ADC1);

  /* Clear stale status flags and reset the peripheral */
  LL_ADC_ClearFlag_ADRDY(ADC1);
  LL_ADC_ClearFlag_EOC(ADC1);
  LL_ADC_ClearFlag_EOS(ADC1);
  LL_ADC_CommonDeInit(ADC12_COMMON);
  LL_ADC_DeInit(ADC1);
}
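Since deep power-down wipes the calibration, the matching re-activation has to exit deep power-down, restart the internal regulator, wait out its start-up time, and then calibrate. A sketch of that sequence (assuming STM32L4 LL drivers; the delay loop is a crude illustrative placeholder for the ~20 µs t_ADCVREG_STUP wait):

```c
void ADC_ACTIVATE(void)
{
  /* Wake the ADC from deep power-down and restart its regulator */
  LL_ADC_DisableDeepPowerDown(ADC1);
  LL_ADC_EnableInternalRegulator(ADC1);

  /* Wait t_ADCVREG_STUP before calibrating (crude delay, illustrative) */
  for (volatile uint32_t i = 0; i < 1000U; i++)
    ;

  /* Calibration content was lost in deep power-down: calibrate again */
  LL_ADC_StartCalibration(ADC1, LL_ADC_SINGLE_ENDED);
  while (LL_ADC_IsCalibrationOnGoing(ADC1))
    ;
}
```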

OK, as usual the problem was between the keyboard and the chair. I was disabling the ADC, but the calibration status was not a register; it was a global variable that was set the first time and not the second. My program was trying to recover from a non-working ADC by forcing an ADC re-enable, but, as I stated before, that global calibration flag was not reset after the complete ADC deactivation. And my stepping with the debugger just happened to coincide with the ADC recovery timeout kicking in.

So now it's all clear 🙂

Thank you for your time, sir!