
STM32G030C6T6: Delay in ADC conversion (scan mode with DMA)



I have set up the ADC to operate in scan mode with two analog channels to convert. The CPU runs at 64 MHz and the ADC at 32 MHz. I am using the HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc) callback to start processing the ADC data.

The ADC is triggered off the TIM3 interrupt, which fires on a 70 µs cadence. I do a small amount of work in the timer ISR for a few µs. The callback happens 30 µs after the timer ISR is invoked! I expected it to be much shorter, on the order of a microsecond. What am I missing?

The code is bare metal.

Appreciate any pointers.
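For reference, a minimal sketch of the kind of setup being described, assuming the STM32G0 HAL with a CubeMX-generated handle. The names hadc1, adc_buf, and adc_start are illustrative, not from the original post; the ADC is assumed to be configured for scan mode with TIM3 TRGO as its hardware trigger:

```c
/* Sketch only -- not the poster's actual code. Assumes the STM32G0 HAL
 * and a CubeMX-generated ADC handle configured for scan mode, DMA, and
 * an external trigger from TIM3 TRGO instead of software start. */
#include "stm32g0xx_hal.h"

extern ADC_HandleTypeDef hadc1;
static uint16_t adc_buf[2];          /* one slot per scanned channel */

void adc_start(void)
{
    /* With TIM3 TRGO as the trigger, the whole 2-channel sequence is
     * started in hardware on each timer event; the CPU is interrupted
     * only once the sequence (and its DMA transfer) completes. */
    HAL_ADC_Start_DMA(&hadc1, (uint32_t *)adc_buf, 2);
}

/* Invoked from the DMA transfer-complete interrupt, i.e. only after BOTH
 * channels have been sampled, converted, and copied into adc_buf. */
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
{
    (void)hadc;
    /* process adc_buf[0], adc_buf[1] here */
}
```

The key point relevant to the question: the callback cannot fire until the full scan sequence has finished converting, so its latency is at minimum the sum of both channels' sampling and conversion times.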



Chief II

Why do you expect it to be shorter? The timer trigger only starts the ADC; how long each conversion in the scan takes is defined by the channel setup (sampling time). Only after both channels have been scanned does the conversion-complete interrupt occur.


> The callback happens 30 uS after the timerIsr invocation

HAL_ADC_ConvCpltCallback is called via an ADC or DMA interrupt, not a timer interrupt.

If the timer triggers the ADC, the ADC still needs to convert every channel in the sequence before the end-of-sequence interrupt fires and the callback runs. That takes time.

Associate II

We are facing the same problem. CPU: STM32G030 at 64 MHz, ADC clock 16 MHz. We are scanning 9 ADC channels with LL_ADC_SAMPLINGTIME_3CYCLES_5, starting conversion with LL_ADC_REG_StartConversion(ADC1) every 31.25 µs. The conversion time of each channel should be (3.5 + 12.5) / 16 MHz = 1 µs. This part is fine; however, the first conversion occurs about 6.16 µs after LL_ADC_REG_StartConversion(ADC1)! Can anyone explain this delay?