Calculate ADC DMA Buffer Full Period

lgacnik97
Associate III

I'm trying to calculate the period of the ADC DMA interrupt that is triggered when the destination buffer is completely full. As I understand it, the period is:

T_dmaDone = [(T_sample + T_conv) * sampleCount * channelCount] / F_adc

In my case each rank is set to T_sample = 41.5 cycles, the conversion time is fixed at T_conv = 12.5 cycles, the sample count is 100 per channel, the channel (as well as rank) count is 14, and F_adc = 12 MHz. The resulting period when the DMA is done and the buffer is full is T_dmaDone = 6.3 ms. However, after inspecting ADC_DMAConvCplt() and placing a GPIO pin toggle directly in HAL_ADC_ConvCpltCallback(), with no additional processing done inside the user callback, I measured T_dmaDone = 4.64 ms.
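For reference, the expected value can be reproduced with a minimal plain-C sketch of the calculation (the variable names here are mine, not from HAL):

```c
#include <stdio.h>

int main(void)
{
    /* Values quoted above: 41.5 + 12.5 ADC cycles per conversion,
       100 samples per channel, 14 channels, 12 MHz ADC clock. */
    const double cycles_per_conversion = 41.5 + 12.5;
    const double samples_per_channel   = 100.0;
    const double channel_count         = 14.0;
    const double f_adc_hz              = 12e6;

    const double t_dma_done = cycles_per_conversion * samples_per_channel
                              * channel_count / f_adc_hz;
    printf("Expected T_dmaDone = %.2f ms\n", t_dma_done * 1e3); /* prints 6.30 ms */
    return 0;
}
```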

If I scale the measurement by multiplying the samples taken per channel by a factor of 10, the measured period also scales linearly. Where is the error in my calculation? Is the provided equation valid?
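For reference, the measurement setup described above amounts to something like the following sketch; the header, port, and pin are placeholders for the actual board setup:

```c
#include "stm32f1xx_hal.h" /* placeholder; use the header for the actual MCU family */

/* HAL invokes this callback when the DMA destination buffer is
   completely full, so the time between consecutive edges on the
   toggled pin equals T_dmaDone. */
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
{
    (void)hadc;                            /* unused here */
    HAL_GPIO_TogglePin(GPIOA, GPIO_PIN_5); /* placeholder pin */
}
```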

TDK
Guru

The provided equation is valid, and I do not see any errors in your calculation. Perhaps double-check your ADC clock rate. Another calculation with the max T_sample might also be informative.
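For illustration, here is what those two checks look like with the numbers from the question, as a small sketch. It assumes an STM32F1-class ADC, where 239.5 cycles is the longest available sample time; that assumption is mine, inferred from the 41.5-cycle sample time and 12.5-cycle conversion time quoted above:

```c
#include <stdio.h>

int main(void)
{
    const double total_cycles = (41.5 + 12.5) * 100.0 * 14.0; /* 75600 cycles */

    /* Back-solve the ADC clock implied by the measured 4.64 ms period. */
    printf("Implied F_adc = %.2f MHz\n",
           total_cycles / 4.64e-3 / 1e6); /* prints ~16.29 MHz */

    /* Recompute the period at 12 MHz with the maximum sample time
       (239.5 cycles, assuming an STM32F1-class ADC). */
    const double t_max = (239.5 + 12.5) * 100.0 * 14.0 / 12e6;
    printf("T_dmaDone at max T_sample = %.2f ms\n", t_max * 1e3); /* prints 29.40 ms */
    return 0;
}
```

The back-solved clock comes out much closer to 16 MHz than to 12 MHz, which is why the clock tree configuration is worth verifying.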
