2024-03-06 03:17 AM
I'm trying to calculate the period of the DMA transfer-complete interrupt that the ADC triggers when the destination buffer is completely full. As I understand it, the period is:
T_dmaDone = [(T_sample + T_conv) * sampleCount * channelCount] / F_adc
In my case each rank is set to T_sample = 41.5 cycles, the conversion time is fixed at T_conv = 12.5 cycles, the sample count is 100 per channel, the channel (and rank) count is 14, and F_adc = 12 MHz. The resulting period for the DMA-done / buffer-full event is T_dmaDone = 6.3 ms. However, after inspecting ADC_DMAConvCplt() and placing a GPIO pin toggle directly in it, with no additional processing inside the HAL_ADC_ConvCpltCallback() user callback, I measured T_dmaDone = 4.64 ms.
If I scale the setup by multiplying the samples taken per channel by a factor of 10, the measured period also scales linearly. Where is the error in my calculation? Is the equation above valid?
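For reference, here is a minimal C sketch of how I evaluate the equation with my configuration (the variable names are mine, not HAL identifiers), just to make the arithmetic explicit:

#include <stdio.h>

int main(void)
{
    const double t_sample_cycles = 41.5;    /* sampling time per rank, in ADC clock cycles   */
    const double t_conv_cycles   = 12.5;    /* fixed successive-approximation conversion time */
    const double samples_per_ch  = 100.0;   /* samples per channel in the DMA buffer          */
    const double channel_count   = 14.0;    /* channels / ranks in the regular sequence       */
    const double f_adc_hz        = 12.0e6;  /* ADC clock                                      */

    /* T_dmaDone = (T_sample + T_conv) * sampleCount * channelCount / F_adc */
    double t_dma_done_s = (t_sample_cycles + t_conv_cycles)
                          * samples_per_ch * channel_count / f_adc_hz;

    printf("Expected T_dmaDone = %.2f ms\n", t_dma_done_s * 1e3); /* prints 6.30 ms */
    return 0;
}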
2024-03-06 06:19 AM
The equation is valid, and I don't see any error in your calculation. Perhaps double-check your actual ADC clock rate (prescaler setting). Running the calculation again with the maximum T_sample might also be informative.
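As a rough check (assuming an STM32F1-class ADC, where the longest available sampling time is 239.5 cycles), the same equation with the maximum T_sample gives (239.5 + 12.5) * 100 * 14 / 12 MHz = 29.4 ms. Working backwards from your 4.64 ms measurement, the effective rate would be 54 * 100 * 14 cycles / 4.64 ms ≈ 16.3 MHz, which is why I would verify the ADC clock first.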