Hello there,

I am having a problem calculating the exact ADC conversion time when using DMA ring buffering.

I tried to find it in the STM32F4 datasheet but didn't find it.

I tried to deduce it with a scope (toggling a pin each time the interrupt is handled), but the result doesn't seem right and it doesn't seem to scale linearly.

My system clock is 168 MHz, which I believe gives a 48 MHz peripheral clock. I am using the 12-bit ADC, which takes 15 cycles. When I start the ADC+DMA cycle for a single measurement I get 5.7 µs. For 10 measurements I get 11.4 µs, and 28.4 µs for 20.

Is there an equation I can use to calculate it?
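For what it's worth, the F4 reference manual puts the total conversion time at (sampling cycles + 12 resolution cycles) / ADCCLK for 12-bit mode. A minimal sketch of that arithmetic — the 3-cycle sampling time and 21 MHz ADCCLK defaults here are assumptions, plug in your own prescaler and sample-time settings:

```python
# Pure conversion time of N back-to-back 12-bit conversions.
# sampling_cycles and f_adc_hz are assumed defaults, not values
# taken from the post above.

def adc_conversion_time_us(n_samples: int,
                           sampling_cycles: int = 3,
                           resolution_cycles: int = 12,
                           f_adc_hz: float = 21e6) -> float:
    """Microseconds for n_samples conversions (no DMA/IRQ overhead)."""
    cycles = n_samples * (sampling_cycles + resolution_cycles)
    return cycles / f_adc_hz * 1e6

print(adc_conversion_time_us(1))    # one 15-cycle conversion at 21 MHz
print(adc_conversion_time_us(10))   # ten conversions
```

Note this is only the ADC's own conversion time; the scope measurements also include interrupt entry/exit and pin-toggle latency, so they won't match this number exactly.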


I'm not a fan of grinding the ADC like that; I'd rather pace it with a timer at a rate I choose.
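A rough sketch of what timer-paced sampling can look like with the STM32 HAL — this is an illustration under assumptions, not tested code: the handle names (hadc1, htim2), buffer size, and the choice of TIM2 TRGO as trigger are all mine, and the init fields shown in the comment would normally be set in CubeMX or your own init code.

```c
/* Timer-paced ADC + DMA sketch for an STM32F4 (HAL). Assumptions:
 * hadc1/htim2 are already initialised elsewhere, and the ADC init
 * was configured for external triggering rather than continuous mode:
 *   hadc1.Init.ContinuousConvMode   = DISABLE;
 *   hadc1.Init.ExternalTrigConv     = ADC_EXTERNALTRIGCONV_T2_TRGO;
 *   hadc1.Init.ExternalTrigConvEdge = ADC_EXTERNALTRIGCONVEDGE_RISING;
 */
#include "stm32f4xx_hal.h"

#define BUF_LEN 32
static uint16_t adc_buf[BUF_LEN];

extern ADC_HandleTypeDef hadc1;  /* set up by your init code */
extern TIM_HandleTypeDef htim2;  /* its TRGO event paces the conversions */

void start_paced_sampling(void)
{
    HAL_TIM_Base_Start(&htim2);  /* timer period = chosen sample interval */
    HAL_ADC_Start_DMA(&hadc1, (uint32_t *)adc_buf, BUF_LEN);
}
```

With this arrangement the sample rate comes from the timer's prescaler/period, so it stays fixed regardless of ADC sampling-time settings, and the DMA half/full-transfer callbacks arrive at predictable intervals.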