How to convert ADC (with DMA) sample number to time without assuming uniform conversion time?

Kartik1004
Associate II

I'm new to working with ADCs on STM32. I'm reading a rectified voltage using ADC with DMA on an STM32G0B1. I know the ADC conversion time from the datasheet but I want to compare the ADC's output with readings from an oscilloscope. I'm plotting the voltage captured on the oscilloscope for 0.5 s and wish to compare this with the ADC output.

I'm using the ADC in continuous conversion mode with DMA. The clock prescaler is set to "Synchronous clock mode divided by 2" and the sampling time is 1.5 cycles. I'm storing samples in a 2000-length buffer.

However, on the project I'm working on, I've been told not to assume a uniform conversion time between samples. If that assumption held, I could directly convert the sample numbers (i.e. indices) to units of time. Instead, I'm told to measure the conversion time between samples using a timer (say TIM16). I have tested the conversion time in DMA mode on an oscilloscope by setting a GPIO pin HIGH when each conversion completes.

How can I measure the conversion time between samples so that I can plot the oscilloscope voltage and the ADC output on the same graph? I know the timescales of the two plots will differ, but the fundamental objective is to convert the sample number to a time value without assuming a uniform conversion time. Would measuring the conversion time between samples (in microseconds) for the ADC with DMA using timers be the optimal approach? If yes, between which HAL_ADC_xx functions/events should I trigger a timer?

6 REPLIES

> However, on the project I'm working on, I've been told not to assume a uniform conversion time between samples.

Continuous mode means that the ADC starts a new conversion as soon as it finishes the previous one.

The total time between consecutive samples in continuous mode is the sum of the sampling time and the conversion time. Both are derived from the ADC clock and are specified in the reference manual.

What reason would there be to doubt that? I invite whoever told you this to come discuss it here.

JW

Thank you for your response. I discussed this with the supervisor concerned, and they suggested practically verifying the conversion times by storing timestamps from HAL_GetTick() in a buffer and computing the conversion time of the i-th sample from the number of samples and the ticks-to-milliseconds conversion factor.

TDK
Guru

Conversion times are generally much finer than the resolution provided by HAL_GetTick, so that's a non-starter.

There is no notification at the start of a conversion that you could use, and even if you could, the overhead from interrupts or whatever code will introduce enough jitter to destroy whatever precision you wanted in the first place.

If you feel a post has answered your question, please click "Accept as Solution".
Nikita91
Lead II

It appears that your supervisor knows nothing about MCUs, ADCs and DMAs...

I pity you...

> There is no notification at the start of a conversion

If you can disconnect the ADC pin itself from the measurement circuit, try connecting it to GND or some small voltage through a relatively high resistor (say 10 kOhm). Using an oscilloscope, you may then be able to see the moments when a conversion starts, as "dips" when the sampling capacitor starts to charge.

JW

Kartik1004
Associate II

Thank you @TDK  and @waclawek.jan for the insights! We ended up trusting the internal ADC clock.