How to avoid delay when reading ADC values on an STM32?

RShre.2
Associate III

I am using an STM32-G07RB and reading sine signals passed through an LPF circuit. There is a delay between the signal taken directly from the circuit and the same signal passed through the MCU (ADC input, then DAC output) for oscilloscope display. The ADC is triggered by a timer interrupt with a sampling rate of 50 kHz, and the ADC resolution is 12 bits.

I see that the delay decreases as the sampling rate increases; however, 50 kHz is the limit for my project, so I cannot go beyond that. I might even have to lower it once a few more tasks are added.

I read that for 12-bit resolution the conversion time is 12.5 ADC cycles and the sampling time is 2.5 cycles. I am not sure how exactly this affects my signal reading, but I assume the conversion and sampling times have something to do with the delay. I thought of decreasing the resolution to increase the speed, if that makes sense, but the amplitude of my signal is quite small (around 50 mV), so I am not sure whether a lower resolution would cause problems.

Is there a way to make the ADC value reading faster? The delay also increases with signal frequency.

In the figure, the red trace is from the MCU DAC while the blue one is taken directly from the circuit.
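A quick back-of-envelope check (not from the thread; the 16 MHz ADC clock is an assumption, the cycle counts are the ones quoted above) shows that the conversion itself is well under a microsecond, while one 50 kHz sample period alone is already 20 µs, so the sample period, not the conversion time, dominates the latency budget:

#include <stdio.h>

int main(void)
{
    const double adc_clk_hz  = 16e6;          /* assumed ADC clock             */
    const double adc_cycles  = 12.5 + 2.5;    /* conversion + sampling cycles  */
    const double t_conv_us   = adc_cycles / adc_clk_hz * 1e6;
    const double t_sample_us = 1e6 / 50e3;    /* time between 50 kHz triggers  */

    printf("total ADC conversion time : %.2f us\n", t_conv_us);   /* ~0.94 us */
    printf("50 kHz sample period      : %.2f us\n", t_sample_us); /* 20.00 us */
    return 0;
}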

4 REPLIES
AScha.3
Chief II

You cannot "avoid delay" - you can only make it as short as possible (with the chosen ADC/CPU).

Minimum delay would be about 0.5 µs for the ADC, 0.5 µs for the CPU to move the data, 0.5 µs for the DAC, plus the loop, so about 2 µs of delay should be possible if you do no calculations on the CPU. But what is that good for? With some filtering or other processing of the data, the delay will naturally increase.
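As an illustration of "no calculations on the CPU", a sketch only (assuming the device header stm32g0xx.h is included, ADC1 is already running in continuous mode, and DAC channel 1 is enabled), the CPU does nothing but shovel the data register into the DAC:

while (1)
{
    while ((ADC1->ISR & ADC_ISR_EOC) == 0) { }  /* wait for end of conversion */
    DAC1->DHR12R1 = ADC1->DR;                   /* reading DR also clears EOC */
}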

KnarfB
Principal III

Conversion and sampling times should be a no-brainer for you; 2.5 Msamples/s are possible. The ADC can have its own clock (PLLP), so you can test various ADC and CPU clock settings and check their influence. The filter length will also determine the delay, as will the filter implementation (unintended float emulation? trigonometric functions? ...).
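On the filter-implementation point: the G0 is a Cortex-M0+ without an FPU, so any float arithmetic in the filter is emulated in software and costs many cycles. A small sketch (function name and filter length are made up for illustration) of keeping a simple filter in pure integer math:

#include <stdint.h>

/* 4-tap moving average: one add, one subtract, one shift per sample,
 * no floating-point emulation involved. */
uint16_t filter_ma4(uint16_t x)
{
    static uint16_t buf[4];
    static uint32_t sum;
    static unsigned idx;

    sum += x;
    sum -= buf[idx];
    buf[idx] = x;
    idx = (idx + 1U) & 3U;
    return (uint16_t)(sum >> 2);
}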

hth

KnarfB

RShre.2
Associate III

DAC1->DHR12R1 = u_ref_1;             // reference value sent to the circuit
u_adc_1 = HAL_ADC_GetValue(&hadc1);  // read back the ADC result

DAC1->DHR12R2 = u_adc_1;             // output the ADC sample on DAC channel 2
No matter how long my code is, if I am reading the ADC as soon as I have sent the value to the circuit, it should have the lowest possible delay, right?

I am getting a delay of 30 µs. 2 µs would have been understandable.
 
I understand that the filter adds delay, but here both of the analyzed signals have passed through the filter. It is just that one is taken directly from the circuit and the other is passed through the MCU (ADC) and then out of the DAC. They should both have the same phase delay from the filter.
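One possible explanation for the roughly 30 µs, not spelled out in the thread: HAL_ADC_GetValue() only reads the ADC data register, so if it is called right after the timer event that also triggers the conversion, it returns the previous sample. The output then lags by about one 20 µs sample period plus conversion time and interrupt/DAC overhead, which is in the ballpark of the measured 30 µs. A sketch of one possible rearrangement (assuming the ADC is timer-triggered with its interrupt enabled) is to forward the sample in the end-of-conversion callback instead of waiting for the next timer tick:

void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
{
    if (hadc->Instance == ADC1)
    {
        DAC1->DHR12R2 = HAL_ADC_GetValue(hadc);  /* forward the fresh sample right away */
    }
}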

AScha.3
Chief II

>No matter how long my code is

No. For a first test of speed: no code, except a while loop (adc_get -> dac_set), as sketched below.

And the ADC setting: what clock?

Then you will see the maximum speed (with the HAL calls; these alone may need some 1..2 µs, HAL is not the fastest on the planet).
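A minimal sketch of such a test loop, assuming hadc1 and hdac1 are the CubeMX-generated handles, the ADC is in continuous mode, and DAC channel 2 is enabled:

HAL_ADC_Start(&hadc1);
HAL_DAC_Start(&hdac1, DAC_CHANNEL_2);

while (1)
{
    HAL_ADC_PollForConversion(&hadc1, HAL_MAX_DELAY);  /* wait for a new sample        */
    DAC1->DHR12R2 = HAL_ADC_GetValue(&hadc1);          /* write it straight to the DAC */
}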
