
Auto Delay ADC STM32F3

schellek
Associate II

Hello Community,

I'm currently programming a framework for the STM32F3 ADC, but I've run into a problem. I want my ADC to run continuously, and I expect "long" delays between reading the channels. I thought this would be no problem, since the ADC has the Auto Delay functionality. I attached a similar program flow; just as in my real program, the EOS flag is always set after the delay and only buffer[0] is written. I have already searched the reference manual like crazy. Can someone spot my mistake?
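(The attachment is an image, so here is only a minimal register-level sketch of the flow described above; continuous mode plus AUTDLY with a HAL_Delay between channel reads is assumed, and NUM_CHANNELS, buffer and the function name are placeholders, not the original code.)

#include "stm32f3xx_hal.h"   /* HAL_Delay() and the ADC1 register definitions */

#define NUM_CHANNELS  3u               /* placeholder channel count */
static uint16_t buffer[NUM_CHANNELS];  /* placeholder result buffer */

void adc_poll_with_auto_delay(void)
{
  /* ADC clocking, calibration and ADEN handling omitted for brevity */
  ADC1->CFGR |= ADC_CFGR_CONT | ADC_CFGR_AUTDLY;  /* continuous + auto delay */
  ADC1->CR   |= ADC_CR_ADSTART;                   /* start regular conversions */

  for (uint32_t i = 0u; i < NUM_CHANNELS; ++i)
  {
    HAL_Delay(1);                                 /* "long" gap between reads */
    while ((ADC1->ISR & ADC_ISR_EOC) == 0u) { }   /* wait for end of conversion */
    buffer[i] = (uint16_t)ADC1->DR;               /* reading DR clears EOC */
  }
  /* Reported symptom: EOS is already set after the delay and only
     buffer[0] ever receives a value. */
}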

Regards

8 REPLIES
TDK
Guru

Does it work if you take out the HAL_Delay(1); line? Probably all the conversions are overwriting each other. You need to use DMA for multiple channels, or be more responsive to the EOC flag.
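A minimal sketch of the DMA route, assuming a HAL project where hadc1 is already configured for a multi-channel regular scan with a DMA channel attached; hadc1, NUM_CHANNELS and adc_buf are assumed names, not from the original posts:

#include "stm32f3xx_hal.h"

extern ADC_HandleTypeDef hadc1;        /* assumed: set up elsewhere (e.g. CubeMX) */
#define NUM_CHANNELS  3u
static uint16_t adc_buf[NUM_CHANNELS];

void adc_start_with_dma(void)
{
  /* DMA stores each conversion into adc_buf as it completes, so results are
     not lost even if the CPU only looks at them much later. */
  HAL_ADC_Start_DMA(&hadc1, (uint32_t *)adc_buf, NUM_CHANNELS);
}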

If you feel a post has answered your question, please click "Accept as Solution".
schellek
Associate II

Hmm, no, it doesn't actually. It worked when I used (EOC) interrupts for it; however, I want this to be independent of DMA and interrupts.

schellek
Associate II

[Attached image 0693W000003Bh0wQAC.png: screenshot of the working code]

This code works, but I really don't get it. I thought that with Auto Delay the next conversion waits until the DR is read (i.e. the EOC flag is cleared).

Don't you observe it in the debugger?

JW

schellek
Associate II

I do, and the desired CFGR bits are set; however, the ADC isn't stopped by the debugger. So when I break execution and refresh the DR register, it always holds a different value.

TDK
Guru

This figure suggests your original logic would work just fine:

[Attached image 0693W000003BjmxQAC.png: figure illustrating auto-delayed conversion behaviour]

You're probably missing something else.

I wish the F4 had the same functionality.

If you feel a post has answered your question, please click "Accept as Solution".

>> Don't you observe it in the debugger?

> I do

Okay, so that's why the conversion doesn't stay in the DLY state: you read out ADC_DR with the debugger, thus clearing EOC. Exactly as in the "classical" case of UART and SPI I'm talking about in that link, and as discussed here so many times.

Stop observing the ADC with the debugger; observe only the memory where the program writes the data.
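A small sketch of that approach, reusing the placeholder names from the earlier snippet: keep the single DR read in the code and point the debugger watch at the RAM copy, never at ADC1->DR itself.

volatile uint16_t adc_results[NUM_CHANNELS];    /* put the debugger watch here */

for (uint32_t i = 0u; i < NUM_CHANNELS; ++i)
{
  while ((ADC1->ISR & ADC_ISR_EOC) == 0u) { }
  adc_results[i] = (uint16_t)ADC1->DR;          /* the program's only DR read;
                                                   a live watch on ADC1->DR would
                                                   also read it and clear EOC */
}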

JW

schellek
Associate II

That was my guess as well. Since it's a framework/driver, it should be debuggable. I'm currently trying to work around it with discontinuous mode with n=1, starting the ADC each time after I read the register. However, at a certain point it stops working. But that's a subject for another thread 😉
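A rough sketch of the kind of workaround described above (not the original code), assuming software-triggered regular conversions and the same placeholder names as before; DISCEN with DISCNUM left at 0 converts one channel per start:

ADC1->CFGR &= ~ADC_CFGR_CONT;     /* CONT and DISCEN must not be set together */
ADC1->CFGR |=  ADC_CFGR_DISCEN;   /* discontinuous mode, n = 1 */

for (uint32_t i = 0u; i < NUM_CHANNELS; ++i)
{
  ADC1->CR |= ADC_CR_ADSTART;                  /* start the next channel */
  while ((ADC1->ISR & ADC_ISR_EOC) == 0u) { }
  buffer[i] = (uint16_t)ADC1->DR;
  /* arbitrary delay / other work can go here before the next start */
}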

Thank you both anyway 🙂