Using HAL_Delay inside the DMA callback

AMahm.1
Associate II

Hi everyone, I'm new to MCUs and programming in general, and I'm currently trying to calculate the average of the plateaus from 3 ADC input channels. I'm using exponential smoothing to reduce the noise and an algorithm to detect the plateau. Even though my signals are similar, there's a slight shift between them, always less than 3 ms.
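
For reference, the smoothing step is just a one-line recurrence; here's a minimal sketch (ALPHA is a placeholder, not my actual tuning):

/* Exponential smoothing (EMA): y[n] = alpha*x[n] + (1 - alpha)*y[n-1].
   Smaller ALPHA smooths more but reacts more slowly to the signal. */
#define ALPHA 0.1f

static float ema;

float smooth(float sample)
{
    ema = ALPHA * sample + (1.0f - ALPHA) * ema;
    return ema;
}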

(attached image showing the three signals)

My goal is to run the algorithm for only one channel and, when it detects a plateau, wait 3 ms before starting to average all the channels. Averaging would then stop after a few milliseconds. I'm using HAL_Delay and HAL_GetTick for this, but the values I get don't match reality.
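
Roughly, the timing logic I have in mind looks like this (plateau_detected, latest[] and AVG_MS are placeholders for my detector flag, the latest smoothed samples, and the averaging window):

/* Sketch of the timed averaging; runs inside the main loop. */
if (plateau_detected)                      /* flag set by the plateau detector */
{
    HAL_Delay(3);                          /* let all three channels settle */
    uint32_t t0 = HAL_GetTick();
    uint32_t n = 0;
    float sum[3] = {0};

    while (HAL_GetTick() - t0 < AVG_MS)    /* AVG_MS = averaging window in ms */
    {
        for (int ch = 0; ch < 3; ch++)
            sum[ch] += latest[ch];         /* latest smoothed sample per channel */
        n++;
    }
    /* plateau average per channel = sum[ch] / n */
}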

The noise-reduction and averaging algorithms work for a single channel. There may be smarter ways to do this, but I don't have much experience with it.

I'm looking for advice on this whole process.

4 REPLIES
TDK
Guru

> Using HAL_Delay inside the DMA callback

You need to ensure SysTick has a higher priority (numerically lower) than whatever callback you're using HAL_Delay in. HAL_Delay relies on the SysTick interrupt to advance the tick counter; if your callback runs at equal or higher priority, the tick never increments and HAL_Delay blocks forever.
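
For example (a sketch; DMA2_Stream0_IRQn is a placeholder, use whichever IRQ your ADC's DMA actually uses):

/* Lower number = higher priority on Cortex-M. Let SysTick preempt the DMA IRQ
   so HAL_Delay can still count ticks inside the DMA callback. */
HAL_NVIC_SetPriority(SysTick_IRQn, 0, 0);
HAL_NVIC_SetPriority(DMA2_Stream0_IRQn, 1, 0);  /* placeholder: your ADC's DMA IRQ */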

> it looks like I get different values than reality.

In what way?

> I'm looking for advice on this whole process.

HAL_Delay(3) will wait somewhere between 3 and 4 ms. It is not exact, as the tick increments once per millisecond, so the resolution is coarse. You could use the DWT CYCCNT cycle counter to implement a much more accurate delay.
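
Something like this (a sketch for Cortex-M3/M4/M7 parts; Cortex-M0 has no DWT, so you'd need a hardware timer there instead):

/* Enable the DWT cycle counter once at startup. */
static void dwt_init(void)
{
    CoreDebug->DEMCR |= CoreDebug_DEMCR_TRCENA_Msk;  /* enable the trace block */
    DWT->CYCCNT = 0;
    DWT->CTRL  |= DWT_CTRL_CYCCNTENA_Msk;            /* start counting cycles */
}

/* Busy-wait for a given number of microseconds using CYCCNT. */
static void dwt_delay_us(uint32_t us)
{
    uint32_t start = DWT->CYCCNT;
    uint32_t ticks = us * (SystemCoreClock / 1000000U);
    while ((DWT->CYCCNT - start) < ticks) { }        /* unsigned math handles wrap */
}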

Transferring many values at once, instead of just 3, will be more efficient; that may not matter if the ADC is all you care about. Using a timer to trigger the ADC at exact intervals, transferring via DMA, and processing in the half- and full-complete interrupts is going to be more efficient.
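
A rough sketch of that structure (hadc1, htim3, BUF_LEN and process_samples are assumptions from a typical CubeMX setup; the ADC would be configured for circular DMA with the timer's TRGO as trigger):

#define BUF_LEN 256                      /* total samples: two halves of 128 */
static uint16_t adc_buf[BUF_LEN];

void start_sampling(void)
{
    HAL_ADC_Start_DMA(&hadc1, (uint32_t *)adc_buf, BUF_LEN);  /* circular DMA */
    HAL_TIM_Base_Start(&htim3);          /* timer that triggers each conversion */
}

void HAL_ADC_ConvHalfCpltCallback(ADC_HandleTypeDef *hadc)
{
    process_samples(&adc_buf[0], BUF_LEN / 2);            /* first half ready */
}

void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
{
    process_samples(&adc_buf[BUF_LEN / 2], BUF_LEN / 2);  /* second half ready */
}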

Plateau detection is going to be finicky.

If you feel a post has answered your question, please click "Accept as Solution".

AMahm.1
Associate II

> In what way?

I get lower voltage values than expected. My sensor runs/scans for a certain amount of time, and the number of plateaus is also known. So I'm guessing the delay function doesn't work with the current implementation: when I run the single-channel version, where the delay isn't necessary, I get the expected values, and they match what I read on the oscilloscope and multimeter. For single-channel I don't use DMA, but I'm not sure whether DMA is the problem here.

Thanks for the suggestions on improving efficiency. I've thought about most of them, but the implementation seems a bit complicated for me at the moment. Also, my application isn't sensitive to the delay resolution; I can afford 4 ms delays as well.

> You need to ensure SysTick has a higher priority (numerically lower) than whatever callback you're using HAL_Delay in.

This sounds like the solution to my problem, but I currently have no idea how to check or change the priorities. I'll have to look that up; maybe you can elaborate on that a little.

Thanks again!

TDK
Guru

Hard to troubleshoot everything at once. Break it down into pieces.

I would test HAL_Delay separately: toggle a pin you can watch on the scope to convince yourself it's working correctly.
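
For instance (GPIOA pin 5 is a placeholder; use any free pin):

while (1)
{
    HAL_GPIO_TogglePin(GPIOA, GPIO_PIN_5);  /* placeholder pin */
    HAL_Delay(3);                           /* expect edges every 3-4 ms on the scope */
}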

Then work on the ADC with DC signals to make sure it's returning the values you expect.

Then work on ADC timing to ensure it's triggering at the expected sample rate.

Then integrate it all together.

If you feel a post has answered your question, please click "Accept as Solution".

AMahm.1
Associate II

Thanks for the suggestions. The interrupt priorities for SysTick and the DMA callback were actually the same; after lowering the DMA priority (higher numeric value), everything worked as expected. There are definitely other efficiency improvements possible, as you suggested, but for my purpose it works for now.