Microsecond delay with STM32 F030F4 kit

stark2
Associate II

Hi, I want to create a microsecond delay using an STM32 F030F4 (max CLK = 48 MHz). I based my code on the topic "Microsecond/Nanoseconds delay in STM32" from ControllerTech, but when I measure with a logic analyzer, the timing is wrong! Can anyone tell me why? Thanks!
[Attachments: 12.png, 48M.png]

6 REPLIES
TDK
Guru
void DELAY_TIM_Us(TIM_HandleTypeDef *htim, uint16_t time)
{
	__HAL_TIM_SET_COUNTER(htim, 0);                // reset the counter to zero
	while (__HAL_TIM_GET_COUNTER(htim) < time) {}  // busy-wait until 'time' ticks have elapsed
}

Not the greatest scheme, but it should work ± overhead. Why do you think it's wrong?

If you feel a post has answered your question, please click "Accept as Solution".

How wrong? Perhaps express it in units of time or frequency.

Perhaps your expectations or premise are wrong? Is the MCU actually running at the frequency in question? Could you print out the core and bus clock frequencies to see what the MCU thinks it's running at? Check that HSE_VALUE matches what's actually clocking the MCU.
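A minimal sketch of such a check, assuming printf is retargeted (e.g. to a UART); the HAL_RCC getters are standard stm32f0xx HAL calls:

#include <stdio.h>
#include "stm32f0xx_hal.h"

void print_clocks(void)
{
	printf("SYSCLK = %lu Hz\n", (unsigned long)HAL_RCC_GetSysClockFreq());
	printf("HCLK   = %lu Hz\n", (unsigned long)HAL_RCC_GetHCLKFreq());
	printf("PCLK1  = %lu Hz\n", (unsigned long)HAL_RCC_GetPCLK1Freq());
}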

A TIM clocked at 1 MHz should be able to resolve to within a microsecond or so. A faster clock will have finer granularity; a slower one will obviously be coarser. The MCU can't interrupt at excessively high rates; figure a few hundred kHz in this case, shrinking further the more you do in the handler/callback.
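For reference, a sketch of setting up such a 1 MHz tick, assuming TIM1 on a 48 MHz timer clock (CubeMX generates similar initialization code):

TIM_HandleTypeDef htim1;

void tim1_1mhz_init(void)
{
	__HAL_RCC_TIM1_CLK_ENABLE();
	htim1.Instance = TIM1;
	htim1.Init.Prescaler = 48 - 1;                      // 48 MHz / 48 = 1 MHz, i.e. 1 tick per microsecond
	htim1.Init.CounterMode = TIM_COUNTERMODE_UP;
	htim1.Init.Period = 0xFFFF;                         // count over the full 16-bit range
	htim1.Init.ClockDivision = TIM_CLOCKDIVISION_DIV1;
	HAL_TIM_Base_Init(&htim1);
	HAL_TIM_Base_Start(&htim1);
}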

Tips, Buy me a coffee, or three.. PayPal Venmo
Up vote any posts that you find helpful, it shows what's working..
stark2
Associate II

[Attachment: delayus.png]

When I use the function "DS18B20_DelayUs(&DS1, 1);    // delay 1 us", the result in the logic analyzer is 3.875 us.
Do you know why that is?

That is due to overhead and jitter in your timer. Calling functions takes time, returning from functions takes time, and your timer has an increment of 1 us, so expect up to 1 us of jitter on top of the overhead.

You can get more accuracy by using DWT->CYCCNT, reducing the function calls to a minimum, and compiling your code with optimizations on, but you will always incur some amount of overhead.
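A minimal sketch of a DWT cycle-counter delay; note this assumes a core that implements the DWT cycle counter (Cortex-M3/M4/M7; the Cortex-M0 in the F030 does not have it), so it is illustrative here:

static void dwt_delay_init(void)
{
	CoreDebug->DEMCR |= CoreDebug_DEMCR_TRCENA_Msk;  // enable the trace/debug block
	DWT->CYCCNT = 0;                                 // reset the cycle counter
	DWT->CTRL |= DWT_CTRL_CYCCNTENA_Msk;             // start counting core cycles
}

static void dwt_delay_us(uint32_t us)
{
	const uint32_t start  = DWT->CYCCNT;
	const uint32_t cycles = us * (SystemCoreClock / 1000000U);
	while ((DWT->CYCCNT - start) < cycles) {}        // unsigned subtraction handles wrap-around
}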

Do a 1 s delay (or 1 ms) and see if the actual delay is close to expected. Overhead will be minimal in that case, so it will tell you whether the timer is running at the expected rate.
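For instance, a quick sanity check against HAL_GetTick(), assuming the DELAY_TIM_Us() function above, a timer handle htim1, and retargeted printf (100000 x 10 us gives a nominal 1 s, since the argument is only 16-bit):

uint32_t t0 = HAL_GetTick();
for (uint32_t i = 0; i < 100000; i++)
{
	DELAY_TIM_Us(&htim1, 10);
}
printf("elapsed: %lu ms\n", (unsigned long)(HAL_GetTick() - t0));  // expect roughly 1000 ms plus overhead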

If you feel a post has answered your question, please click "Accept as Solution".
Piranha
Chief II

https://controllerstech.com/create-1-microsecond-delay-stm32/

void delay_us (uint16_t us)
{
	__HAL_TIM_SET_COUNTER(&htim1, 0);  // set the counter value to 0
	while (__HAL_TIM_GET_COUNTER(&htim1) < us);  // wait for the counter to reach the us input parameter
}

The code can easily be converted to be based on a free-running timer, and the delay function would then become reentrant (see the sketch below). This just proves again that the ControllerTech site is made by a complete beginner without any real understanding of software development.
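A sketch of that free-running variant, assuming htim1 ticks at 1 MHz with a full 16-bit period: the counter is never reset, so each caller measures elapsed ticks from its own start point and the function is reentrant.

void delay_us (uint16_t us)
{
	const uint16_t start = __HAL_TIM_GET_COUNTER(&htim1);
	while ((uint16_t)(__HAL_TIM_GET_COUNTER(&htim1) - start) < us);  // 16-bit subtraction handles counter wrap-around
}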