
STM32H7 HAL_Delay twice as long

regjoe
Senior

Hello,

I'm on an H753, external clock, CPU clock configured to 400 MHz. The setup code is generated by CubeMX.

If I execute a HAL_Delay(), it takes twice the expected time, e.g. 2 ms instead of 1 ms. I checked via GPIO toggle with a scope.

The MCO output is correct, so I guess the CPU clock is correct.

The system core clock variable (SystemCoreClock) is calculated/set to 400 MHz after system clock initialization.

I remember observing the same problem on another M7 chip, the SAME70.

I fixed it there by patching the SysTick setup (HAL_SYSTICK_Config), but I'd like to know what the problem is here.

It seems to me the SysTick is fed with 200 MHz, not the 400 MHz suggested in the CubeMX clock tree.

Has anybody experienced this, or can anyone share information regarding SysTick?

Thanks,

Jochen


5 REPLIES
AScha.3
Super User

Hi,

Did you set the HSE to the real value that's actually on your board?

[screenshot: CubeMX clock configuration showing the HSE input frequency]

ok ?

Then read about the delay being +/- 1 tick:

http://www.efton.sk/STM32/gotcha/g13.html

Delay wrong? Then do some tests:

1. Make a 100 ms HAL delay, writing hi/lo to a pin, and check with a scope: is it 100 ms +/- 1 ms?

2. If wrong, use and set the MCO to output some fraction of the core clock (see the sketch below)

[screenshot: CubeMX MCO output selection]

...and check with a scope. Then you know what clock speed is actually running inside.
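A minimal sketch of that MCO check with the HAL, under the assumption that MCO2 (PC9) is used with SYSCLK divided by 10, so a 400 MHz core clock should show up as roughly 40 MHz on the pin; the H7 HAL_RCC_MCOConfig() typically also configures the GPIO pin itself:

/* Sketch (assumptions: MCO2, source = SYSCLK, divider = 10).
   With SYSCLK = 400 MHz this should give about 40 MHz on PC9. */
HAL_RCC_MCOConfig(RCC_MCO2, RCC_MCO2SOURCE_SYSCLK, RCC_MCODIV_10);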

If you feel a post has answered your question, please click "Accept as Solution".
TDK
Super User

HAL_Delay(1) will wait for between 1 ms and 2 ms to guarantee a minimum wait of 1 ms. This is the expected behavior.

It uses an underlying tick rate of 1 kHz so the resolution can only be +/- 1 ms based on when the tick happens.

 

If you need more precision, consider using a timer or DWT->CYCCNT.

 

 

/**
  * @brief This function provides minimum delay (in milliseconds) based
  *        on variable incremented.
  * @note In the default implementation , SysTick timer is the source of time base.
  *       It is used to generate interrupts at regular time intervals where uwTick
  *       is incremented.
  * @note This function is declared as __weak to be overwritten in case of other
  *       implementations in user file.
  * @param Delay specifies the delay time length, in milliseconds.
  * @retval None
  */
__weak void HAL_Delay(uint32_t Delay)
{
  uint32_t tickstart = HAL_GetTick();
  uint32_t wait = Delay;

  /* Add a freq to guarantee minimum wait */
  if (wait < HAL_MAX_DELAY)
  {
    wait += (uint32_t)(uwTickFreq);
  }

  while ((HAL_GetTick() - tickstart) < wait)
  {
  }
}
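As a sketch of the DWT->CYCCNT approach mentioned above (the helper names cyccnt_init/delay_us are made up for illustration; it assumes SystemCoreClock holds the real core frequency):

#include "stm32h7xx.h"   /* CMSIS device header: DWT, CoreDebug, SystemCoreClock */

/* One-time setup: enable trace and the free-running cycle counter.
   Some Cortex-M7 parts may additionally require unlocking the DWT lock access register. */
static void cyccnt_init(void)
{
  CoreDebug->DEMCR |= CoreDebug_DEMCR_TRCENA_Msk;
  DWT->CYCCNT = 0U;
  DWT->CTRL  |= DWT_CTRL_CYCCNTENA_Msk;
}

/* Busy-wait for the given number of microseconds.
   Unsigned subtraction handles counter wrap-around. */
static void delay_us(uint32_t us)
{
  uint32_t start  = DWT->CYCCNT;
  uint32_t cycles = us * (SystemCoreClock / 1000000U);

  while ((DWT->CYCCNT - start) < cycles)
  {
  }
}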

 

If you feel a post has answered your question, please click "Accept as Solution".

I would have expected jitter between 1 and 2 ms, not always exactly a 2 ms delay. But your hint regarding the resolution problem / minimum delay sounds plausible. I'll do the 100 ms test tomorrow and get back here. Thanks.

If you call it right after a tick happens, it'll be 2 ms consistently. For example, the second call here will always be 2 ms.

 

HAL_Delay(1);
HAL_Delay(1);


Or if it’s a loop with other stuff that happens nearly instantly, it’ll always be 2 ms.

while (1) {
    HAL_GPIO_TogglePin(...);
    HAL_Delay(1);
}

 

If you feel a post has answered your question, please click "Accept as Solution".

HAL_Delay(1) -> 2ms

HAL_Delay(10) -> 11ms

...

So this is the expected behavior when used in a loop like the one shown above.

If a more precise delay of 1-10 ms is required most of the time, should I implement another timer and call HAL_IncTick() in its interrupt? Note that FreeRTOS is driven by the SysTick IRQ. Or should I implement my own delay function, e.g. using DWT->CYCCNT?
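If SysTick must stay dedicated to FreeRTOS, a minimal sketch of the timer option is to move the HAL tick onto a spare basic timer by overriding the weak HAL_InitTick() (essentially what CubeMX generates when a TIM is selected as the HAL timebase). The names and numbers below are assumptions: TIM7 is taken as free with a 200 MHz kernel clock, so adjust the prescaler to your clock tree, and make sure no other HAL_TIM_PeriodElapsedCallback() definition conflicts with this one:

#include "stm32h7xx_hal.h"

/* Sketch: drive the 1 kHz HAL tick from TIM7 instead of SysTick
   (assumption: TIM7 kernel clock = 200 MHz). */
static TIM_HandleTypeDef htim7;

HAL_StatusTypeDef HAL_InitTick(uint32_t TickPriority)
{
  __HAL_RCC_TIM7_CLK_ENABLE();

  htim7.Instance         = TIM7;
  htim7.Init.Prescaler   = 200U - 1U;   /* 200 MHz / 200 = 1 MHz counter clock */
  htim7.Init.Period      = 1000U - 1U;  /* 1 MHz / 1000  = 1 kHz update rate   */
  htim7.Init.CounterMode = TIM_COUNTERMODE_UP;
  if (HAL_TIM_Base_Init(&htim7) != HAL_OK)
  {
    return HAL_ERROR;
  }

  HAL_NVIC_SetPriority(TIM7_IRQn, TickPriority, 0U);
  HAL_NVIC_EnableIRQ(TIM7_IRQn);

  return HAL_TIM_Base_Start_IT(&htim7); /* update interrupt drives the tick */
}

void TIM7_IRQHandler(void)
{
  HAL_TIM_IRQHandler(&htim7);
}

void HAL_TIM_PeriodElapsedCallback(TIM_HandleTypeDef *htim)
{
  if (htim->Instance == TIM7)
  {
    HAL_IncTick();                      /* advance uwTick once per millisecond */
  }
}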