
[Suggestion] When using HAL APIs, setting the timeout to 1 ms is not safe!

cao wentao
Associate II
/**
  * @brief  Transmit an amount of data in blocking mode.
  * @param  hspi: pointer to a SPI_HandleTypeDef structure that contains
  *               the configuration information for SPI module.
  * @param  pData: pointer to data buffer
  * @param  Size: amount of data to be sent
  * @param  Timeout: Timeout duration
  * @retval HAL status
  */
HAL_StatusTypeDef HAL_SPI_Transmit(SPI_HandleTypeDef *hspi, uint8_t *pData, uint16_t Size, uint32_t Timeout)
{
  /* ... */

  /* Init tickstart for timeout management */
  tickstart = HAL_GetTick();

  /* ... */

  /* Timeout management */
  if ((Timeout == 0U) || ((Timeout != HAL_MAX_DELAY) && ((HAL_GetTick() - tickstart) >= Timeout)))
  {
    errorcode = HAL_TIMEOUT;
    goto error;
  }

  /* ... */
}
1. Setting the timeout to 1 ms, the actual wait is less than 1 ms. Sometimes it is very close to 0, which leads to a spurious timeout error (a sketch illustrating this follows the proposed fix below).
2. The timeout value should therefore be >= 2 ms, but the documentation does not point this out.
3. This issue still exists in the latest version, STM32CubeMX V5.5.0. I hope it gets fixed in a future release.
4. An effective solution is to replace ">= Timeout" with "> Timeout":
/* Timeout management */
if ((Timeout == 0U) || ((Timeout != HAL_MAX_DELAY) && ((HAL_GetTick() - tickstart) > Timeout)))
{
    errorcode = HAL_TIMEOUT;
    goto error;
}
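
To illustrate point 1, here is a minimal, self-contained sketch of the effect, using a simulated tick counter in place of HAL_GetTick() (the name sim_tick and the starting value are illustrative only, assuming the default 1 kHz HAL tick):

#include <stdint.h>
#include <stdio.h>

static uint32_t sim_tick = 100U;   /* stand-in for the HAL 1 kHz tick counter */

int main(void)
{
    const uint32_t Timeout = 1U;   /* the 1 ms timeout from the report */

    uint32_t tickstart = sim_tick; /* sampled just before a tick boundary */
    sim_tick++;                    /* the tick interrupt fires almost immediately */

    /* Current HAL check: with ">=" a 1 ms timeout expires after far less
       than 1 ms of real time. */
    if ((sim_tick - tickstart) >= Timeout)
    {
        printf("HAL_TIMEOUT although almost no real time has passed\n");
    }

    /* Proposed check: with ">" the same situation does not time out yet;
       the guaranteed wait becomes at least one full tick. */
    if ((sim_tick - tickstart) > Timeout)
    {
        printf("this branch is not taken\n");
    }

    return 0;
}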


6 REPLIES
JoniS
Senior

One should not use a one-millisecond timeout there. A single interrupt occurring between

/* Init tickstart for timeout management*/
  tickstart = HAL_GetTick();

and

/* Timeout management */
if((Timeout == 0U) || ((Timeout != HAL_MAX_DELAY) && ((HAL_GetTick()-tickstart) >  Timeout)))
{
    errorcode = HAL_TIMEOUT;
    goto error;
}

might make it time out just because of that (most likely it will).

Also, I believe that checking for Timeout being 0 is not needed there, since

(Timeout == 0U)    and    ((HAL_GetTick() - tickstart) >= Timeout)

will both evaluate to true if Timeout is actually 0, so it is effectively checking the same thing twice. One could argue that checking Timeout against 0 is faster than doing the subtraction, which might be the reason it exists, but then it could be tested as the very first thing in the function, with an early exit on the error, so as not to waste any more clock cycles than necessary.

Also, what would happen if you called that function with Timeout equal to HAL_MAX_DELAY? Would the timeout mechanism actually work then? (I think we can safely assume that HAL_MAX_DELAY is not equal to zero.)

/* we can't time out if Timeout == HAL_MAX_DELAY */
((HAL_GetTick() - tickstart) >= Timeout) && (Timeout != HAL_MAX_DELAY)

Do I miss something, or is there actually a pretty nasty bug in there that can hang the whole MCU in an infinite loop?
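
For illustration, a hedged sketch of the restructuring suggested above (the function name wait_flag_with_timeout is made up for this example, and this is not the actual ST implementation; it assumes the device's HAL header is included): test Timeout == 0 once at function entry and treat HAL_MAX_DELAY as an explicit "wait forever" case, so the loop body only has to do the tick subtraction.

/* Illustrative only; HAL-style names, but not the actual ST code.
   Assumes the device HAL header (e.g. the SPI module) is included. */
HAL_StatusTypeDef wait_flag_with_timeout(SPI_HandleTypeDef *hspi, uint32_t Flag, uint32_t Timeout)
{
  /* Early exit: a zero timeout can never be met, so fail immediately
     instead of re-evaluating (Timeout == 0U) on every loop iteration. */
  if (Timeout == 0U)
  {
    return HAL_TIMEOUT;
  }

  uint32_t tickstart = HAL_GetTick();

  while (__HAL_SPI_GET_FLAG(hspi, Flag) == RESET)
  {
    /* HAL_MAX_DELAY means "wait forever": skip the tick comparison. */
    if ((Timeout != HAL_MAX_DELAY) && ((HAL_GetTick() - tickstart) >= Timeout))
    {
      return HAL_TIMEOUT;
    }
  }

  return HAL_OK;
}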

Thanks for your reply.

#define HAL_MAX_DELAY      0xFFFFFFFFU

1. Setting the timeout to HAL_MAX_DELAY, the timeout mechanism stops working.

2. I think it's not a bug; ST designed it like this on purpose.

3. I guess ST wants it to wait until the condition is met when the timeout is set to HAL_MAX_DELAY.

S.Ma
Principal

It is open to interpretation.

In my case, when I set a delay, the real delay must be at least that long (round up).

There are two ways to do this:

With a 1 µs hardware timer, setting 10001 will do it.

With a 1 ms tick, add 1 ms unless you can start the timer right away.

Granularity, free-running counters and jitter are related topics.
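
A small sketch of that round-up idea (the tick period and helper name are assumptions for this example, not from any particular library):

#include <stdint.h>

#define TICK_PERIOD_US 1000U   /* assumed 1 ms tick */

/* Convert a requested minimum delay into a tick count that is guaranteed
   to be long enough: round up to whole ticks, then add one extra tick,
   because the first tick may elapse almost immediately after the wait is
   started (unless the wait can be started exactly on a tick edge). */
static uint32_t ticks_for_min_delay_us(uint32_t delay_us)
{
  uint32_t ticks = (delay_us + TICK_PERIOD_US - 1U) / TICK_PERIOD_US;
  return ticks + 1U;
}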

Piranha
Chief II

OS ticks are not meant for super-precise timing. A tick-based delay will typically have an actual duration of down to "x - 1 tick". That is done for overall system consistency. Consider this code with a tick frequency of 1000 Hz:

for (;;) {
	GPIO_Set1(LED);
	Delay(500);
	GPIO_Set0(LED);
	Delay(500);
}

The most correct code would blink with a 1000 ms period. Unfortunately, in the HAL_Delay() code they have put in the "Add a freq (one tick) to guarantee minimum wait" logic. Because of that, in this example HAL_Delay() will give a 1002 ms blink period.
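
For reference, recent HAL versions implement HAL_Delay() roughly like this (paraphrased from the STM32 HAL sources; details vary between families and versions), which is where the extra tick in the 1002 ms period comes from:

/* Paraphrased from the STM32 HAL sources; exact code differs between
   families and versions. uwTickFreq is 1 for the default 1 kHz tick. */
__weak void HAL_Delay(uint32_t Delay)
{
  uint32_t tickstart = HAL_GetTick();
  uint32_t wait = Delay;

  /* Add a freq (one tick) to guarantee a minimum wait */
  if (wait < HAL_MAX_DELAY)
  {
    wait += (uint32_t)(uwTickFreq);
  }

  while ((HAL_GetTick() - tickstart) < wait)
  {
  }
}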

The HAL_MAX_DELAY constant is misnamed. It is used not for a maximum delay but for an infinite delay. The maximum delay is actually 0xFFFFFFFEu ms, and the minimum guaranteed delay is 2 ticks (not ms).

Thanks for your reply. I agree with you.

1. A tick-based delay will typically have a delay time of down to "x - 1 tick".

2. The HAL_MAX_DELAY constant is misnamed.

3. The minimum guaranteed delay is 2 ticks (not ms).

Thanks for your reply.