2020-01-16 06:03 PM
/**
* @brief Transmit an amount of data in blocking mode.
* @param hspi: pointer to a SPI_HandleTypeDef structure that contains
* the configuration information for SPI module.
* @param pData: pointer to data buffer
* @param Size: amount of data to be sent
* @param Timeout: Timeout duration
* @retval HAL status
*/
HAL_StatusTypeDef HAL_SPI_Transmit(SPI_HandleTypeDef *hspi, uint8_t *pData, uint16_t Size, uint32_t Timeout)
/* Init tickstart for timeout management*/
tickstart = HAL_GetTick();
/* Timeout management */
if((Timeout == 0U) || ((Timeout != HAL_MAX_DELAY) && ((HAL_GetTick()-tickstart) >= Timeout)))
{
errorcode = HAL_TIMEOUT;
goto error;
}
/* Timeout management */
if((Timeout == 0U) || ((Timeout != HAL_MAX_DELAY) && ((HAL_GetTick()-tickstart) > Timeout)))
{
errorcode = HAL_TIMEOUT;
goto error;
}
2020-01-17 07:00 AM
One should not use a one-millisecond timeout there; a single tick interrupt can occur between
/* Init tickstart for timeout management*/
tickstart = HAL_GetTick();
and
/* Timeout management */
if((Timeout == 0U) || ((Timeout != HAL_MAX_DELAY) && ((HAL_GetTick()-tickstart) > Timeout)))
{
errorcode = HAL_TIMEOUT;
goto error;
}
and it might time out just because of that (most likely it will).
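To illustrate, here is roughly how a single tick boundary can expire a Timeout of 1 with the ">=" variant quoted at the top of the thread (the tick values are invented for the example):
tickstart = HAL_GetTick(); /* suppose this returns 1000 */
/* SysTick interrupt fires here; the tick counter becomes 1001 */
if((Timeout == 0U) || ((Timeout != HAL_MAX_DELAY) && ((HAL_GetTick()-tickstart) >= Timeout)))
{
    /* 1001 - 1000 = 1 >= 1: HAL_TIMEOUT although almost no real time has passed */
    errorcode = HAL_TIMEOUT;
    goto error;
}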
Also, I believe that checking Timeout for 0 is not needed there, since
(Timeout == 0U) and ((HAL_GetTick() - tickstart) >= Timeout)
will both evaluate true if Timeout is actually 0 (the unsigned tick difference is always >= 0), so it's effectively checking the same thing twice. Well, one could argue that checking Timeout for 0 is faster than doing the tick arithmetic, which might be the reason it exists; but then again, that could be tested as the very first thing in the function, with an early exit on error, so as not to waste any more clock cycles than necessary.
Also, what would happen if you call that function with Timeout equal to HAL_MAX_DELAY; will the timeout system actually work then? (I think we can safely assume that HAL_MAX_DELAY is not equal to zero.)
/* we can't time out if Timeout == HAL_MAX_DELAY */
((HAL_GetTick() - tickstart) >= Timeout) && (Timeout != HAL_MAX_DELAY)
Do I miss something, or is there actually a pretty nasty bug in there which can hang the whole MCU in an infinite loop?
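A minimal sketch of the restructure suggested above (not the shipped HAL code; the names match the excerpt at the top of the thread):
/* Sketch: reject a zero timeout once, at function entry, with an early exit */
if (Timeout == 0U)
{
    errorcode = HAL_TIMEOUT;
    goto error;
}

/* ... later, inside the transfer loop ... */

/* The first operand short-circuits the tick arithmetic away for infinite
   (HAL_MAX_DELAY) waits, so no comparison against the tick counter is made */
if ((Timeout != HAL_MAX_DELAY) && ((HAL_GetTick() - tickstart) >= Timeout))
{
    errorcode = HAL_TIMEOUT;
    goto error;
}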
2020-01-18 08:58 PM
Thanks for your reply.
#define HAL_MAX_DELAY 0xFFFFFFFFU
1. Setting the timeout to HAL_MAX_DELAY, the timeout system stops working.
2. I think it's not a bug; ST designed it like this.
3. I guess ST designed it to wait until the conditions are met when the timeout is set to HAL_MAX_DELAY.
2020-01-19 03:06 AM
The interpretation is fuzzy.
In my case, when I set a delay, the real delay must be at least as long (round up).
There are two ways to do this:
With 1 µs resolution hardware, setting 10001 will do it.
With 1 ms resolution hardware, add 1 ms unless you can start the timer right away.
Granularity, free-running counters, and jitter are all related topics.
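A minimal sketch of that round-up idea, assuming a hypothetical read_us_counter() accessor for a free-running 1 MHz (1 count per microsecond) hardware timer:
#include <stdint.h>

extern uint32_t read_us_counter(void); /* hypothetical: free-running 1 MHz counter */

/* Busy-wait for at least 'us' microseconds. The +1 rounds up, since the
   counter may be sampled anywhere within its current 1 us period. */
static void delay_at_least_us(uint32_t us)
{
    uint32_t start = read_us_counter();
    while ((uint32_t)(read_us_counter() - start) < (us + 1U))
    {
        /* spin */
    }
}
With this, a request for 10000 µs actually waits on 10001 counts, which appears to be where the "set 10001" figure above comes from.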
2020-01-25 11:27 AM
OS ticks are not meant for super-precise timing. A tick-based delay of x ticks will typically have an actual delay time of down to "x - 1" ticks. That is done for overall system consistency. Consider this code with a tick frequency of 1000 Hz:
for (;;) {
GPIO_Set1(LED);
Delay(500);
GPIO_Set0(LED);
Delay(500);
}
The ideal code would blink with a 1000 ms period. Unfortunately, in the HAL_Delay() code they've put in the "Add a freq (one tick) to guarantee minimum wait" logic. Because of that, in this example HAL_Delay() will produce a blink period of 1002 ms.
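For reference, the HAL_Delay() logic being discussed looks roughly like this in recent HAL releases (uwTickFreq is 1 for the default 1 kHz tick; check the stm32xxxx_hal.c in your own firmware package for the exact code):
__weak void HAL_Delay(uint32_t Delay)
{
  uint32_t tickstart = HAL_GetTick();
  uint32_t wait = Delay;

  /* Add a freq to guarantee minimum wait */
  if (wait < HAL_MAX_DELAY)
  {
    wait += (uint32_t)(uwTickFreq);
  }

  while ((HAL_GetTick() - tickstart) < wait)
  {
  }
}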
The HAL_MAX_DELAY constant is named wrong. It's used not for a maximum delay but for an infinite delay. The maximum delay is actually 0xFFFFFFFEu ms, and the minimum guaranteed delay is 2 ticks (not ms).
2020-01-25 03:31 PM
Thanks for your reply. I agree with you.
1. A tick-based delay will typically have an actual delay time of down to "x - 1" ticks.
2. The HAL_MAX_DELAY constant is named wrong.
3. The minimum guaranteed delay is 2 ticks (not ms).
2020-01-25 03:34 PM
Thanks for your reply.