2018-11-17 10:01 AM
Hi,
I'm working on a project to create a voltage inverter controlled by an STM32F103C8T6 (the Blue Pill). Later, I intend to add a boost converter for better voltage control.
The problem is that my delays aren't working properly: where I should get 7 ms, I get 8 ms, and where I should get 1 ms, I get 2 ms.
Inside the infinite loop I have:
GPIOA->BSRR = GPIO_PIN_9;   // set PA9 high
HAL_Delay(7);
GPIOA->BRR = GPIO_PIN_9;    // set PA9 low
HAL_Delay(1);
GPIOA->BSRR = GPIO_PIN_10;  // set PA10 high
HAL_Delay(7);
GPIOA->BRR = GPIO_PIN_10;   // set PA10 low
HAL_Delay(1);
At the inverter's output I'm getting 50 Hz when I should get approximately 62.5 Hz: the loop should take 7 + 1 + 7 + 1 = 16 ms per cycle (62.5 Hz), but what I measure corresponds to 8 + 2 + 8 + 2 = 20 ms (50 Hz).
The complete code is available at https://gist.github.com/adailtonjn68/b80a8f2370d5024dc6b750ffe0dad0b8
I'm configuring the MCU to run at 72 MHz, the basic project was generated with CubeMX, and I use STM32CubeProgrammer to flash the MCU through an ST-LINK V2.
It is important to mention that I am a newbie at STM32 programming; my experience is with Arduinos, PIC16F, and PIC12F parts. I think I'm missing something. What is it? Please help.
2018-11-17 11:34 AM
If you really need accurate timing, consider using a timer.
To answer your question directly, the behavior you observe is exactly how HAL_Delay is implemented. The documentation for the function is rather questionable. This is from stm32l4xx_hal.c:
/**
  * @brief This function provides minimum delay (in milliseconds) based
  *        on variable incremented.
  * @note In the default implementation , SysTick timer is the source of time base.
  *       It is used to generate interrupts at regular time intervals where uwTick
  *       is incremented.
  * @note This function is declared as __weak to be overwritten in case of other
  *       implementations in user file.
  * @param Delay specifies the delay time length, in milliseconds.
  * @retval None
  */
__weak void HAL_Delay(uint32_t Delay)
{
  uint32_t tickstart = HAL_GetTick();
  uint32_t wait = Delay;

  /* Add a period to guaranty minimum wait */
  if (wait < HAL_MAX_DELAY)
  {
    wait++;
  }

  while ((HAL_GetTick() - tickstart) < wait)
  {
  }
}
Note the "wait++;": that is why HAL_Delay(7) waits 8 ms instead of 7.
The inline documentation appears to have been written by someone who was not fluent in English.
Hopefully every compiler will optimize away the needless copying of the parameter.
This stuff would be fixed very quickly if ST would ever post their HAL in a public repo and accept pull requests from their customers.
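For the waveform in the original post, a timer update interrupt driving a small state machine would do it. This is only a rough sketch, not from this thread and untested: it assumes a CubeMX-generated TIM2 base timer (handle htim2) whose counter ticks at 1 MHz, e.g. prescaler 72-1 from the 72 MHz timer clock, with auto-reload preload disabled so the new period takes effect right away.

extern TIM_HandleTypeDef htim2;      /* generated by CubeMX */
static volatile uint8_t phase = 0;

/* Called by the HAL from the TIM2 update interrupt. */
void HAL_TIM_PeriodElapsedCallback(TIM_HandleTypeDef *htim)
{
  if (htim->Instance != TIM2)
    return;

  switch (phase)
  {
    case 0: GPIOA->BSRR = GPIO_PIN_9;   /* PA9 high  */
            __HAL_TIM_SET_AUTORELOAD(htim, 7000 - 1); break; /* this phase: 7 ms */
    case 1: GPIOA->BRR  = GPIO_PIN_9;   /* PA9 low   */
            __HAL_TIM_SET_AUTORELOAD(htim, 1000 - 1); break; /* this phase: 1 ms */
    case 2: GPIOA->BSRR = GPIO_PIN_10;  /* PA10 high */
            __HAL_TIM_SET_AUTORELOAD(htim, 7000 - 1); break; /* this phase: 7 ms */
    case 3: GPIOA->BRR  = GPIO_PIN_10;  /* PA10 low  */
            __HAL_TIM_SET_AUTORELOAD(htim, 1000 - 1); break; /* this phase: 1 ms */
  }
  phase = (phase + 1) & 3;
}

Start it once in main() after MX_TIM2_Init() with HAL_TIM_Base_Start_IT(&htim2). The timer then keeps the 16 ms cycle in hardware, without the SysTick rounding.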
2018-11-17 12:09 PM
A SysTick running at 1 ms granularity is going to yield +/-1 ms accuracy; if you want better, use a finer-grained timer.
DWT->CYCCNT provides a processor cycle count, agnostic to interrupt loading/priority.
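A rough sketch of a busy-wait microsecond delay built on it (untested; the register and bit names are standard CMSIS for the Cortex-M3, and SystemCoreClock is assumed to hold the 72 MHz core clock):

static void dwt_init(void)
{
  CoreDebug->DEMCR |= CoreDebug_DEMCR_TRCENA_Msk;  /* enable the DWT unit */
  DWT->CYCCNT = 0;
  DWT->CTRL  |= DWT_CTRL_CYCCNTENA_Msk;            /* start the cycle counter */
}

static void delay_us(uint32_t us)
{
  uint32_t start = DWT->CYCCNT;
  uint32_t ticks = us * (SystemCoreClock / 1000000U); /* cycles per microsecond */
  while ((DWT->CYCCNT - start) < ticks)
  {
    /* unsigned subtraction handles counter wrap-around */
  }
}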
2018-11-18 10:07 AM
Thanks for the answers.
I set up a timer to do the work and got really good precision.