delay function based on timer, overflow safe?

Benjamin Brammer
Senior II

Hello Everybody,

I was thinking about a flexible way to implement a µs or ms delay function with a free-running timer as the tick source. One of the biggest problems that comes to mind is the overflow when the timer wraps around, which makes some classical approaches I have found on the Internet behave incorrectly.

I have thought of the following solution to the problem and would like some feedback from the community:

I assumed a timer with a 16-bit counter, but you could also use a 32-bit counter.

#define tim15_tick()					(TIM15->CNT)

/**
 * @brief	µs delay function based on Timer15.
 * 			Maximum 65535 µs delay possible.
 * @param	ucnt µs to delay
 * @return	none
 */
void delay_TIM15(uint16_t ucnt)
{
	uint16_t start = TIM15->CNT;
	while ((start + ucnt) > tim15_tick());
}

In my solution an overflow or timer wrap-around would be no problem, since the sum start + ucnt is stored in a uint16_t variable and would wrap around the same way as the counter. Or am I wrong?

best regards

Benjamin

14 REPLIES
Piranha
Chief II
start + ucnt

When this wraps around, the target time becomes smaller than the current counter value, so the loop is skipped immediately. There is also no point in computing the sum inside the loop condition, because neither operand changes during the loop.
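
To make the failure concrete, here is a small host-side sketch (the numbers are made up for illustration) that stores the 16-bit sum in a uint16_t, as intended in the original post, and shows it landing below the counter:

#include <stdint.h>
#include <stdio.h>

int main(void)
{
	uint16_t start = 65000;	/* delay starts near the top of the range */
	uint16_t ucnt  = 1000;
	uint16_t cnt   = 65001;	/* counter a moment after start */

	/* The 16-bit sum wraps: 65000 + 1000 = 66000 truncates to 464. */
	uint16_t target = (uint16_t)(start + ucnt);
	printf("target = %u\n", (unsigned)target);	/* prints 464 */
	printf("loop runs: %d\n", target > cnt);	/* prints 0: skipped at once */
	return 0;
}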

For unsigned integers, the C language guarantees modular (wrap-around) arithmetic. For example, for unsigned 16-bit values in C:

(uint16_t)(2 - 0xFFFF) == 3
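
A quick host-side check of that property, using arbitrary example values for a counter that has wrapped:

#include <stdint.h>
#include <assert.h>

int main(void)
{
	/* Truncating to 16 bits reduces the result mod 65536:
	 * 2 - 0xFFFF = -65533, and -65533 mod 65536 = 3. */
	assert((uint16_t)(2 - 0xFFFF) == 3);

	/* Elapsed ticks across a wrap: started at 65000, the counter
	 * has since wrapped to 500, so 1036 ticks have passed. */
	uint16_t start = 65000;
	uint16_t now   = 500;
	assert((uint16_t)(now - start) == 1036);
	return 0;
}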

Therefore, the overflow-safe version is as simple as this:

void delay_TIM15(uint16_t ucnt)
{
	uint16_t start = TIM15->CNT;
	/* elapsed ticks = (now - start) mod 2^16, correct across wrap-around */
	while ((uint16_t)(tim15_tick() - start) < ucnt);
}

And the (uint16_t) cast is absolutely critical here!
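
A brief illustration of why (the values are arbitrary examples): due to integer promotion, the subtraction without the cast is performed in int, so after a wrap the difference stays negative, always compares as smaller than ucnt, and the loop would never terminate:

#include <stdint.h>
#include <stdio.h>

int main(void)
{
	uint16_t start = 65000, now = 500, ucnt = 1000;

	/* With the cast: 1036 elapsed ticks, 1036 < 1000 is false,
	 * so the wait loop exits as it should. */
	printf("with cast: %d\n", (uint16_t)(now - start) < ucnt);	/* 0 */

	/* Without the cast: 500 - 65000 = -64500 in int, which is
	 * always < 1000, so the loop would spin forever. */
	printf("without cast: %d\n", (now - start) < ucnt);	/* 1 */
	return 0;
}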

Hey Jan,

thanks for the answer and information. Of course Piranha is right with his post :see_no_evil_monkey:. I am a bit embarrassed that this obvious mistake did not come to my mind right away.

As for the solution, I really did not know that the C language guarantees correct arithmetic on wrap-around.

Benni

I am not sure if I have understood you right:

You would suggest I use a timer that counts every µs and use the timer interrupt to generate a 1 ms interval, which I can freely use to update several counting variables and so easily generate multiple delays in ms steps, right?

The reason I would do this is that I would rather not leave the CPU doing nothing for periods as long as milliseconds, right?
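
A minimal sketch of that pattern, assuming a 1 kHz timer interrupt (e.g. SysTick or a TIM update IRQ); the names soft_timer, soft_timer_start and soft_timer_expired are hypothetical, not from this thread:

#include <stdint.h>
#include <stdbool.h>

#define NUM_SOFT_TIMERS	4u

/* One software down-counter per delay, decremented once per ms. */
static volatile uint16_t soft_timer[NUM_SOFT_TIMERS];

/* Call this from the 1 ms timer interrupt handler. */
void soft_timers_tick_1ms(void)
{
	for (uint32_t i = 0; i < NUM_SOFT_TIMERS; i++) {
		if (soft_timer[i] != 0) {
			soft_timer[i]--;
		}
	}
}

/* Start a delay of 'ms' milliseconds on channel 'id'. */
void soft_timer_start(uint32_t id, uint16_t ms)
{
	soft_timer[id] = ms;
}

/* Poll from the main loop; true once the delay has elapsed. */
bool soft_timer_expired(uint32_t id)
{
	return soft_timer[id] == 0;
}

The main loop then polls soft_timer_expired() and does other work in between, instead of busy-waiting for milliseconds at a time.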

Thanks Piranha for your answer and the clarification; as for my question from the start, this definitely answered it!

Benjamin Brammer
Senior II

To everybody:

thanks for your answers and suggestions!