Timing problem with ADC_DelayMicroSecond(uint32_t microSecond)

Dvorak.Peter
Senior II
Posted on September 22, 2016 at 16:02

This HAL code seems to run 12x slower than expected.

/**
  * @brief  Delay micro seconds
  * @param  microSecond : delay
  * @retval None
  */
static void ADC_DelayMicroSecond(uint32_t microSecond)
{
  /* Compute number of CPU cycles to wait for */
  __IO uint32_t waitLoopIndex = (microSecond * (SystemCoreClock / 1000000U));

  while(waitLoopIndex != 0U)
  {
    waitLoopIndex--;
  }
}

Peter

2 REPLIES
Posted on September 22, 2016 at 16:28

You mean it's not a unit-cycle machine? Someone tell the Java coders...

There are better ways to do this, but how is it being used? If it is just being used to sequence the initialization of the ADC, and to insert at least enough dwell time to meet its goals, anything >= 1x should suffice.

One could use DWT_CYCCNT to get to sub-microsecond granularity.
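Something along these lines, say (untested sketch; assumes an ARMv7-M part such as a Cortex-M3/M4 where the DWT cycle counter exists, with the standard ARMv7-M debug register addresses, and SystemCoreClock coming from CMSIS):

```c
#include <stdint.h>

/* Standard ARMv7-M debug registers. */
#define DEMCR       (*(volatile uint32_t *)0xE000EDFCUL)
#define DWT_CTRL    (*(volatile uint32_t *)0xE0001000UL)
#define DWT_CYCCNT  (*(volatile uint32_t *)0xE0001004UL)

extern uint32_t SystemCoreClock;  /* maintained by CMSIS SystemCoreClockUpdate() */

/* Pure arithmetic helper: microseconds -> CPU cycles. */
static inline uint32_t us_to_cycles(uint32_t us, uint32_t core_clock_hz)
{
    return us * (core_clock_hz / 1000000U);
}

/* Call once at startup: enable trace and start the cycle counter. */
static inline void dwt_init(void)
{
    DEMCR     |= (1UL << 24);  /* TRCENA: enable DWT/ITM */
    DWT_CYCCNT = 0U;
    DWT_CTRL  |= 1UL;          /* CYCCNTENA: start the cycle counter */
}

/* Busy-wait; the unsigned subtraction makes the compare wrap-safe. */
static inline void dwt_delay_us(uint32_t us)
{
    uint32_t start  = DWT_CYCCNT;
    uint32_t cycles = us_to_cycles(us, SystemCoreClock);

    while ((DWT_CYCCNT - start) < cycles) { }
}
```

Unlike a decrement loop, this measures actual elapsed cycles, so it is immune to compiler optimization level and flash wait states.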

Dvorak.Peter
Senior II
Posted on September 22, 2016 at 18:32

Hi Clive,

Thanks for your reply.

On closer examination, the division code adds more than 100 us on my processor.

Is there suitable static inline code that could be used instead?

The TI MSP430 compiler had an intrinsic __delay_cycles() that inserted suitable NOPs and/or loops.
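For example, something that caches the divide once at init so the delay path is a single multiply (a rough sketch; delay_init/delay_us are names I made up, and SystemCoreClock is the usual CMSIS variable):

```c
#include <stdint.h>

extern uint32_t SystemCoreClock;  /* maintained by CMSIS */

/* Cached once at init so the hot path avoids the runtime division,
 * which can be a slow library call on parts without hardware divide. */
static uint32_t cycles_per_us;

static void delay_init(void)
{
    cycles_per_us = SystemCoreClock / 1000000U;
}

static inline void delay_us(uint32_t us)
{
    volatile uint32_t n = us * cycles_per_us;  /* multiply only */

    while (n != 0U)
    {
        n--;
    }
}
```

This only removes the division, of course; the loop body still costs several cycles per iteration, so a calibration factor would still be needed for accurate timing.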

So far, my benchmarking of HAL code is not going well...

Peter