Timing problem with ADC_DelayMicroSecond(uint32_t microSecond)
Posted on September 22, 2016 at 16:02
This HAL code seems to run about 12x slower than expected.
/**
  * @brief  Delay micro seconds
  * @param  microSecond : delay
  * @retval None
  */
static void ADC_DelayMicroSecond(uint32_t microSecond)
{
  /* Compute number of CPU cycles to wait for */
  __IO uint32_t waitLoopIndex = (microSecond * (SystemCoreClock / 1000000U));

  while(waitLoopIndex != 0U)
  {
    waitLoopIndex--;
  }
}

Peter
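For reference, one way to quantify the slowdown is to time the loop with the Cortex-M cycle counter. The fragment below is only a sketch: it assumes a Cortex-M3/M4 class part with the DWT unit and CMSIS headers, and since ADC_DelayMicroSecond() is static to the HAL source file, a local copy of the loop would be timed in practice.

    /* Enable the DWT cycle counter once at startup */
    CoreDebug->DEMCR |= CoreDebug_DEMCR_TRCENA_Msk;   /* enable the trace block    */
    DWT->CYCCNT      = 0U;
    DWT->CTRL       |= DWT_CTRL_CYCCNTENA_Msk;        /* start counting CPU cycles */

    uint32_t t0 = DWT->CYCCNT;
    ADC_DelayMicroSecond(100U);                       /* nominally 100 us          */
    uint32_t cycles = DWT->CYCCNT - t0;               /* measured cost in cycles   */

    /* Expected: 100 * (SystemCoreClock / 1000000U). The __IO (volatile) counter
       forces a load and store every pass, so each iteration costs several cycles
       rather than one, which accounts for a large constant factor.               */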
2 REPLIES
Posted on September 22, 2016 at 16:28
You mean it's not a unit-cycle machine?? Someone tell the Java coders...

There are better ways to do this, but how is it being used? If it is just being used to sequence the initialization of the ADC, and to insert at least enough dwell time to meet its goals, anything >= 1x should suffice. One could use DWT_CYCCNT to get sub-microsecond granularity.
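A minimal sketch of the DWT_CYCCNT approach, assuming a Cortex-M3/M4 part with the cycle counter already enabled (see the setup in the first post) and CMSIS definitions; the function name is illustrative:

    static inline void delay_us_dwt(uint32_t microSecond)
    {
        uint32_t start = DWT->CYCCNT;                 /* snapshot the counter       */
        uint32_t ticks = microSecond * (SystemCoreClock / 1000000U);

        while ((DWT->CYCCNT - start) < ticks)         /* unsigned math handles wrap */
        {
        }
    }

Because the wait is measured against elapsed cycles rather than loop iterations, the per-iteration cost of the polling loop no longer matters.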
Up vote any posts that you find helpful, it shows what's working..
Posted on September 22, 2016 at 18:32

Hi Clive,

Thanks for your reply. On closer examination, the division code adds more than 100 us on my processor. Is there a suitable static inline function that could be used? The TI430 compiler had an intrinsic, __delay_cycles(), that inserted suitable NOPs and/or loops. So far, my benchmarking of the HAL code is not going well...

Peter
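If a __delay_cycles()-style helper is wanted without relying on the DWT unit, a counted __NOP() loop is a rough stand-in. The sketch below is an illustration only: the cycles-per-pass divisor is an assumption and needs calibrating on the target, since it varies with the core, flash wait states and optimization level. Hoisting the SystemCoreClock / 1000000U division out of the delay path (computing it once after each clock change) also removes the per-call division cost.

    /* Illustrative only: rough __delay_cycles() stand-in using a counted NOP loop */
    static inline void delay_cycles_approx(uint32_t cycles)
    {
        for (uint32_t i = cycles / 4U; i != 0U; --i)  /* ~4 cycles per pass assumed */
        {
            __NOP();                                  /* keeps the loop from being
                                                         optimized away             */
        }
    }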
