2016-12-29 02:57 AM
I have a 10DOF sensor that calculates 3 angles (roll, pitch, yaw). I need to know how I can measure the execution time of my code so I can get the delta t; on Arduino, as I remember, we used a function called micros(). I was thinking about using a timer interrupt, but I am trying to avoid it because I am not good with it.
2016-12-30 04:48 AM
Hi Rabah.Mohamed,
To calculate the execution period, maybe you can use the SysTick timer (STK), which provides a tick value in milliseconds (the SysTick timer counts down from the reload value to zero):
tmp = stop_time;
tmp -= start_time;
-Nesrine-
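A minimal sketch of this millisecond approach, assuming the STM32 HAL (where HAL_GetTick() returns the SysTick-driven millisecond counter); HAL_GetTick, run_filter_update, and the helper names are assumptions for illustration, and the delta arithmetic itself is plain, wraparound-safe C:

```c
#include <stdint.h>

/* Wraparound-safe millisecond delta: unsigned subtraction is valid
 * for any interval shorter than one full 2^32 ms period. */
static inline uint32_t ms_elapsed(uint32_t start_ms, uint32_t stop_ms)
{
    return stop_ms - start_ms;
}

/* Tick delta in seconds, e.g. for the orientation filter's delta t */
static inline float dt_seconds(uint32_t start_ms, uint32_t stop_ms)
{
    return ms_elapsed(start_ms, stop_ms) / 1000.0f;
}

/* Usage on target (assumes the STM32 HAL is available):
 *   uint32_t start = HAL_GetTick();
 *   run_filter_update();                          // your code under test
 *   float dt = dt_seconds(start, HAL_GetTick());  // seconds, 1 ms resolution
 */
```

Note the 1 ms resolution is enough for a 0.1 s or 0.01 s loop, but not for benchmarking short code sections.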
2016-12-30 08:15 AM
For microsecond or nanosecond granularity I'd use the DWT_CYCCNT to benchmark code. It is a 32-bit up counter running at the core frequency.
More practically, consider using a free-running 32-bit TIM clocked at 1 MHz (or as desired) to provide a high-resolution timeline.
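A hedged sketch of the DWT_CYCCNT approach on a Cortex-M3/M4 part (register addresses taken from the ARM Technical Reference Manual; verify them for your core). Only the register setup is target-specific; the delta and conversion arithmetic is plain C:

```c
#include <stdint.h>

/* Debug/DWT registers -- addresses per the ARM Cortex-M3/M4 TRM;
 * confirm against your core before relying on them. */
#define DEMCR      (*(volatile uint32_t *)0xE000EDFC)
#define DWT_CTRL   (*(volatile uint32_t *)0xE0001000)
#define DWT_CYCCNT (*(volatile uint32_t *)0xE0001004)

static inline void dwt_cyccnt_init(void)
{
    DEMCR      |= (1u << 24);  /* TRCENA: enable the DWT unit  */
    DWT_CYCCNT  = 0u;          /* reset the cycle counter      */
    DWT_CTRL   |= 1u;          /* CYCCNTENA: start counting    */
}

/* Unsigned subtraction handles counter wraparound automatically */
static inline uint32_t cycles_elapsed(uint32_t start, uint32_t stop)
{
    return stop - start;
}

/* Convert a cycle count to microseconds for a given core clock */
static inline uint32_t cycles_to_us(uint32_t cycles, uint32_t core_hz)
{
    return cycles / (core_hz / 1000000u);
}
```

On target you would call dwt_cyccnt_init() once, read DWT_CYCCNT before and after the code under test, and pass the two readings to cycles_elapsed().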
2016-12-30 06:19 PM
I aim for 0.1 s, or 0.01 s at most, so I would be glad if you could explain what you mean.
2016-12-30 06:19 PM
Thanks. I will try it and update you
2016-12-30 07:20 PM
If you are unfamiliar with DWT_CYCCNT I suggest you Google it, or search this site; I've posted multiple examples. ARM has Technical Reference Manuals describing the core.
If you clock one of the 32-bit TIM peripherals at 1 MHz, reading TIMx->CNT will give you a ticking timestamp equivalent to micros() on Arduino.
You can't interrupt at rates like 1 MHz.
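A sketch of the free-running TIM idea, assuming an STM32 part where TIM2 is 32-bit and its kernel clock is 84 MHz (the prescaler value is an assumption for that clock; adjust for yours). The HAL portion is compile-guarded because it needs the target headers; the delta helper is portable:

```c
#include <stdint.h>

#if 0  /* target-only sketch: requires the STM32 HAL headers */
TIM_HandleTypeDef htim2;

void micros_init(void)
{
    __HAL_RCC_TIM2_CLK_ENABLE();
    htim2.Instance         = TIM2;
    htim2.Init.Prescaler   = 84 - 1;       /* 84 MHz / 84 = 1 MHz tick   */
    htim2.Init.Period      = 0xFFFFFFFFu;  /* free-running, full 32 bits */
    htim2.Init.CounterMode = TIM_COUNTERMODE_UP;
    HAL_TIM_Base_Init(&htim2);
    HAL_TIM_Base_Start(&htim2);
}

/* Ticking 1 MHz timestamp, equivalent to Arduino's micros() */
static inline uint32_t micros(void) { return TIM2->CNT; }
#endif

/* Elapsed microseconds between two timestamps; unsigned subtraction
 * is correct across one 32-bit wrap (~71.6 minutes at 1 MHz). */
static inline uint32_t micros_elapsed(uint32_t start, uint32_t now)
{
    return now - start;
}
```

No interrupt is needed: the counter free-runs in hardware and you simply sample it, which is why this scales where a 1 MHz interrupt cannot.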
2016-12-30 07:58 PM
Thanks for your help.