
Cycle counter accuracy with PLL and HSE Clock for STM32L4

VChav.19
Associate II

I am currently designing an application where I will read sensors whose output can be integrated over time (e.g. accelerometer, flow meter). As a first step I want to calculate what's the maximum error I could get by using the cycle counter as a unit of time.

For example, I have a 24 MHz external clock (50 ppm) feeding HSE, and the PLL gives me a CPU frequency of 80 MHz. With the DWT cycle counter I can measure execution time between functions, for example.

Let's say I have an integration function that is called approximately every 1 millisecond; of course, due to external factors (e.g. execution time of other functions, interrupts, etc.) the function will not be called exactly every 1 millisecond. With the DWT cycle counter I can accurately measure the time elapsed since the last call to the integration function.

For this case the unit of time for the cycle counter is 12.5 nanoseconds (1/80 MHz). My question is how I can calculate the accuracy of these 12.5 nanoseconds, given the ppm stability of the crystal and the accuracy of the STM32 PLL.

I basically want to calculate the minimum time I can integrate without losing accuracy, or at least the deviation I can expect while integrating over time.

Thanks in advance for passing by.


The long-term accuracy of a PLL-based clock is the same as the accuracy of its driving clock, i.e. the HSE crystal or oscillator. The PLL only adds jitter on top of that.

In other words, assuming an absolutely perfect HSE clock of K MHz followed by a PLL multiplying by N: if you count the number of cycles at the PLL output within one second, it will be 1E6*K*N ± J; over 10 seconds, 1E7*K*N ± J; over 100 seconds, 1E8*K*N ± J. The jitter term J stays bounded and does not grow with the measurement window.

JW

S.Ma
Principal

Hmm, and at what exact time was the sensing measurement actually taken?

KnarfB
Principal III

Actually, you want to take the time when a sensor value was sampled in the sensor, especially when the sample values are transported to the MCU via a relatively slow bus like I2C. Some sensors have a pin for that, e.g. an interrupt or trigger pin. You can set up a fast-running hardware timer in input capture mode and get the timestamp when the external signal fired. Some timers can also be clocked by an external clock, which can be chosen to suit your needs. All in all, I wouldn't use DWT for that.
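A minimal sketch of the input-capture approach with the STM32 HAL (hardware-dependent fragment: TIM2 channel 1, the pin wiring, and the `htim2` handle name are assumptions; the CubeMX-generated timer init is omitted):

```c
/* Assumes TIM2 CH1 is configured for input capture on the sensor's
   data-ready / interrupt pin, e.g. via CubeMX. */
extern TIM_HandleTypeDef htim2;

volatile uint32_t sample_timestamp;  /* timer ticks at the capture edge */

void capture_start(void)
{
    HAL_TIM_IC_Start_IT(&htim2, TIM_CHANNEL_1);
}

/* Called by the HAL from the timer ISR when the edge is captured:
   the timestamp is latched in hardware, independent of ISR latency. */
void HAL_TIM_IC_CaptureCallback(TIM_HandleTypeDef *htim)
{
    if (htim->Instance == TIM2 &&
        htim->Channel == HAL_TIM_ACTIVE_CHANNEL_1) {
        sample_timestamp = HAL_TIM_ReadCapturedValue(htim, TIM_CHANNEL_1);
    }
}
```

The key property is that the capture register latches the counter in hardware at the edge, so interrupt latency does not degrade the timestamp.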

Since the application is intended to run as an add-on to existing hardware, I can't really add external components. For this reason, as a first step I wanted to identify the accuracy I could achieve with the current setup. From what @Community member​ commented, I understand that I have to take the jitter into account.

As for the sensors, the idea is to have a common 4-20 mA interface and the option to set them up via a front end (units, type, etc.). For this reason, right now I want to consider the internal accuracy I can get and calculate the standard deviation I can expect from the integration.

Also, the application runs every 1 millisecond, using the HAL tick as the time reference.