2022-12-08 01:21 AM
Our Implementation:
We use the LSM6DSR sensor in FIFO Stream Mode (With Acc & Timestamp Enabled)
ACC ODR is set to 52Hz
ACC Batching is set to 52Hz
Timestamp decimation is set to 32
This timestamp counter is translated to real time using a resolution factor derived from the FREQ_FINE register, as described in application note AN5358.
The TS resolution (in microseconds per LSB) is calculated as: tsResolution = (float) 1000000 / (40000 * (1.0f + 0.0015f * FREQ_FINE));
This tsResolution factor is what we multiply the timestamp values from the sensor FIFO buffer by:
((double) acc_slot[i].timestamp / 1000000 * tsResolution)
Observation:
The calculated elapsed seconds track the actual elapsed time closely for the first few minutes; after that, a slow drift is observed in which the calculated time starts to lag. On a long-run test, the error of the calculated timestamp was approximately 0.5 microseconds.
2022-12-09 07:26 PM
Well, without a precise clock coming from an external crystal or oscillator, a less precise internal one is probably being used. The STM32 HSI has its clock tolerance defined in the datasheet.
2022-12-15 12:53 AM
Hi @Sachin ,
please note also that the "52 Hz" value is a typical value: for a more precise characterization of the output data rate of your specific device, I suggest enabling the DRDY signal and counting its edges at ODR = 52 Hz. You can then compare the measured ODR with the calculated ODR and check for discrepancies.
ODR_Actual = (6667 + ((0.0015 * INTERNAL_FREQ_FINE) * 6667)) / ODR_Coeff
-Eleon