
Possible inaccuracy in the sampling frequency of the LIS2DS12

LBaum.2
Associate II

Hello ST community,

I want to measure acceleration with two LIS2DS12 sensors at the same time. I therefore start the measurement on both sensors (nearly simultaneously) with a sampling frequency of 800 Hz, a full-scale range of ±2 g, and a duration of 10 seconds. The test case is the following: both sensors lie on a table and I apply slight shocks to the table.

My expected result is that the vibrations appear in the data of both sensors at the same time. There may be an offset between the two sensors because they are not started at exactly the same moment, but this offset should remain constant over the 10-second measurement.

As you can see in the attached picture (measurement.png), I successfully measured the vibrations with both sensors. There is also the offset caused by not starting both sensors at exactly the same time. But on closer inspection the offset does not remain constant; it shifts. The picture first_shock.png, as the name says, shows the first shock of the measurement. The last measurement point of the first sensor before the shock is at about 74 ms, and the corresponding point of the second sensor is at about 393 ms, so the offset between the two sensors for the first shock is 319 ms. If we look at last_shock.png, the offset between the two sensors is 233 ms.

My question is: how is this possible? I have only two ideas. Either I lose samples, although I don't know why that should happen, or there is an inaccuracy in the clock of the LIS2DS12, so that the effective sampling frequency deviates and the offset shifts over time.

Further information:

Both sensors have the same configuration (800 Hz, 10 s, ±2 g) and the FIFO is in continuous mode. I read the data when there are more than 2 samples in the FIFO, so the time for the read request (~970 µs) is less than 1/ODR (1.25 ms at 800 Hz). I also check register 0x2F for FIFO overflow or FIFO full at each read request, so no data gets lost.
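
For completeness, the read loop looks roughly like this (a simplified sketch, not the exact firmware; the i2c_write_reg()/i2c_read_regs() helpers are placeholders for my bus driver, and the configuration bit values should be checked against the datasheet):

```c
#include <stdint.h>
#include <stdbool.h>

#define LIS2DS12_CTRL1        0x20u
#define LIS2DS12_FIFO_CTRL    0x25u
#define LIS2DS12_OUT_X_L      0x28u
#define LIS2DS12_FIFO_SRC     0x2Fu
#define LIS2DS12_FIFO_SAMPLES 0x30u

/* placeholder bus helpers, provided elsewhere in the firmware */
extern void i2c_write_reg(uint8_t dev, uint8_t reg, uint8_t val);
extern void i2c_read_regs(uint8_t dev, uint8_t reg, uint8_t *buf, uint16_t len);

static void lis2ds12_start(uint8_t dev)
{
    i2c_write_reg(dev, LIS2DS12_CTRL1, 0x71);     /* ODR 800 Hz, FS +/-2 g, BDU on (verify coding) */
    i2c_write_reg(dev, LIS2DS12_FIFO_CTRL, 0xC0); /* FIFO continuous mode (verify FMODE coding) */
}

/* Returns false if the flags in FIFO_SRC (0x2F) indicate a problem. */
static bool lis2ds12_poll(uint8_t dev, int16_t out[][3], uint16_t max, uint16_t *got)
{
    uint8_t src, nsamp, raw[6];
    *got = 0;

    i2c_read_regs(dev, LIS2DS12_FIFO_SRC, &src, 1);
    if (src & 0x40)                  /* assumed overflow-flag position; the full flag is checked the same way */
        return false;

    i2c_read_regs(dev, LIS2DS12_FIFO_SAMPLES, &nsamp, 1);
    if (nsamp <= 2)                  /* wait until more than 2 samples are queued */
        return true;

    while (nsamp-- && *got < max) {
        /* FIFO data is read out through the output registers starting at 0x28 */
        i2c_read_regs(dev, LIS2DS12_OUT_X_L, raw, 6);
        out[*got][0] = (int16_t)((raw[1] << 8) | raw[0]);
        out[*got][1] = (int16_t)((raw[3] << 8) | raw[2]);
        out[*got][2] = (int16_t)((raw[5] << 8) | raw[4]);
        (*got)++;
    }
    return true;
}
```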


5 REPLIES
niccolò
ST Employee

Hi @Community member​ ,

This is indeed a peculiar behaviour.

If you check the FIFO full and overflow bits, there should be no problem with lost samples, so we can rule that out.

Nonetheless, if there were discrepancies in the ODRs, the problem should show up differently.

Let me use an exaggerated example to explain: let's say the discrepancy is huge, with ODR1 at 800 Hz and ODR2 at 400 Hz.

In this case, you would have only half as many samples from sensor 2 at the end of the 10 seconds.

This leads me to ask whether the number of samples at the end of the 10 seconds is the same for both sensors.

I'm guessing that the timing is handled in software and not by hand (in the latter case this question is not important, but I would change the setup to try it).

Another thing I would ask is how this discrepancy changes.

From the picture showing all the data I cannot tell how many milliseconds there are between the middle shocks.

Maybe you can also run a longer acquisition to see whether it grows even larger.

Let's investigate.

Niccolò

Hi @niccolo.ruffini​ ,

thank you for your fast reply.

You are right, the timing is handled in software. To be exact, I wait until each sensor has delivered 8000 samples (800 Hz × 10 s). During the measurement I count the timer ticks with the function RTCDRV_GetWallClockTicks64() and convert them into milliseconds. I noticed that one sensor takes about 10019 ms and the other about 10119 ms for the whole measurement, so there is a difference of about 100 ms between them. This is another reason why I think there could be discrepancies in the ODRs.
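
Roughly, the timing measurement looks like this (sketch only; RTCDRV_GetWallClockTicks64() is the Silicon Labs RTCDRV call I mentioned, while RTC_TICKS_PER_SEC and read_available_samples() are placeholders for my actual setup):

```c
#include <stdint.h>
#include "rtcdriver.h"              /* Silicon Labs RTCDRV: RTCDRV_GetWallClockTicks64() */

#define RTC_TICKS_PER_SEC 1024u     /* placeholder: the tick rate the RTC is configured for */

extern uint32_t read_available_samples(uint8_t dev);  /* placeholder: drains the FIFO, returns sample count */

/* Wait until one sensor has delivered n_samples and return its effective ODR in Hz. */
static float measure_effective_odr(uint8_t dev, uint32_t n_samples)
{
    uint32_t collected = 0;
    uint64_t t0 = RTCDRV_GetWallClockTicks64();

    while (collected < n_samples)
        collected += read_available_samples(dev);

    uint64_t ticks = RTCDRV_GetWallClockTicks64() - t0;
    float elapsed_s = (float)ticks / (float)RTC_TICKS_PER_SEC;

    return (float)n_samples / elapsed_s;
}
```

With the numbers above this gives roughly 8000 / 10.119 s ≈ 790.6 Hz for the slower sensor and 8000 / 10.019 s ≈ 798.5 Hz for the other one, i.e. about 1.2% and 0.2% below the nominal 800 Hz.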

Regarding the discrepancy in the middle of the picture, here are all the offset values from the first to the last shock:

319ms; 306ms; 294ms; 286ms; 278ms; 265ms; 258ms; 248ms; 233ms

I took another measurement (with the same settings, of course) and can also provide the time differences from it:

515ms; 504ms; 533ms; 564ms; 425ms; 378ms; 400ms; 371ms; 381ms; 371ms; 355ms

So far I cannot see a pattern in how the discrepancy changes; it seems random to me.

-Lukas

Hi @niccolo.ruffini​ ,

I had another idea for determining the real ODR. I can route the DRDY (data ready) signal of the status register to the INT1 pin and count the interrupts generated each time a new sample is taken. To do this, I set the INT1_DRDY bit in the CTRL4 register and the DRDY_PULSED bit in the CTRL5 register to '1' (the latter generates a pulse whenever new data is sampled).
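
In code this is just two register writes (same placeholder helpers as in the earlier sketch; the bit masks reflect my reading of the CTRL4/CTRL5 descriptions and should be verified against the datasheet):

```c
#include <stdint.h>

#define LIS2DS12_CTRL4 0x23u
#define LIS2DS12_CTRL5 0x24u

extern void i2c_write_reg(uint8_t dev, uint8_t reg, uint8_t val);
extern void i2c_read_regs(uint8_t dev, uint8_t reg, uint8_t *buf, uint16_t len);

/* Route the data-ready signal to INT1 as a pulse so that every new sample
 * produces one edge that the logic analyzer can count. */
static void lis2ds12_route_drdy_to_int1(uint8_t dev)
{
    uint8_t reg;

    i2c_read_regs(dev, LIS2DS12_CTRL4, &reg, 1);
    i2c_write_reg(dev, LIS2DS12_CTRL4, reg | 0x01);  /* INT1_DRDY = 1 (assumed bit position) */

    i2c_read_regs(dev, LIS2DS12_CTRL5, &reg, 1);
    i2c_write_reg(dev, LIS2DS12_CTRL5, reg | 0x80);  /* DRDY_PULSED = 1 (assumed bit position) */
}
```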

With a logic analyzer I monitored the INT1 pin for 30 seconds and counted the pulses in each 10-second slot. From this I can calculate the mean frequency and, from that, the time it would take to collect 8000 samples.

I did this for both sensors. You can see the results in the picture count_drdy_compare.png.

I observed that the mean frequencies of both sensors are not 800 Hz as expected, and they also differ from each other by about 8 Hz. The time needed for 8000 samples, calculated from the mean frequency, agrees very well with the time measured by the controller that I mentioned in my last reply.

In my opinion this is the reason for the shifting offset I described in my post.

This leads me to ask: is this behaviour of the sensor known to you, and do you have a solution for it?

-Lukas

Hi Lukas @Community member​ ,

I checked with some colleagues, and discrepancies in the actual ODR value are common (it varies for many physical reasons), both for the average value and for instantaneous values.

Checking your values, there is a deviation of about 1.2% from the nominal 800 Hz.

This value is higher than usual (it is typically lower), but a perfectly precise ODR is not possible.

If your application needs an ODR more precise than this, you can follow two different paths:

  • By characterizing the ODR as you did, you have already found the difference in average ODR, so you can compensate for it by discarding some samples from the faster sensor (see the sketch after this list).
  • You can also try running at a higher ODR and see whether the situation improves; that way you at least have more samples and can decide which ones to use.
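
As a rough illustration of the first option (only a sketch; the function name and the exact strategy are up to you, and the measured average ODRs are the inputs): the idea is to drop a sample from the faster stream whenever it runs ahead of the slower stream's sample grid.

```c
#include <stdint.h>
#include <stddef.h>

/* Decimate the faster stream onto the time grid of the slower one.
 * odr_fast/odr_slow are the measured average ODRs (for example ~798.5 Hz and
 * ~790.6 Hz in your case). A fast sample is kept only when its timestamp has
 * reached the next slot of the slow grid; otherwise it is discarded.
 * Returns the number of samples kept. */
static size_t align_fast_stream(const int16_t (*in)[3], size_t n_in,
                                int16_t (*out)[3],
                                float odr_fast, float odr_slow)
{
    size_t kept = 0;
    float t_fast = 0.0f;   /* timestamp of the current fast sample */
    float t_slow = 0.0f;   /* timestamp of the next slow-grid slot  */

    for (size_t i = 0; i < n_in; i++) {
        if (t_fast + 1e-6f >= t_slow) {     /* fast sample reaches the slow slot: keep it */
            out[kept][0] = in[i][0];
            out[kept][1] = in[i][1];
            out[kept][2] = in[i][2];
            kept++;
            t_slow += 1.0f / odr_slow;
        }                                    /* otherwise this fast sample is dropped */
        t_fast += 1.0f / odr_fast;
    }
    return kept;
}
```

With your measured values this discards roughly one sample out of every ~100 from the faster sensor; if you need the streams truly aligned, a proper interpolation/resampling would of course be cleaner than simply dropping samples.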

hope this helps

Niccolò

Hi Niccolò @niccolo.ruffini​,

thank you for the reply and your effort.

You helped me a lot.

For your information, I tried a 1600 Hz ODR and the mean frequencies are 1579 Hz for the first sensor and 1596 Hz for the second one. That gives deviations of 1.31% and 0.25%, which is nearly the same as with the 800 Hz ODR; the deviations in hertz have simply doubled along with the ODR.

-Lukas