2022-05-30 08:45 PM
We are seeing large differences in gyro sensitivity between two batches of ISM330DHCX and LSM6DSR sensors (first batch purchased in 2020, second batch purchased in 2021). For a single 90 degree turn, the first batch exhibits several degrees of error, while the second batch exhibits up to 30 degrees of error. The firmware is identical in each case.
What gyro sensitivity error should we expect?
Are variances of this magnitude in gyro sensitivity between different production batches of sensors expected?
2022-06-03 08:16 AM
Hi @SLi.9bert,
please consider first that the gyroscope output is an angular velocity (in degrees per second), so in order to get angular data you have to integrate the signal over time (taking into account the ODR in use).
Typically, integration is a process that accumulates the zero rate level and noise errors, so if you don't "calibrate" the gyroscope (at least for the ZRL, which can vary from lot to lot and from one part number to another), you will see this error accumulation effect. You can do this in post-processing or by tuning the X_OFS_USR, Y_OFS_USR and Z_OFS_USR registers.
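A minimal sketch of that integration step, with illustrative variable names and assuming the output has already been converted to dps:

```c
/* Minimal sketch of gyro angle integration with zero rate level (ZRL)
 * removal; variable names are illustrative. gyro_dps is the gyro output
 * converted to degrees per second, dt is the sample period in seconds. */
float angle_deg = 0.0f;   /* accumulated angle                */
float zrl_dps   = 0.0f;   /* estimated zero rate level (bias) */

void on_gyro_sample(float gyro_dps, float dt)
{
    /* Without bias removal, zrl_dps * t accumulates directly as angle
     * error, which is why uncalibrated parts drift apart over a turn. */
    angle_deg += (gyro_dps - zrl_dps) * dt;
}
```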
So I suggest you check this calibration phase first.
-Eleon
2022-06-03 11:05 AM
The gyro zero rate level is calibrated continuously in firmware using the MotionGC library; when the unit is still, the gyro rate after bias removal is zero on all 3 axes, and we can see the MotionGC bias being updated several times a second.
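For reference, the per-sample bias update looks roughly like the sketch below (the MGC_input_t / MGC_output_t names are from the X-CUBE-MEMS1 MotionGC API as we recall it; field names should be checked against motion_gc.h):

```c
#include "motion_gc.h"

/* Sketch of the per-sample MotionGC bias update; acc in g, gyro in dps.
 * Struct field names are assumptions to be verified against motion_gc.h. */
void update_gyro_bias(float acc_g[3], float gyro_dps[3], float bias_dps[3])
{
    MGC_input_t  data_in;
    MGC_output_t gyro_bias;
    int bias_update = 0;  /* set by the library when the bias estimate changes */

    for (int i = 0; i < 3; i++) {
        data_in.Acc[i]  = acc_g[i];
        data_in.Gyro[i] = gyro_dps[i];
    }

    MotionGC_Update(&data_in, &gyro_bias, &bias_update);

    bias_dps[0] = gyro_bias.GyroBiasX;
    bias_dps[1] = gyro_bias.GyroBiasY;
    bias_dps[2] = gyro_bias.GyroBiasZ;
}
```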
The gyro (and accel) ODRs are 416Hz.
Raw data is sampled via FIFO; each sample contains 3 gyro axes, 3 accel axes and a timestamp; the timestamp is configured at 25 us resolution. delta_t is the difference between the current timestamp and the last timestamp, and it is stable at 0.0024 s on each batch of sensors (consistent with the 416 Hz ODR).
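For clarity, delta_t is derived from the raw FIFO timestamps roughly as follows (illustrative names):

```c
#include <stdint.h>

/* Sketch of the delta_t computation from raw FIFO timestamps; names are
 * illustrative. TS_RES_S is the configured timestamp resolution (25 us). */
#define TS_RES_S 25e-6f

float compute_delta_t(uint32_t ts_now, uint32_t ts_prev)
{
    /* Unsigned subtraction handles counter wraparound; at a 416 Hz ODR
     * this comes out to ~0.0024 s per sample. */
    return (float)(ts_now - ts_prev) * TS_RES_S;
}
```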
Before integration, MotionGC gyro biases are removed, and accelerometer data is corrected using a MotionAC correction matrix (if available).
Two separate fusion algorithms are used - the Kalman-filter based MotionFX and a Madgwick gradient-descent algorithm - so we can compare the differences. Both show the same sensitivity error: roughly 2% on the older ISM330DHCX chips, and roughly 6% on the newer batch with the same firmware.
The datasheet sensitivity error range is +/- 2%. Using the same firmware on both boards, we consistently see much higher sensitivity error on this newer batch, across several hundred tested boards. So this is the unexpected result.
Trying to isolate differences, the only changes between batches of boards are (a) a different batch of ISM330DHCX sensors and (b) a change in board production from HASL to ENIG plating (which should not impact iNEMO behavior).
If we manually calibrate a gyro sensitivity correction factor and apply it in firmware, the integrated gyro angle is very close to the expected value.
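The correction is essentially a per-axis scale factor applied to the gyro rates before integration; a sketch with hypothetical scale values:

```c
/* Sketch of the manual per-axis sensitivity correction applied before
 * integration. The scale values are hypothetical placeholders, e.g.
 * expected_angle / integrated_angle measured per axis. */
static const float gyro_scale[3] = { 1.02f, 1.02f, 1.06f };

void correct_gyro(float gyro_dps[3])
{
    for (int i = 0; i < 3; i++)
        gyro_dps[i] *= gyro_scale[i];
}
```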
We verified zero rate level calibration, as you suggested - what else could account for this difference in sensitivity error?
For example, could there be a difference between the actual and the reported timestamp duration between batches of sensors?