
Calibration Values LSM6DS33/LSM9DS1

NSeid.1
Associate

Hello everyone

I'm currently working on a research project concerning IMUs, among them an LSM6DS33 and an LSM9DS1 module. The datasheets for the LSM6DS33 and LSM9DS1 mention a factory calibration (e.g. at the bottom of page 40). Since nothing else is said about these calibration values, I have several questions:

  • What exactly does this factory calibration entail, and does it change the sensor output?
  • If so, how do the calibration values affect the sensor output, and which model is used for the calibration?
  • Is it at all possible to obtain the uncalibrated sensor signals, and if so, how?
  • It is mentioned that the registers containing the factory calibration values should not be changed, but not that they can't be changed. Is it possible (at my own risk, of course) to change or reset these calibration values? The datasheet says they will be restored at the next boot anyway.
  • The register map doesn't seem to list the address of these calibration values. Where are they stored?

I realize these are pretty specific questions, and the answers might not be public in the datasheet for certain reasons. However, I would appreciate it if anyone with knowledge of these sensors' calibration values could shed some light on the subject.

Thanks a lot in advance!

Eleon BORLINI
ST Employee

Hi @NSeid.1,

What is the specific purpose of your request?

I'll try to answer your questions, at least for the parts that can be publicly disclosed.

First, the production tester calibrates (i.e. minimizes) the accelerometer and gyroscope offsets (zero-g level and zero-rate level). It also applies a compensation scheme that accounts for the change in sensitivity versus temperature.

These calibration values are applied automatically to the raw data coming from the sensor, so the data output is already compensated.

It is in theory possible, but strongly discouraged, to "un-calibrate" the sensor by overwriting the calibration values in volatile memory. When the device is reset, for example after a power off-on cycle, the volatile memory is cleared and the values stored in flash take over again.

The calibration values are stored in the reserved registers (for the LSM6DS33, for example, they are registers 02h-05h; see datasheet p. 38). But please pay attention: one issue is that, among these registers, there may be specific, rarely used bits coding test modes that, for example, change the polarity of a specific pad or change the function of a pin (e.g. from open drain to push-pull).
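
For reference, a minimal sketch of how one could at least inspect (read-only) these reserved registers over I2C. It assumes a Linux host with the i2c-dev interface (e.g. a Raspberry Pi on bus 1), the smbus2 Python package, and the slave address 0x6B used when SA0 is pulled high; writing these registers is deliberately not shown.

```python
# Minimal sketch: dump the reserved registers 02h-05h of an LSM6DS33 over I2C.
# Assumptions: Linux with i2c-dev on bus 1 (e.g. Raspberry Pi), smbus2
# installed, and slave address 0x6B (SA0 high). Read-only on purpose:
# writing these reserved registers is at your own risk.
from smbus2 import SMBus

LSM6DS33_ADDR = 0x6B   # 0x6A if SA0 is pulled low
WHO_AM_I = 0x0F        # should read 0x69 on an LSM6DS33

with SMBus(1) as bus:
    whoami = bus.read_byte_data(LSM6DS33_ADDR, WHO_AM_I)
    print(f"WHO_AM_I = 0x{whoami:02X}")
    for reg in range(0x02, 0x06):   # reserved registers 02h..05h
        val = bus.read_byte_data(LSM6DS33_ADDR, reg)
        print(f"reg 0x{reg:02X} = 0x{val:02X}")
```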

If my reply answered your question, please click on Select as Best at the bottom of this post. This will help other users with the same issue to find the answer faster.

-Eleon

NSeid.1
Associate

First of all, thanks for the quick response, @Eleon BORLINI. It has already answered a lot of my questions, so I selected it as the best response.

The purpose of my request is to better understand the sensor signal that I am getting from the accelerometer and gyroscope.

As mentioned in the datasheets, the zero-g and zero-rate level can change after mounting the sensor on a circuit board. Since I am using third-party breakout boards like this one from Pololu, an error in the zero-g and zero-rate level is present. I would like to calibrate out these errors (and possibly further errors) to improve precision.

For the zero-rate level of the gyroscope this seems straightforward: the factory calibration presumably just measures an offset for each gyroscope axis in a steady state with no angular rate present. These calibration values are then added to the sensor signal so that the mean output in a steady state is reduced to zero. If this is in fact just a constant offset, it should be possible to measure a new offset and apply it to the sensor signal afterwards to recalibrate (no need to reset the calibration values), as in the sketch below.
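
For what it's worth, a minimal sketch of such a software recalibration, assuming a hypothetical read_gyro() helper that returns one (x, y, z) sample in degrees per second and a sensor held perfectly still during sampling:

```python
# Sketch of a software zero-rate recalibration for the gyroscope.
# read_gyro() is a hypothetical helper returning one (x, y, z) sample
# in degrees per second; the sensor must be perfectly still while sampling.
import time

def estimate_gyro_offset(read_gyro, n_samples=500, delay_s=0.002):
    """Average n samples at rest; the mean is the residual zero-rate offset."""
    sums = [0.0, 0.0, 0.0]
    for _ in range(n_samples):
        x, y, z = read_gyro()
        sums[0] += x
        sums[1] += y
        sums[2] += z
        time.sleep(delay_s)
    return [s / n_samples for s in sums]

def correct_gyro(sample, offset):
    """Subtract the measured offset from one raw sample."""
    return [v - o for v, o in zip(sample, offset)]
```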

For the zero-g level it doesn't seem that simple, since the acceleration measured by the sensor in a steady state is 1 g rather than zero. Depending on how the factory calibration affects the signal, this could interfere with any sensor error model and further calibration. As you mentioned, there is also a sensitivity calibration and temperature correction, so this might be a problem. If, however, the zero-g calibration is also just an offset chosen to achieve 0 g on the X and Y axes and 1 g on the Z axis on a horizontal surface (as mentioned under Terminology in the datasheet), then it shouldn't interfere with further calibration; a recalibration would then look like the sketch below.
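
The same idea, under the assumption that the zero-g compensation is likewise a per-axis offset; the only difference is that the expected rest reading on a horizontal surface is (0, 0, +1) g rather than zero (read_accel() is again a hypothetical helper returning samples in g):

```python
# Sketch of an accelerometer zero-g offset estimate on a horizontal surface.
# read_accel() is a hypothetical helper returning one (x, y, z) sample in g.
# The expected rest reading is (0, 0, +1) g, so the offset is mean - expected.
def estimate_accel_offset(read_accel, n_samples=500):
    sums = [0.0, 0.0, 0.0]
    for _ in range(n_samples):
        x, y, z = read_accel()
        sums[0] += x
        sums[1] += y
        sums[2] += z
    mean = [s / n_samples for s in sums]
    expected = [0.0, 0.0, 1.0]          # gravity along +Z, board level
    return [m - e for m, e in zip(mean, expected)]
```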

I assume the compensation for the sensitivity change uses the built-in temperature sensor?

Overall I'm just trying to find out whether the factory calibration prevents identifying sensor error parameters like bias, axis misalignment, or scale factors, i.e. the parameters of a linear error model like the one sketched below.
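
For reference, the linear error model usually targeted by such an identification treats the raw reading as a scale/misalignment matrix applied to the true signal plus a bias, raw = M·true + b; a pure offset compensation at the factory would only shift the bias term b. A sketch with purely illustrative (made-up) parameter values:

```python
# Sketch of the common linear IMU error model:
#   raw = M @ true + b,  so  true = inv(M) @ (raw - b)
# where M combines scale factors (diagonal) and axis misalignment
# (off-diagonal) and b is the bias. Values below are purely illustrative.
import numpy as np

M = np.array([[ 1.02,  0.003, -0.001],   # scale + misalignment (example values)
              [ 0.002, 0.98,   0.004],
              [-0.001, 0.001,  1.01]])
b = np.array([0.012, -0.034, 0.021])      # bias in sensor units (example values)

M_inv = np.linalg.inv(M)

def correct(raw):
    """Invert the error model to recover the corrected reading."""
    return M_inv @ (np.asarray(raw) - b)

print(correct([0.05, -0.02, 1.03]))       # e.g. a near-level accel sample
```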

Thanks again for your help!

Eleon BORLINI
ST Employee

Hi @NSeid.1,

The compensation in temperature for these two sensors is performed at the production-tester level, on the basis of a correlation matrix from the validation lab. The data are then stored in the calibration registers. In any case, this variation is small across the temperature range.

For the accelerometer offset compensation, you could perform it in software (or use a high-pass filter if you are not interested in the DC acceleration), after first characterizing it in a steady condition. This is useful in particular to compensate for distortion introduced mainly by the assembly process, especially the soldering phase; see the sketch below.
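
As a rough illustration of the high-pass option, a one-pole filter that removes the DC component (gravity plus residual offset) from one accelerometer axis; the sample rate and cutoff below are assumptions, not values from the datasheet:

```python
# Sketch of a one-pole high-pass filter removing the DC component
# (gravity + residual offset) from one accelerometer axis.
# fs and fc are assumptions: 104 Hz is one of the LSM6DS33 ODRs,
# and 0.5 Hz is an arbitrary example cutoff.
import math

fs = 104.0                 # sample rate in Hz (assumed ODR)
fc = 0.5                   # cutoff in Hz (assumption)
alpha = 1.0 / (1.0 + 2.0 * math.pi * fc / fs)

def make_highpass():
    prev_x = prev_y = 0.0
    def step(x):
        nonlocal prev_x, prev_y
        y = alpha * (prev_y + x - prev_x)   # y[n] = a*(y[n-1] + x[n] - x[n-1])
        prev_x, prev_y = x, y
        return y
    return step

hp = make_highpass()
for sample in (1.01, 1.02, 1.00, 0.99, 1.01):   # near-1g raw Z samples
    print(hp(sample))
```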

For the parameters not explicitly mentioned in the datasheet, some characterization data might be shared, but with all the usual caveats.

-Eleon