
LIS3MDL zero gauss level and factory offset

CCamp.1
Associate II

I'm looking for a 3-axis AMR sensor that is stable and factory calibrated for both gain and offset, say ±1 µT, so that I don't need to do any per-chip calibration.

The datasheet for the LIS3MDL specifies the zero-gauss level as ±1 gauss (100 µT), and section 3.3 states that the part is factory calibrated. The datasheet says nothing about the registers involved or about the specification of the factory calibration. If it is still ±1 gauss after calibration, that's terrible, but the datasheet doesn't make this clear.

The app note for the part adds some information, giving the addresses and descriptions of the offset calibration registers, but it does not mention the factory calibration, how accurate it is, or even whether these values are automatically subtracted from the output.

The goal is a decent compass in an earth field of 60 µT or so without any calibration other than what the factory has done.
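To put numbers on that, here's a minimal sketch of how an uncorrected offset skews a heading (my own illustrative arithmetic, not anything from ST's documents):

#include <math.h>

/* Illustrative only: with a horizontal earth field of field_ut (about
 * 60 uT where I am), an uncorrected offset of offset_ut on one
 * horizontal axis skews the computed heading by roughly
 * atan(offset / field). */
static float heading_error_deg(float offset_ut, float field_ut)
{
    return atan2f(offset_ut, field_ut) * (180.0f / 3.14159265f);
}

/* heading_error_deg(100.0f, 60.0f) is about 59 degrees: a part at the
 * +-1 gauss datasheet limit would be useless as an uncalibrated compass. */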

I also note that the datasheet does not actually state whether this is an AMR or a Hall-effect sensor, which have different stability behaviors. Nor does it indicate, assuming it's an AMR, how often the bridge is re-aligned with the coils. Does that take a power cycle, or a reset, or is it done on every reading?

Best regards,

Craig

CCamp.1
Associate II

OK, so I bought a dev board and wrote some code.

The "factory calibration" that I read out of the offset registers documented in the app note is 0, 0, 0.

Wow, I got a perfect part!

Except that it clearly has offsets on all axes, as is easily shown by rotating it. So I'm not sure what "factory calibration" means, but it's not useful for eliminating the offset calibration requirement.

So I got a second part to check for any level of consistency. The two that I checked both show 0, 0, 0 for offset, and they vary by 5 to 20 µT when reading the same field, depending on the axis, so offset calibration is clearly required.
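So I'll presumably end up doing the usual thing: tumble the part through as many orientations as possible, track the per-axis min/max, and take the midpoint as the hard-iron offset. A minimal sketch of that, for anyone following along (my code, nothing to do with the factory calibration):

#include <float.h>

typedef struct {
    float min[3], max[3];
} hi_cal_t;

static void hi_cal_init(hi_cal_t *c)
{
    for (int i = 0; i < 3; i++) { c->min[i] = FLT_MAX; c->max[i] = -FLT_MAX; }
}

/* Feed each reading (uT) while slowly rotating the sensor through as
 * many orientations as possible. */
static void hi_cal_update(hi_cal_t *c, const float mag_ut[3])
{
    for (int i = 0; i < 3; i++) {
        if (mag_ut[i] < c->min[i]) c->min[i] = mag_ut[i];
        if (mag_ut[i] > c->max[i]) c->max[i] = mag_ut[i];
    }
}

/* The hard-iron offset is the midpoint of the observed range per axis. */
static void hi_cal_offsets(const hi_cal_t *c, float off_ut[3])
{
    for (int i = 0; i < 3; i++) off_ut[i] = (c->min[i] + c->max[i]) / 2.0f;
}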

I have no idea what the statements in the datasheet or app note mean by factory calibration.

Anyone have a clue?

Craig

Eleon BORLINI
ST Employee

Hi Craig @CCamp.1,

Well, factory calibration refers to the calibration performed during production testing.

Many physical and environmental factors can affect an AMR magnetic sensor after the testing phase.

The suggestion is to always calibrate the sensor, especially for precision and long-duration applications. It's not strictly necessary to physically move the magnetometer to calibrate it: you can use pre-built libraries such as the MotionMC magnetometer calibration library in the X-CUBE-MEMS1 package, at least for the hard-iron (HI) and scale-factor coefficient corrections. You only need the following functions (platform independent):

void MotionMC_Initialize(int sampletime, unsigned short int enable);
void MotionMC_Update(MMC_Input_t *data_in);
void MotionMC_GetCalParams(MMC_Output_t *data_out);
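As a minimal usage sketch (the MMC_Input_t / MMC_Output_t field names and units below are my reading of the package headers; please verify against motion_mc.h and the MotionMC user manual in your copy of X-CUBE-MEMS1):

#include "motion_mc.h"  /* from the X-CUBE-MEMS1 package */

/* Hypothetical platform helpers standing in for your magnetometer
 * driver and millisecond tick source. */
extern void read_mag_sample(float mag[3]);
extern int get_ms_tick(void);

#define SAMPLE_TIME_MS 20  /* must match the rate at which samples are fed */

void compass_cal_loop(void)
{
    MotionMC_Initialize(SAMPLE_TIME_MS, 1 /* enable */);

    for (;;) {
        MMC_Input_t in;
        MMC_Output_t out;

        read_mag_sample(in.Mag);      /* raw magnetometer reading */
        in.TimeStamp = get_ms_tick();

        MotionMC_Update(&in);         /* feed the calibration algorithm */
        MotionMC_GetCalParams(&out);  /* fetch the current estimates */

        /* Once the quality indicator in the output reports a good fit,
         * subtract the HI bias from the raw readings (and apply the
         * scale-factor matrix). */
    }
}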

-Eleon

Thank you for your response, Eleon.

While I understand that it's not strictly necessary to move the magnetometer to execute the calibration routines, I don't believe the result is meaningful unless you do: there just isn't any information about the hard- or soft-iron fields without readings from a variety of different orientations.

The root of my question is what is meant by "factory calibration", and how good it is.

From what I see on these parts, it means nothing more than that calibration is used to meet the datasheet tolerances for offset and gain, which are very loose. That means all parts must be user calibrated for meaningful results, like every other part out there, and the datasheet values reflect the performance and tolerance after the factory calibration is applied.

Not unexpected, I guess. I was hopeful that the reference to factory calibration meant this part would offer reasonable accuracy without the usual need to calibrate the sensor by tumbling or in a zero-gauss chamber, but that just isn't the case.

Best regards,

Craig