2019-11-11 12:12 AM
I ran the temperature test and got the graph below. The axis data always jumps at some temperature between 10 and 30 °C.
Settings: ±2 g full scale, high-resolution mode, ODR = 100 Hz (a minimal configuration sketch is shown after this post).
Has anyone else encountered this problem? The sensitivity change stated in the datasheet is 0.01 %/°C.
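For reference, here is a minimal sketch of the register setup those settings imply, using register addresses and bit values from the LIS3DH datasheet; the i2c_write_reg() helper is hypothetical and should be replaced with your platform's I2C (or SPI) write routine:

```c
#include <stdint.h>

#define LIS3DH_ADDR      0x18  /* 0x19 if SA0 is tied high */
#define LIS3DH_CTRL_REG1 0x20
#define LIS3DH_CTRL_REG4 0x23

/* Hypothetical platform hook: write one byte to a device register. */
extern void i2c_write_reg(uint8_t dev, uint8_t reg, uint8_t val);

void lis3dh_config(void)
{
    /* ODR[3:0] = 0101 -> 100 Hz, LPen = 0, X/Y/Z enabled -> 0x57 */
    i2c_write_reg(LIS3DH_ADDR, LIS3DH_CTRL_REG1, 0x57);
    /* BDU = 1 (block data update), FS = 00 -> +/-2 g, HR = 1 -> 0x88 */
    i2c_write_reg(LIS3DH_ADDR, LIS3DH_CTRL_REG4, 0x88);
}
```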
2019-11-11 12:56 AM
Hi @Chelsea, which device (part number) are you testing? Regards
2019-11-17 11:40 PM
Hi @Eleon BORLINI, I am testing the LIS3DH sensor.
2019-11-18 12:58 AM
To better understand the issue: do you perform the test starting from a low temperature (-5 °C) and slowly increasing it, or did you start from 25 °C, where the jump is most evident, and sweep up and down at different times? How many samples did you acquire for each plotted point, and do you have a temperature-vs-time graph synchronized with that data graph? And, last question, what are your DUT's Vdd and VddIO?
Btw, please consider that the 0.01 %/°C sensitivity-change-vs-temperature value declared in the datasheet is a typical value, valid for FS = ±2 g and high-performance mode (with an appropriate ODR) at Vdd = 2.5 V, because the factory calibration is performed under these conditions. Regards
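For perspective, here is a back-of-envelope sketch of what that 0.01 %/°C figure implies (the reading and temperature values are illustrative assumptions, not data from this thread): over the 0-30 °C span discussed above, sensitivity should drift smoothly by at most ~0.3 %, i.e. a few mg on a 1 g reading, so a visible step cannot be explained by that drift alone.

```c
#include <stdio.h>

int main(void)
{
    const double sens_drift = 0.0001;   /* 0.01 %/degC, typical, FS = +/-2 g */
    const double t_cal      = 25.0;     /* factory calibration temperature   */
    double raw_g            = 1.0000;   /* example reading (assumed value)   */
    double t                = 0.0;      /* example ambient temp, degC        */

    /* First-order compensation: divide out the sensitivity drift. */
    double comp_g = raw_g / (1.0 + sens_drift * (t - t_cal));
    printf("raw = %.4f g, compensated = %.4f g\n", raw_g, comp_g);
    /* At t = 0 degC the correction is only ~0.25 %. */
    return 0;
}
```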
2019-11-18 03:55 AM
The testing process is listed below:
Actually, we just wonder why there is such a big jump within the "common" temperature range; it usually happens within a 5 °C change somewhere between 0 and 30 °C. The axis data shows the same trend (increasing or decreasing) as temperature increases.
We have run the test 20 times, and every DUT shows exactly one jump, each within its own temperature range.
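Regarding the synchronized temperature-vs-time graph @Eleon BORLINI asked about: below is a minimal logging sketch, reusing the hypothetical I2C helpers from the earlier configuration sketch, that reads each X/Y/Z sample together with the LIS3DH's internal temperature channel (ADC3). Note the internal sensor reports a relative value only (about 1 digit/°C), so an external reference probe is still preferable for the absolute temperature axis.

```c
#include <stdint.h>
#include <stdio.h>

#define LIS3DH_ADDR         0x18
#define LIS3DH_TEMP_CFG_REG 0x1F
#define LIS3DH_OUT_ADC3_L   0x0C
#define LIS3DH_OUT_X_L      0x28
#define LIS3DH_AUTO_INC     0x80  /* MSB of sub-address enables auto-increment */

/* Hypothetical platform hooks, as in the configuration sketch above. */
extern void i2c_write_reg(uint8_t dev, uint8_t reg, uint8_t val);
extern void i2c_read_regs(uint8_t dev, uint8_t reg, uint8_t *buf, uint8_t len);

void log_sample(void)
{
    uint8_t xyz[6], adc3[2];

    /* Enable the auxiliary ADC and the internal temperature sensor
     * (normally done once at init; BDU was already set in CTRL_REG4). */
    i2c_write_reg(LIS3DH_ADDR, LIS3DH_TEMP_CFG_REG, 0xC0);

    /* Read X/Y/Z: 12-bit left-justified in HR mode, 1 mg/digit at +/-2 g. */
    i2c_read_regs(LIS3DH_ADDR, LIS3DH_OUT_X_L | LIS3DH_AUTO_INC, xyz, 6);
    int16_t x = (int16_t)((xyz[1] << 8) | xyz[0]) >> 4;
    int16_t y = (int16_t)((xyz[3] << 8) | xyz[2]) >> 4;
    int16_t z = (int16_t)((xyz[5] << 8) | xyz[4]) >> 4;

    /* Read the temperature channel; the relative value is in the high byte. */
    i2c_read_regs(LIS3DH_ADDR, LIS3DH_OUT_ADC3_L | LIS3DH_AUTO_INC, adc3, 2);
    int8_t t_rel = (int8_t)adc3[1];

    printf("x=%d mg y=%d mg z=%d mg t_rel=%d degC\n", x, y, z, t_rel);
}
```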