2023-08-06 07:44 AM
Hi,
I have two LPS28DFW sensors in my application. Both the absolute values and the difference between the two readings matter here. I noticed a difference of around 30 Pa between them. Is it reasonable to assume that this difference will stay roughly constant in the future? I tested it for around 5000 seconds (see the attached scatter plot and histogram of the difference) and it was quite stable (the sensor temperature was ~31°C in both cases). I am also going to test over a longer time period, but I'd like to know from a theoretical standpoint whether I can assume the difference to be roughly constant, and whether I can use it as a "factory calibration" of my application to improve accuracy.
Thanks for your help.
2023-08-06 07:56 AM
Per the datasheet, the devices are factory calibrated, so if they're reading 30 Pa apart, the readings have shifted between factory calibration and now.
The datasheet specifies ±50 Pa of soldering drift, which is in line with what you're seeing, and ±100 Pa of drift per year. These are specified as "typical" values, not maximums, so that's roughly how you should expect the devices to perform long term.
2023-08-06 12:10 PM
Thanks for the information. Is there a recommended calibration procedure I can implement to improve accuracy? Is correcting the offset/bias enough, or is a more complicated temperature-dependent calibration necessary? In any case, what would a function that applies the calibration parameters look like?
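To make the question concrete, here is a minimal sketch of what I have in mind for a constant-offset (one-point) calibration, assuming the offset is the only correction needed. All names are placeholders of my own, not from any ST driver:

```c
#include <stdio.h>

/* Placeholder struct: holds the calibration constant determined once,
 * e.g. during factory calibration of the finished application. */
typedef struct {
    float offset_pa;   /* pressure offset to subtract, in Pa */
} lps28dfw_cal_t;

/* Derive the offset from an averaged raw reading and a reference
 * pressure taken at the same time under stable conditions. The
 * reference could be a trusted gauge, or the other sensor if only
 * the difference between the two matters. */
lps28dfw_cal_t calibrate_offset(float raw_avg_pa, float reference_pa)
{
    lps28dfw_cal_t cal = { .offset_pa = raw_avg_pa - reference_pa };
    return cal;
}

/* Apply the stored calibration to every subsequent raw sample. */
float apply_cal(const lps28dfw_cal_t *cal, float raw_pa)
{
    return raw_pa - cal->offset_pa;
}

int main(void)
{
    /* Example: sensor B read 30 Pa above sensor A (used as reference). */
    lps28dfw_cal_t cal_b = calibrate_offset(101355.0f, 101325.0f);

    /* A later raw sample of 101360.0 Pa is corrected to 101330.0 Pa. */
    printf("corrected: %.1f Pa\n", apply_cal(&cal_b, 101360.0f));
    return 0;
}
```

Would a constant subtraction like this be sufficient, or would offset_pa itself need to be a function of temperature?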