Can anyone explain to me how to calculate the correct acceleration values in mg for various resolutions and scales?
For example, I am currently reading acceleration values with the ODR bits set to 0110, which is HR / Normal / Low-power mode (200 Hz). The full-scale selection is set to +/-2 g for this example.
As both the LPen and the HR bits are set to 0, I am in Normal mode with 10-bit data output. If I shift the values right by 4 (>> 4), I get values that seem correct: for example, about 1000 mg (= 1 g) on the z-axis, which makes sense, I guess, since that is 1 * gravitational acceleration.
On the right of the table above it says: +/- 2 g [mg/digit]. What does that mean? How do I convert the values I read into mg, not only for the +/-2 g full scale, but also for +/-4 g, +/-8 g, and +/-16 g?