H3LIS331DL: Threshold levels

RDyla
Associate II

I have the FS bits set for +/-100g scale.

The acceleration data is a 12-bit signed value, meaning each LSB represents 0.049 g.

The threshold register, however, is only a 7-bit value. From what I understand, this is an unsigned magnitude level representing 0 to full scale, so each LSB represents 0.781 g?
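In other words, the two conversions I'm assuming look like this in C (a sketch only: ±100 g full scale, the left-justified 16-bit output of this sensor family, and helper names of my own):

```c
#include <stdint.h>

/* +/-100 g full scale: 200 g span over a 12-bit signed output */
#define DATA_G_PER_LSB  (200.0f / 4096.0f)   /* ~0.049 g per data LSB */
/* 7-bit unsigned threshold: 0x7F corresponds to full scale (100 g)   */
#define THS_G_PER_LSB   (100.0f / 128.0f)    /* ~0.781 g per THS LSB  */

/* Convert one axis from its two output registers (low byte, high byte).
 * The 12-bit result is left-justified in the 16-bit word, hence >> 4. */
static float axis_to_g(uint8_t lo, uint8_t hi)
{
    int16_t raw = (int16_t)(((uint16_t)hi << 8) | lo);
    return (raw >> 4) * DATA_G_PER_LSB;
}

/* Convert an INT1_THS register value to g. */
static float threshold_to_g(uint8_t ths)
{
    return (ths & 0x7F) * THS_G_PER_LSB;
}
```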

My issue is that when I get interrupts and read the data, sometimes the levels are below the threshold. For example:

Threshold register set to 3: 3 × 0.781 = 2.344 g.

An interrupt occurs and the maximum acceleration register value (on any one axis) is 37, which is 37 × 0.049 = 1.813 g?
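Worked through in code, the two figures above come out as follows (a standalone sketch; the closing comment only anticipates that one threshold step is much coarser than one data step):

```c
#include <stdio.h>

int main(void)
{
    const float data_g_per_lsb = 200.0f / 4096.0f;  /* ~0.049 g */
    const float ths_g_per_lsb  = 100.0f / 128.0f;   /* ~0.781 g */

    float threshold_g = 3  * ths_g_per_lsb;   /* 2.344 g */
    float reading_g   = 37 * data_g_per_lsb;  /* ~1.81 g */

    printf("threshold %.3f g, max reading %.3f g\n", threshold_g, reading_g);

    /* One threshold LSB spans 16 data LSBs (~0.78 g), so a reading this far
     * below the programmed level can still be within quantization.        */
    return 0;
}
```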

Just hoping for some confirmation that I'm understanding these values correctly.

Thank you for any input.

3 REPLIES

Eleon BORLINI
ST Employee (Accepted Solution)

Hi, yes, you're right. The selectable threshold is an unsigned 7-bit value and goes from 0x00 to 0x7F, where 0x7F corresponds to the full scale (e.g. ±100 g), meaning 0.781 g/LSB.

Remember to select the axis direction and sign in the INT1_CFG register (30h).
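A sketch of that interrupt configuration is below. The register addresses (30h, 32h) follow the H3LIS331DL register map, but the bit value 0x2A (ZHIE | YHIE | XHIE) and the write_reg() helper are assumptions to be checked against the datasheet and your own bus layer.

```c
#include <stdint.h>

#define REG_INT1_CFG  0x30   /* interrupt-1 configuration      */
#define REG_INT1_THS  0x32   /* interrupt-1 threshold (7-bit)  */

/* Platform-specific register write, assumed to exist elsewhere. */
extern void write_reg(uint8_t reg, uint8_t val);

static void int1_high_events_all_axes(uint8_t ths_lsb)
{
    /* ZHIE | YHIE | XHIE: "high" events on all three axes, OR-combined */
    write_reg(REG_INT1_CFG, 0x2A);
    /* 7-bit unsigned threshold, ~0.781 g per LSB at +/-100 g FS        */
    write_reg(REG_INT1_THS, ths_lsb & 0x7F);
}
```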

You say you receive an interrupt, but when you then read the output (are the read and the interrupt exactly synchronous?) you get a value lower than that threshold. Which ODR are you using?

P.S.: consider, however, that the data you read differs from the set threshold by less than one threshold LSB...

Thanks for the quick response.

Yes, I'm setting INT1_CFG to enable all three axes, high events only.

I'm running in low-power mode with a 10 Hz ODR.

Upon interrupt I first read all six registers containing the X/Y/Z values. Then I clear the interrupt by reading the INT1_SRC register. The time between the interrupt and the clear is about 300 µs (0.3 ms) total.
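For reference, that sequence looks roughly like the sketch below. OUT_X_L at 0x28 and INT1_SRC at 0x31 are from the register map; read_reg()/read_regs() are placeholders for my bus layer, and the 0x80 auto-increment bit on the sub-address is the usual convention for this family (worth double-checking for I²C vs. SPI).

```c
#include <stdint.h>

#define REG_OUT_X_L   0x28   /* OUT_X_L..OUT_Z_H occupy 0x28..0x2D              */
#define REG_INT1_SRC  0x31   /* reading this releases a latched interrupt       */
#define AUTO_INC      0x80   /* sub-address MSB enables register auto-increment */

/* Platform-specific bus helpers, assumed to exist elsewhere. */
extern uint8_t read_reg(uint8_t reg);
extern void read_regs(uint8_t reg, uint8_t *buf, uint8_t len);

/* Called from the INT1 handler (or the task it wakes). */
static void on_int1(void)
{
    uint8_t out[6];

    /* 1) Burst-read the six output registers (X, Y, Z low/high bytes). */
    read_regs(REG_OUT_X_L | AUTO_INC, out, sizeof out);

    /* 2) Read INT1_SRC; with the interrupt latched, this clears it.    */
    uint8_t src = read_reg(REG_INT1_SRC);
    (void)src;   /* its bits report which axis/direction triggered      */
}
```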

I understand the discrepancy is small, but because it seemed like a possible error, I just wanted to make sure I have these units correct.

Understood. Did you check whether the same discrepancy occurs in Normal Mode? Just to understand whether the change in ODR from 10 Hz to 50 Hz influences the discrepancy, and to confirm the 12-bit result.
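If useful, the CTRL_REG1 values for the two configurations being compared would look something like this; the PM[2:0]/DR[1:0]/Zen/Yen/Xen bit layout is taken from the H3LIS331DL datasheet, but please verify the encodings before relying on them.

```c
#include <stdint.h>

#define REG_CTRL_REG1  0x20

/* Platform-specific register write, assumed to exist elsewhere. */
extern void write_reg(uint8_t reg, uint8_t val);

/* CTRL_REG1 layout: PM[7:5] DR[4:3] Zen Yen Xen */
#define CTRL1_LP_10HZ      0xC7  /* 110 00 111: low-power mode, 10 Hz, XYZ enabled  */
#define CTRL1_NORMAL_50HZ  0x27  /* 001 00 111: normal mode, 50 Hz ODR, XYZ enabled */

static void switch_to_normal_50hz(void)
{
    /* Re-run the test in normal mode at 50 Hz to compare the readings. */
    write_reg(REG_CTRL_REG1, CTRL1_NORMAL_50HZ);
}
```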