I am using the LIS2DH12 and get the expected values in normal mode, but when I switch to high-resolution mode the scaling is not what I expect.
My settings are FS=11, BLE=0, LPen=0, HR=0 (10-bit normal mode, ±16 g full scale). I expect the sensitivity to be 48 mg/digit, and sure enough, if I right-align my 10-bit value and multiply by 48 I get about 1054 mg on the z-axis.
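For reference, this is essentially what I am doing in normal mode (a minimal C sketch; out_l and out_h are just placeholder names for the raw OUT_Z_L/OUT_Z_H bytes I have already read over the bus, and an arithmetic right shift is assumed for negative values):

```c
#include <stdint.h>

/* Normal mode (10-bit), FS = +/-16 g: the data is left-justified in the
   16-bit output registers, so shift right by 6 to right-align, then scale
   by the 48 mg/digit sensitivity from the datasheet. */
int32_t lis2dh12_normal_16g_mg(uint8_t out_l, uint8_t out_h)
{
    int16_t raw = (int16_t)(((uint16_t)out_h << 8) | out_l); /* BLE = 0: low byte first */
    return (raw >> 6) * 48; /* ~1054 mg on Z at rest */
}
```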
When I set HR=1 with no other changes, I expect the sensitivity to be 12 mg/digit. But when I right-align the 12-bit z-axis reading and multiply by 12, I get a value on the order of 16300 mg.
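And this is the high-resolution path I expected to work (same placeholder names; 12-bit data left-justified, so shift by 4 instead of 6):

```c
#include <stdint.h>

/* High-resolution mode (12-bit), FS = +/-16 g: data is left-justified,
   so shift right by 4 to right-align, then scale by the 12 mg/digit
   sensitivity from the datasheet. */
int32_t lis2dh12_hr_16g_mg(uint8_t out_l, uint8_t out_h)
{
    int16_t raw = (int16_t)(((uint16_t)out_h << 8) | out_l);
    return (raw >> 4) * 12; /* I expected ~1000 mg here, but see ~16300 */
}
```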
Could someone suggest what I might be doing wrong?
Incidentally, section 4.2.3 of AN5005 gives an example of the expected reading for 1 g when set to FS of ±2 g in high-resolution mode. With BLE=0 the quoted output bytes are 00h 40h, i.e. 4000h left-justified, which right-aligns to 400h = 1024 digits. By my calculation, with a sensitivity of 1 mg/digit (Table 3, section 3), that equates to 1024 mg, not 1 g as documented. Would others agree with that?
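Spelling that check out (a minimal C sketch; the byte values are the ones quoted in the app note):

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* AN5005 section 4.2.3 example: OUT_L = 00h, OUT_H = 40h, BLE = 0,
       FS = +/-2 g, high-resolution mode (1 mg/digit per Table 3). */
    int16_t raw    = (int16_t)((0x40 << 8) | 0x00); /* 0x4000 = 16384 */
    int16_t counts = raw >> 4;                      /* 0x0400 = 1024 digits */
    printf("%d mg\n", counts * 1);                  /* prints 1024, not 1000 */
    return 0;
}
```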
