
Wearable with LSM6DSMTR

ahboyboy
Associate


Could someone provide insights on how to identify the following user actions using a wearable integrated with LSM6DSMTR:

  1. Distinguishing between lying down, sitting, and standing.
  2. Tracking step count.
  3. Detecting if the user is walking on an incline or stairs.
  4. Estimating the number of hours of sleep.
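For point 1, a common starting approach is to low-pass-filter the accelerometer and look at the torso tilt relative to gravity. Note that tilt alone reliably separates lying from upright, but sitting vs. standing both look "upright" on a chest-worn sensor, so that split usually needs extra context (e.g. a thigh-mounted sensor or transition detection). A minimal sketch, assuming samples in g and the sensor X axis pointing head-to-feet when standing (adjust for your mounting; the 60° threshold is illustrative):

```python
import math

def torso_tilt_deg(ax, ay, az):
    # Angle between the measured gravity vector and the sensor's X axis
    # (assumed head-to-feet when the wearer stands).
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    return math.degrees(math.acos(max(-1.0, min(1.0, ax / norm))))

def posture(ax, ay, az, tilt_lying_deg=60.0):
    """'lying' vs 'upright' from one low-pass-filtered sample (units: g)."""
    return "lying" if torso_tilt_deg(ax, ay, az) > tilt_lying_deg else "upright"
```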

Additionally, I'm seeking advice on whether it's preferable to process these actions at the edge or to transmit raw data to the cloud for processing. Thanks
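On the edge-vs-cloud question, it is worth noting how little compute basic step counting needs: a threshold-plus-refractory peak detector on the acceleration magnitude runs easily on a small MCU (and the LSM6DSM's built-in pedometer does this in hardware). A hedged offline sketch on logged data, with illustrative thresholds:

```python
def count_steps(mag, fs=50.0, thresh=1.15, min_gap_s=0.3):
    """Count steps in an accelerometer-magnitude trace (units: g).

    A sample counts as a step when it is a local maximum above `thresh`
    and at least `min_gap_s` after the previous detected step. Values are
    illustrative; real pedometers use more robust filtering.
    """
    min_gap = int(min_gap_s * fs)
    steps, last = 0, -min_gap
    for i in range(1, len(mag) - 1):
        if (mag[i] > thresh and mag[i] >= mag[i - 1] and mag[i] > mag[i + 1]
                and i - last >= min_gap):
            steps += 1
            last = i
    return steps
```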

2 REPLIES
Federica Bossi
ST Employee

Hi @ahboyboy ,

Have you already looked at AN4987? It describes the LSM6DSM's embedded functions, including the step counter and activity/inactivity detection.

In order to give better visibility on the answered topics, please click on 'Accept as Solution' on the reply which solved your issue or answered your question.
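The activity/inactivity function mentioned above can also feed a rough sleep estimate (point 4): count only long contiguous inactive runs as sleep, so short pauses during the day are ignored. A crude host-side sketch, assuming a per-second activity flag (1 = moving) and an illustrative 30-minute minimum block; real actigraphy algorithms are considerably more involved:

```python
def sleep_hours(activity_flags, fs=1.0, min_block_s=1800):
    """Estimate sleep hours from an activity-flag sequence (1 = moving).

    Contiguous inactive runs longer than `min_block_s` are counted as
    sleep; shorter pauses are ignored. Thresholds are illustrative.
    """
    total, run = 0, 0
    for f in list(activity_flags) + [1]:  # sentinel flushes the last run
        if f == 0:
            run += 1
        else:
            if run >= min_block_s * fs:
                total += run
            run = 0
    return total / fs / 3600.0
```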
ahboyboy
Associate

Thank you for your response. After reviewing AN4987, I understand that we can extract steps and activity/inactivity information. However, I would like to further classify the accelerometer and gyroscope raw data into specific postures such as lying down, sitting, and standing.
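For classifying raw data into postures, a typical pipeline splits the stream into fixed windows and computes a few features per window (e.g. mean and standard deviation per axis) before feeding a classifier trained on labeled recordings. A minimal sketch of the feature step, assuming windows of (ax, ay, az) tuples in g (the feature choice here is illustrative, not ST's):

```python
import statistics

def window_features(samples):
    """Mean and standard deviation per axis for one window of samples.

    `samples` is a list of (ax, ay, az) tuples in g covering one window
    (e.g. 2 s at 26 Hz). Returns [mean_x, std_x, mean_y, std_y, mean_z, std_z].
    """
    feats = []
    for axis in zip(*samples):
        feats.append(sum(axis) / len(axis))
        feats.append(statistics.pstdev(axis))
    return feats
```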

I noticed that ST supports raw data classification, as demonstrated in this video: https://www.youtube.com/watch?v=4btLVDR68TQ

To achieve this, I am considering using the following tools:

However, since we have integrated the LSM6DSMTR with another MCU in a wearable device without a USB connection, I am wondering whether I should store the raw data and transfer it via Bluetooth. Are there any alternative methods to collect labeled data?
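One way to build a labeled dataset from Bluetooth-streamed raw logs is to keep a separate, hand-recorded timeline of (start_time, label) entries and join it to the sample timestamps afterwards. A sketch of that join, purely illustrative (the timeline format is an assumption, not an ST tool's):

```python
def label_samples(timestamps, labels_timeline):
    """Attach activity labels to logged sample timestamps.

    `labels_timeline` is a sorted list of (start_time, label) pairs; each
    sample receives the label with the most recent start time <= its own.
    """
    out, j = [], 0
    for t in timestamps:
        while j + 1 < len(labels_timeline) and labels_timeline[j + 1][0] <= t:
            j += 1
        out.append(labels_timeline[j][1])
    return out
```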