Wearable with LSM6DSMTR
2024-04-18 7:43 AM
Could someone provide insights on how to identify the following user actions using a wearable that integrates the LSM6DSMTR:
- Distinguishing between lying down, sitting, and standing.
- Tracking step count.
- Detecting if the user is walking on an incline or stairs.
- Estimating the number of hours of sleep.
Additionally, I'm seeking advice on whether it's preferable to process these actions at the edge or to transmit raw data to the cloud for processing. Thanks.
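For reference, a common first pass at the posture question uses only the static gravity vector from the accelerometer: lying vs. upright can often be separated by tilt alone, while sitting vs. standing usually needs extra features (posture transitions, a barometer, etc.). A minimal sketch, where the body-vertical axis convention and the threshold values are assumptions that must be tuned on labeled data:

```python
import math

def posture_from_accel(ax, ay, az):
    """Classify gross posture from one accelerometer sample (in g).

    Assumes the device's x-axis is roughly vertical when the wearer is
    upright; thresholds are illustrative, not validated. Sitting vs.
    standing cannot be separated from static tilt alone.
    """
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    if norm == 0:
        return "unknown"
    # Angle between the assumed body-vertical axis and gravity.
    tilt_deg = math.degrees(math.acos(max(-1.0, min(1.0, ax / norm))))
    if tilt_deg < 40:
        return "upright"   # standing or sitting
    elif tilt_deg > 65:
        return "lying"
    return "transition"
```

A sketch like this only works on quasi-static windows; during walking the dynamic acceleration swamps the gravity estimate, so samples are typically low-pass filtered or gated on variance first.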
2024-04-19 8:20 AM
2024-04-21 6:56 AM
Thank you for your response. After reviewing AN4987, I understand that we can extract steps and activity/inactivity information. However, I would like to further classify the accelerometer and gyroscope raw data into specific postures such as lying down, sitting, and standing.
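For reference, the step counter described in AN4987 is an embedded function that only needs a few register writes to enable. A minimal sketch, with register addresses taken from the LSM6DSM datasheet (CTRL10_C at 0x19, STEP_COUNTER_L/H at 0x4B/0x4C; verify against your silicon revision) and hypothetical `read_reg`/`write_reg` callables wrapping whatever I2C/SPI HAL your MCU provides:

```python
# Register map per the LSM6DSM datasheet (verify for your revision).
CTRL10_C       = 0x19  # embedded-function enable register
FUNC_EN        = 0x04  # bit 2: enable embedded functions
PEDO_EN        = 0x10  # bit 4: enable pedometer
STEP_COUNTER_L = 0x4B  # step count, low byte
STEP_COUNTER_H = 0x4C  # step count, high byte

class Pedometer:
    """Driver sketch; read_reg(addr) and write_reg(addr, val) wrap the bus."""

    def __init__(self, read_reg, write_reg):
        self.read_reg = read_reg
        self.write_reg = write_reg

    def enable(self):
        # Read-modify-write so other embedded-function bits are preserved.
        val = self.read_reg(CTRL10_C)
        self.write_reg(CTRL10_C, val | FUNC_EN | PEDO_EN)

    def steps(self):
        # 16-bit counter, little-endian across the two registers.
        return self.read_reg(STEP_COUNTER_L) | (self.read_reg(STEP_COUNTER_H) << 8)
```

The accelerometer must also be running (ODR configured in CTRL1_XL) for the pedometer to count; that setup is omitted here.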
I noticed that ST supports raw data classification, as demonstrated in this video: https://www.youtube.com/watch?v=4btLVDR68TQ
To achieve this, I am considering using the following tools:
- Unico GUI (https://www.st.com/en/development-tools/unico-gui.html)
- AlgoBuilder (https://www.st.com/en/development-tools/algobuilder.html)
However, since our wearable integrates the LSM6DSMTR with a separate MCU and has no USB connection, should I store the raw data on the device and transfer it via Bluetooth? Are there any alternative methods to collect labeled data?
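For reference, one workable offline approach is to log raw IMU windows together with a posture label on the device (or on the phone receiving them over BLE), then convert the file to whatever format the desktop tooling expects. A minimal sketch of the logging side; the CSV column layout here is an assumption, not Unico's native format:

```python
import csv

def log_labeled_window(samples, label, path):
    """Append a window of raw IMU samples with a posture label to a CSV file.

    `samples` is an iterable of (ax, ay, az, gx, gy, gz) tuples as read
    from the LSM6DSMTR. The resulting file can be transferred over BLE
    (or pulled from flash) and post-processed on a PC for labeling tools.
    """
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for ax, ay, az, gx, gy, gz in samples:
            writer.writerow([ax, ay, az, gx, gy, gz, label])
```

Keeping the label per-window (rather than per-sample) and time-stamping each window would make later re-labeling easier; both are straightforward extensions of this sketch.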
