2024-04-18 07:43 AM
Could someone provide insights on how to identify the following user actions using a wearable integrated with the LSM6DSMTR:
Additionally, I'm seeking advice on whether it is preferable to process these actions at the edge or to transmit the raw data to the cloud for processing.
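For context, here are the rough numbers I worked out for the raw-streaming option (the 52 Hz ODR and 16-bit samples are just my assumed configuration, not a fixed requirement):

```c
#include <stdio.h>

/* Back-of-envelope estimate of the raw IMU stream rate.
 * Assumed configuration (hypothetical): 52 Hz ODR, 3 accel + 3 gyro
 * axes, 16-bit raw samples. */
int main(void)
{
    const double odr_hz = 52.0;
    const int axes = 6;
    const int bytes_per_sample = 2;

    double bytes_per_sec = odr_hz * axes * bytes_per_sample;
    double mib_per_day = bytes_per_sec * 86400.0 / (1024.0 * 1024.0);

    printf("Raw stream: %.0f B/s, ~%.1f MiB/day\n", bytes_per_sec, mib_per_day);
    return 0;
}
```

At those settings that is roughly 624 B/s, or about 51 MiB per device per day, which seems like a lot to push over a wearable's radio. Thanks!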
2024-04-19 08:20 AM
2024-04-21 06:56 AM
Thank you for your response. After reviewing AN4987, I understand that we can extract step counts and activity/inactivity information. However, I would like to go further and classify the raw accelerometer and gyroscope data into specific postures such as lying down, sitting, and standing.
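For reference, this is roughly how I am enabling and reading the embedded step counter today, based on my reading of the LSM6DSM datasheet and AN4987 (the i2c_write/i2c_read helpers are placeholders for my platform's bus functions, and the register bits should be double-checked against the datasheet):

```c
#include <stdint.h>

/* LSM6DSM register addresses, per my reading of the datasheet */
#define LSM6DSM_CTRL1_XL        0x10
#define LSM6DSM_CTRL10_C        0x19
#define LSM6DSM_STEP_COUNTER_L  0x4B  /* 0x4C holds the high byte */

/* Platform-specific I2C helpers (placeholders) */
extern void i2c_write(uint8_t reg, uint8_t val);
extern void i2c_read(uint8_t reg, uint8_t *buf, uint8_t len);

void pedometer_init(void)
{
    /* Accelerometer at 26 Hz, +/-2 g: the minimum ODR for the pedometer */
    i2c_write(LSM6DSM_CTRL1_XL, 0x20);
    /* FUNC_EN (bit 2) enables the embedded functions, PEDO_EN (bit 4) the step counter */
    i2c_write(LSM6DSM_CTRL10_C, (1u << 2) | (1u << 4));
}

uint16_t pedometer_read_steps(void)
{
    uint8_t buf[2];
    i2c_read(LSM6DSM_STEP_COUNTER_L, buf, 2);  /* low byte first */
    return (uint16_t)buf[0] | ((uint16_t)buf[1] << 8);
}
```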
I noticed that ST supports raw data classification, as demonstrated in this video: https://www.youtube.com/watch?v=4btLVDR68TQ
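While I evaluate that workflow, I put together a simple tilt-threshold baseline on our MCU. It only separates lying from upright (sitting vs. standing is not distinguishable from the gravity direction alone), and the thresholds and axis orientation below are my own guesses:

```c
#include <math.h>

typedef enum { POSTURE_LYING, POSTURE_UPRIGHT, POSTURE_MOTION } posture_t;

/* Classify posture from one low-pass-filtered accelerometer sample in g.
 * Assumes the device Z axis points along the torso when the wearer is
 * upright; thresholds are illustrative, not tuned values. */
posture_t classify_posture(float ax, float ay, float az)
{
    float norm = sqrtf(ax * ax + ay * ay + az * az);

    /* Magnitude far from 1 g: wearer is moving, gravity estimate unreliable */
    if (fabsf(norm - 1.0f) > 0.2f)
        return POSTURE_MOTION;

    /* Angle between the device Z axis and gravity, in degrees */
    float tilt_deg = acosf(az / norm) * (180.0f / 3.14159265f);

    return (tilt_deg > 60.0f) ? POSTURE_LYING : POSTURE_UPRIGHT;
}
```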
To achieve this, I am considering using the following tools:
However, since we have integrated the LSM6DSMTR with a separate MCU in a wearable device that has no USB connection, I am wondering whether I should store the raw data and transfer it via Bluetooth. Are there any alternative methods to collect labeled data?
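In case it helps frame the question, this is the packet layout I had in mind for buffering labeled samples before a Bluetooth transfer (the layout and label codes are my own convention, not an ST format):

```c
#include <stdint.h>
#include <string.h>

/* One labeled IMU sample; layout and label codes are my own convention. */
typedef struct __attribute__((packed)) {
    uint32_t timestamp_ms;  /* device uptime */
    uint8_t  label;         /* 0 = lying, 1 = sitting, 2 = standing, ... */
    int16_t  acc[3];        /* raw accelerometer counts */
    int16_t  gyr[3];        /* raw gyroscope counts */
} labeled_sample_t;         /* 17 bytes packed */

/* Serialize one sample into a byte buffer; returns bytes written.
 * Assumes a little-endian MCU; the caller chunks the buffer into
 * BLE notifications sized to the negotiated MTU. */
uint16_t sample_serialize(const labeled_sample_t *s, uint8_t *out)
{
    memcpy(out, s, sizeof(*s));
    return (uint16_t)sizeof(*s);
}
```

The labels themselves would come from a button press or a companion app marking the current activity while the samples are recorded.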