2024-05-16 01:19 PM
Hello everybody,
We are currently evaluating the MLC of the LSM6DSOX and have a couple of general questions regarding feature calculation in MEMS Studio.
I've included excerpts from the application note for quick reference.
Zero-Crossing:
"The feature “Zero-crossing” computes the number of times the selected input crosses a certain threshold. This internal threshold is defined as the sum between the average value computed in the previous window (feature “Mean”) and hysteresis defined by the user."
Could you please explain how MEMS Studio calculates this feature for the first window, which doesn't have a preceding window for reference?
Peak Detector:
"The feature “Peak detector” counts the number of peaks (positive and negative) of the selected input in the defined time window.
A threshold has to be defined by the user for this feature, and a buffer of three values is considered for the evaluation.
If the second value of the three values buffer is higher (or lower) than the other two values of a selected threshold, the number of peaks is increased.
The buffer of three values considered for the computation of this feature is a moving buffer inside the time window."
Are these three values direct neighbors, i.e. consecutive samples in the data stream?
Lastly, when a filter is selected, MEMS Studio appears to cut a certain number of values at the beginning of a class dataset.
Is this to account for the filter settling time?
It would be very helpful if you could explain how to derive the number of data points cut at the beginning based on the filter coefficients and ODRs.
Thank you very much for your assistance,
Luis
2024-05-22 12:09 AM
Hi @luik ,
Here are the answers:
Zero-Crossing: The average value is initialized to 0, so for the entire first window the internal threshold is simply the user-defined hysteresis (mean = 0). Once a full window of data has been collected, the average is updated and used for the next window.
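For reference, here is a minimal Python sketch of that behavior; the function and variable names are illustrative, and the exact crossing condition is an assumption rather than the actual MEMS Studio internals:

```python
# Sketch: zero-crossing count per window, with the threshold defined as the
# previous window's mean plus the user hysteresis. The mean is initialized to
# 0, so the first window effectively uses only the hysteresis as threshold.
def zero_crossing_counts(samples, window_size, hysteresis):
    prev_mean = 0.0  # no previous window yet -> mean starts at 0
    counts = []
    for start in range(0, len(samples) - window_size + 1, window_size):
        window = samples[start:start + window_size]
        threshold = prev_mean + hysteresis
        crossings = 0
        for a, b in zip(window, window[1:]):
            # a crossing: two consecutive samples straddle the threshold
            if (a - threshold) * (b - threshold) < 0:
                crossings += 1
        counts.append(crossings)
        prev_mean = sum(window) / len(window)  # used by the next window
    return counts
```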
Peak Detector: Correct, the 3 values are direct neighbors, acquired in sequence (the samples at times t-2, t-1, and t).
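A minimal sketch of the peak detector under that interpretation (assuming a peak is counted when the middle of three consecutive samples differs from both neighbors by more than the user threshold; the exact comparison rule is an assumption):

```python
# Sketch: count positive and negative peaks using a sliding buffer of three
# directly neighboring samples inside the time window.
def peak_count(window, threshold):
    peaks = 0
    for prev, mid, nxt in zip(window, window[1:], window[2:]):
        positive_peak = (mid - prev) > threshold and (mid - nxt) > threshold
        negative_peak = (prev - mid) > threshold and (nxt - mid) > threshold
        if positive_peak or negative_peak:
            peaks += 1
    return peaks
```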
Is this to account for the filter settling time? Correct. If at least one filter is selected (regardless of whether the features are computed on the filtered data), 2 rows of features are discarded to account for the filter settling time and to stay aligned with the hardware behavior.
N.B. The data itself is not dropped: filters and features are still computed on all the data; simply, the first 2 rows of computed features are not written to the ".arff" file.
It would be very helpful if you could explain how to derive the number of data points cut at the beginning based on the filter coefficients and ODRs.
Unfortunately, the number of feature rows dropped when filters are used is fixed at 2, and therefore does not depend on the filter coefficients or the ODR value.
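In other words, the export step can be pictured like this (a simplified sketch, not the actual MEMS Studio code; the file handling is illustrative):

```python
# Sketch: feature rows are computed for every window, but when a filter is
# enabled the first 2 rows are skipped at export time (fixed, ODR-independent).
SETTLING_ROWS_TO_SKIP = 2

def export_features(feature_rows, arff_path, filter_enabled):
    rows_to_write = feature_rows[SETTLING_ROWS_TO_SKIP:] if filter_enabled else feature_rows
    with open(arff_path, "a") as f:
        for row in rows_to_write:
            f.write(",".join(str(v) for v in row) + "\n")
```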