2021-10-27 04:47 AM
There is no universal rule for the number of data logs or their length. The more accurately the acquired logs describe the scenario the user wants to identify, the less data is required. Basically, if your dataset is "good", you don't need much data to train an MLC.
A dataset can be considered "good" if:
- each log describes only one scenario
- no log contains external noise that is uncorrelated with the scenario itself
- every log is acquired using the same sensor settings.
For example, if these three rules are met, a dataset used to train an MLC to distinguish between 3 different scenarios can consist of just three logs of 30 seconds each.
However, long and repeated acquisitions can be used to train the MLC when some noise overlaps the signal of interest and cannot be removed. Additional logs can also improve the accuracy of the MLC. A quick way to check the rules above is sketched in the snippet below.
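As an illustration only (not an official ST tool), here is a minimal Python sketch that sanity-checks a folder of logs against the three rules. It assumes each log is a CSV file whose filename starts with the scenario label (e.g. "walking_log1.csv") and whose columns include hypothetical "odr_hz" and "full_scale" fields repeating the sensor settings; adapt the keys and the minimum length to your own log format.

```python
import csv
from pathlib import Path

# Assumption: 30 s of data at 26 Hz ODR is the minimum acceptable log length.
MIN_SAMPLES = 26 * 30


def read_log(path: Path):
    """Return (sensor settings, number of samples) for one hypothetical CSV log."""
    with path.open(newline="") as f:
        rows = list(csv.DictReader(f))
    # Settings are assumed to be repeated on every row; take them from the first.
    settings = (rows[0]["odr_hz"], rows[0]["full_scale"]) if rows else None
    return settings, len(rows)


def check_dataset(log_dir: str):
    logs = sorted(Path(log_dir).glob("*.csv"))
    all_settings = set()
    for log in logs:
        # Rule 1: one scenario per log, encoded here in the filename prefix.
        scenario = log.stem.split("_")[0]
        settings, n_samples = read_log(log)
        all_settings.add(settings)
        if n_samples < MIN_SAMPLES:
            print(f"{log.name}: only {n_samples} samples for '{scenario}' (may be too short)")
    # Rule 3: every log acquired with the same sensor settings.
    if len(all_settings) > 1:
        print(f"Warning: logs use different sensor settings: {all_settings}")
    else:
        print(f"{len(logs)} logs checked, consistent settings: {all_settings}")


check_dataset("dataset_logs")
```

Rule 2 (absence of uncorrelated external noise) cannot be verified by a script like this; it depends on how the acquisition itself was performed, so it is best checked by visually inspecting the signals before labeling.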