2025-03-11 12:21 PM - last edited on 2025-03-12 9:04 AM by Andrew Neil
Hello,
This is in regard to the question I posted at the webinar on AI-powered gesture recognition with Time-of-Flight sensors held on 3/11.
Can the orientation of the VL53L8CX ToF sensor be detected in case it is moved and possibly gets inverted after training is done?
I'm interested in correctly detecting gestures like LIKE/DISLIKE and SWIPE RIGHT/LEFT even if the sensor has been turned upside down after the overall device is trained. The solution should work correctly for left- and right-handed persons. It's OK to add MEMS sensors to the overall project, but I prefer trying to solve problems with software first rather than just throwing hardware at them, as long as the extra software doesn't bog down the MCU.
Thanks in advance and best regards,
Bruce
2025-03-11 1:33 PM
First, thanks for attending the seminar. I thought it was fun, and I was glad we could do it.
In order to find a hand, the sensor is ranging and running. And generally, there is a person behind the hand.
So, your image is going to look maybe something like this:
The posture code isolates the zones that are close, like the one at 326 mm. (The camera is not aligned very well, but the picture is good enough to show the idea.) Now have a look at the zones in the darker red.
Group all the ones that are in the 750 to 850 mm range. I'd say that shape looks a lot like a person, and if your sensor were upside down, that shape would also be upside down.
I'd use the person's shape as your detection mechanism.
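If you wanted to automate that check, a rough sketch could look like the following. The 750 to 850 mm band and the simple top-half versus bottom-half comparison are only illustrative, not values from the posture library, and which half counts as "down" depends on how your sensor is mounted, so you would calibrate that once for your device:

    #include <stdbool.h>
    #include <stdint.h>

    #define GRID 8

    // Decide whether the frame looks upside down by locating the "person"
    // behind the hand: count the zones whose distance falls in the body
    // band (here 750..850 mm) and compare how many land in the top rows
    // versus the bottom rows of the 8x8 grid.
    static bool frame_looks_inverted(const int16_t dist_mm[GRID * GRID])
    {
        int top = 0, bottom = 0;

        for (int row = 0; row < GRID; row++) {
            for (int col = 0; col < GRID; col++) {
                int16_t d = dist_mm[row * GRID + col];
                if (d >= 750 && d <= 850) {
                    if (row < GRID / 2) top++;
                    else                bottom++;
                }
            }
        }

        // Assuming the torso normally fills the lower rows of the image:
        // if most of the body mass shows up in the top rows instead,
        // the frame is probably upside down.
        return top > bottom;
    }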
Thanks again,
- john
2025-03-12 7:25 AM
Thanks for your question.
You can use the sensor in any orientation; the only thing you must do is "rotate" the sensor data in terms of zone index.
By default, zone 0 is in the top-right corner and zone 63 is in the bottom-left. So if your sensor is inverted in your final application, you can implement something like:
#include <stdio.h>

int main(void) {
    int originalArray[64];
    int reversedArray[64];

    // Initialize the original array with the zone indices 0 to 63
    for (int i = 0; i < 64; i++) {
        originalArray[i] = i;
    }

    // Reverse the elements: zone i maps to zone 63 - i
    for (int i = 0; i < 64; i++) {
        reversedArray[i] = originalArray[63 - i];
    }

    // Print the reversed index map
    for (int i = 0; i < 64; i++) {
        printf("%d ", reversedArray[i]);
    }
    printf("\n");

    return 0;
}
Then you feed the algorithm with this new sensor data map.
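As a minimal sketch of applying that remap to a live frame (assuming the distances have already been copied out of the driver into a plain 64-element array; the function and array names are illustrative, not part of the ULD API):

    #include <stdint.h>

    #define NB_ZONES 64

    // Remap one 8x8 frame of distances for a sensor mounted upside down.
    // Reversing the zone index (i -> 63 - i) is equivalent to rotating the
    // 8x8 grid by 180 degrees, which undoes the physical inversion.
    static void remap_inverted_frame(const int16_t in_mm[NB_ZONES],
                                     int16_t out_mm[NB_ZONES])
    {
        for (int i = 0; i < NB_ZONES; i++) {
            out_mm[i] = in_mm[(NB_ZONES - 1) - i];
        }
    }

The remapped frame is then passed to the gesture/posture algorithm exactly as if the sensor were in its original orientation.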
Yann
2025-03-12 7:27 AM
Regarding your question about left- and right-handed persons: for the Handposture AI solution, we are augmenting our dataset using horizontal mirroring (this option can be disabled in the YAML config file), so all postures are duplicated (a left-hand posture becomes a right-hand one and vice versa).
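For reference, the mirroring itself is just a left/right flip of each 8x8 frame. A minimal sketch (assuming a row-major 64-zone frame; this is not the actual augmentation code from the YAML-driven pipeline) would be:

    #include <stdint.h>

    #define GRID 8

    // Horizontally mirror one 8x8 frame: within each row, swap column c
    // with column 7 - c. A left-hand posture becomes the equivalent
    // right-hand posture, which is how the dataset is doubled.
    static void mirror_frame_horizontal(const int16_t in[GRID * GRID],
                                        int16_t out[GRID * GRID])
    {
        for (int row = 0; row < GRID; row++) {
            for (int col = 0; col < GRID; col++) {
                out[row * GRID + col] = in[row * GRID + (GRID - 1 - col)];
            }
        }
    }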
Yann