2021-09-18 8:20 PM
Hi, I was trying to create a small 8x8 / 4x4 point cloud using a VL53L5CX sensor. I can get the depth data for each zone, but is there a way to get x, y, z from this information? For example, if I knew the vertical and horizontal angle for every zone on the VL53L5CX, I could calculate the x, y, z location of each point from those angles and the depth data.
Any help would be appreciated!
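For what it's worth, here is a minimal sketch of the idea in the question. It assumes the sensor's roughly 45° x 45° field of view is split *evenly* into an 8x8 grid of zones (as the rest of the thread explains, the real optics are not perfectly even, so ST's measured per-zone angle tables are more accurate). The function names and the pitch/yaw convention are my own, not part of any ST driver.

```python
import math

FOV_DEG = 45.0   # assumed square field of view
GRID = 8         # 8x8 ranging mode

def zone_angles(row, col, grid=GRID, fov_deg=FOV_DEG):
    """Return (pitch, yaw) in radians for the centre of a zone,
    assuming evenly spaced zones across the field of view."""
    step = math.radians(fov_deg) / grid
    # offset of the zone centre from the optical axis
    pitch = (row - (grid - 1) / 2.0) * step
    yaw = (col - (grid - 1) / 2.0) * step
    return pitch, yaw

def zone_to_xyz(row, col, depth_mm):
    """Convert a zone's ranged distance into an (x, y, z) point in mm,
    treating the depth as the distance along the zone's ray."""
    pitch, yaw = zone_angles(row, col)
    x = depth_mm * math.sin(yaw) * math.cos(pitch)
    y = depth_mm * math.sin(pitch)
    z = depth_mm * math.cos(yaw) * math.cos(pitch)
    return x, y, z
```

Looping `zone_to_xyz` over all 64 zones with the depth array from the ULD driver gives a point cloud; note that each point's distance from the origin equals the ranged depth, since the trig factors form a unit direction vector.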
2024-09-25 7:56 AM
Ah no worries I got it working in the end.
Cheers
2026-04-27 3:58 AM
Hi @John E KVAM ,
when I project the points from this code onto a plane normal to the sensor, I don't get a perfect square; it's kind of squarish with the corners pulled out. Is this correct? If so, why aren't the optics evenly spaced in a square pattern?
There also appears to be a copy error in the yaw matrix: row 0, column 6 duplicates the value from the row above (both are 203.2). I presume it should actually be 215.4 to make the matrix symmetrical, and that this isn't a quirk of how the lenses are manufactured?
Thanks,
James
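As a side note, part of the "corners pulled out" effect is expected even with a mathematically even angular grid: intersecting evenly spaced rays with a flat plane stretches the footprint by tangent factors toward the edges. Here is a hedged sketch of that projection; the pitch/yaw parametrisation (pitch about x, then yaw about y) and the function name are my own assumptions, not ST's published tables.

```python
import math

GRID = 8
STEP = math.radians(45.0) / GRID  # assumed even angular spacing

def project_to_plane(row, col, plane_z=1000.0):
    """Intersect a zone's ray with a plane normal to the sensor
    at distance plane_z (mm), returning the (x, y) hit point."""
    pitch = (row - (GRID - 1) / 2.0) * STEP
    yaw = (col - (GRID - 1) / 2.0) * STEP
    x = plane_z * math.tan(yaw)
    y = plane_z * math.tan(pitch) / math.cos(yaw)
    return x, y

# A corner zone lands farther from the axis than the zones mid-row,
# so the footprint is "squarish with the corners pulled out".
corner_x, corner_y = project_to_plane(0, 0)
edge_x, edge_y = project_to_plane(0, (GRID - 1) // 2)
```

On top of this geometric stretching, the real meta-lens (as described below) adds its own unevenness, which is why ST's measured angle tables differ from an ideal grid.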
2026-04-29 7:15 AM
I was wondering if someone would question those numbers. Well done.
So here is what is going on...
A perfect lens is almost flat, and a long way from the sensor. But this chip is only 1.6mm tall.
So compromises are the name of the game. The 'lens' is a meta-lens (think of a Fresnel lens, only MUCH smaller) jammed up against the SPAD sensor.
The guy who came up with those numbers ran a huge number of experiments, and then submitted his findings to the chip designer who verified the math.
And yes, the copy error was noted before. I'm very sorry about that.
It was my fault. But now that I'm retired, I can't go back and edit the prior posts.
I just hope people who are interested will read the entire thread.
- john
2026-05-02 5:53 AM
Thanks for the explanation @John_Kvam, particularly for coming out of retirement to give it!