2024-01-02 12:35 PM - edited 2024-01-02 12:46 PM
Hi @John E KVAM and @Anne BIGOT
I'm attempting to mimic the ideal operation of the VL53L8CX in simulation. I've traced 64 rays from the sensor within the 45x45 degree FOV as shown. Consider the top row (underlined in black): the rays are drawn at a vertical angle of 19.6875 degrees and horizontal angles of [-19.6875, -14.0625, -8.4375, -2.8125, 2.8125, 8.4375, 14.0625, 19.6875] degrees, with the centre positioned at (x, y, z) = (0, 0, 0). The intersection points with the plane drop down slightly, producing the curve shown by the black line, because the plane is closer to the sensor at the smaller horizontal angles. This ensures each ray is centred in its ~5x5 degree pixel. Does this accurately depict the sensor operation? Please find the picture attached.
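For reference, here is a minimal sketch of the ray fan I described, in Python rather than my MATLAB code. The tan-tan zone model and the wall distance are my own assumptions, not anything documented for the sensor:

```python
import numpy as np

# 45 deg total FOV split into 8 zones of 5.625 deg, ray through each zone centre.
zone_angles = np.arange(-19.6875, 19.6876, 5.625)  # degrees, 8 values

wall_z = 1.0  # flat wall 1 m away, perpendicular to the boresight (assumption)

for v in np.deg2rad(zone_angles):        # vertical (elevation) angle
    for h in np.deg2rad(zone_angles):    # horizontal (azimuth) angle
        # One simple ray parameterisation (tan-tan model; an assumption,
        # not the sensor's documented optics):
        direction = np.array([np.tan(h), np.tan(v), 1.0])
        direction /= np.linalg.norm(direction)
        t = wall_z / direction[2]        # scale so the ray reaches z = wall_z
        hit = t * direction              # intersection point on the wall
        radial = np.linalg.norm(hit)     # what a time-of-flight sensor measures
        # radial exceeds wall_z for off-axis zones, while the perpendicular
        # (z) component is wall_z for every zone; that difference is what
        # produces the curve in the attached picture.
```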
Looking forward to hearing your reply and any suggestions!
Kind regards,
David
2024-01-02 02:50 PM
Almost. We do the radial-to-perpendicular translation for you.
So if you pointed the sensor at a flat wall, exactly perpendicular, you should get the same number for all zones.
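(Editor's note: a minimal sketch of the radial-to-perpendicular idea described here, in Python. The combined-angle formula assumes a tan-tan zone model; the driver's actual per-zone coefficients are not public in this thread.)

```python
import numpy as np

def radial_to_perpendicular(radial_mm, h_deg, v_deg):
    """Project a radial range onto the boresight axis (illustrative only)."""
    h, v = np.deg2rad(h_deg), np.deg2rad(v_deg)
    # Cosine of the combined off-axis angle for a ray at (h, v),
    # tan-tan model (assumption):
    cos_theta = 1.0 / np.sqrt(np.tan(h)**2 + np.tan(v)**2 + 1.0)
    return radial_mm * cos_theta

# A corner zone staring at a perpendicular wall 1000 mm away measures a
# longer radial distance; the correction brings every zone back to 1000 mm.
cos_corner = radial_to_perpendicular(1.0, 19.6875, 19.6875)
radial = 1000.0 / cos_corner                                   # > 1000 mm
corrected = radial_to_perpendicular(radial, 19.6875, 19.6875)  # 1000 mm
```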
Please note, however, that there was a mistake in the early drivers where the numbers were not perfect.
I'd check the version of code you have, and if there is a more recent version, take it.
The only change was to these numbers, so the upgrade should be quick.
- john
2024-01-02 03:12 PM
Hi @John E KVAM,
I know about the radial-to-perpendicular translation. I'm trying to simulate (mimic) the operation of the sensor in MATLAB; this is not actual data from the sensor. Would this be a fair representation of what the rays would potentially look like before the radial-to-perpendicular translation occurs? Does this make sense? I can elaborate more if needed.
Kind regards,
David
2024-01-03 07:55 AM
David -
The trouble is that there are no 'rays' as you call them. The flash of light goes out in a flood. All the light that comes back hits a number of Single Photon Avalanche Diodes (SPADs) in a square array. Each zone is about 5x5 degrees. If an object has an edge in one of the zones, the distance in that zone will be the average of the near surface and the far surface. (This assumes the two distances are relatively close. If they are farther apart than 60cm, you will see both distances instead of one merged distance.) And by 'average' I mean average by number of photons. Near objects return more photons than objects farther away.
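(Editor's note: a hedged sketch of the photon-weighted averaging described here, assuming the returned photon count falls off roughly as 1/d^2. That inverse-square weighting is a common free-space approximation, not a documented model of the sensor.)

```python
def merged_distance(d_near_mm, d_far_mm):
    """Photon-weighted mean of two surfaces sharing one zone (illustrative)."""
    w_near = 1.0 / d_near_mm**2   # nearer surface returns more photons
    w_far = 1.0 / d_far_mm**2
    return (w_near * d_near_mm + w_far * d_far_mm) / (w_near + w_far)

# Two surfaces closer together than ~600 mm merge into one reading,
# biased toward the nearer (brighter) surface:
d = merged_distance(400.0, 700.0)  # between 400 and 700, nearer to 400
```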
But in simulation, if you point the sensor at a flat wall, all the distances should be the same. The 4 corners might not detect a distance where the others do. Corners cannot see as far.
Hope that helps.
And a note on accuracy... you have angles at -19.6875 etc. Nothing about this chip is going to be accurate to that precision. At best maybe a quarter of a degree.
Even the 45x45 is suspect. At close distances it's more like 47x47 and at the far distances 43x43. That 45x45 is an effective average.