
# VL53L8CX Simulation

Associate II

I'm attempting to mimic the ideal operation of the VL53L8CX in simulation. I've traced out 64 rays from the sensor within the 45x45 degree FOV as shown. Consider the top row (underlined in black): the rays are drawn at a vertical angle of 19.6875° and horizontal angles of [-19.6875, -14.0625, -8.4375, -2.8125, 2.8125, 8.4375, 14.0625, 19.6875]°, with the sensor centred at (x, y, z) = (0, 0, 0). The intersection points with the plane drop down slightly, producing the curve shown by the black line, because the plane is closer to the sensor at the smaller horizontal angles. This ensures each ray is centred in its ~5x5° zone (pixel). Does this accurately depict the sensor's operation? Please find the picture attached.
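For reference, here is a minimal sketch of the geometry I describe above (in Python rather than MATLAB; the tan-based ray parameterisation and the wall at z = 1 m are my own assumptions for illustration):

```python
import numpy as np

# Zone centres of the 8x8 grid: the 45-degree FOV split into eight
# 5.625-degree zones gives centres at +/-2.8125 ... +/-19.6875 degrees.
centres_deg = (np.arange(8) - 3.5) * 45.0 / 8.0

def ray_direction(h_deg, v_deg):
    # Unit direction for a ray at the given horizontal/vertical angles
    # (tan-based parameterisation; other conventions are possible).
    d = np.array([np.tan(np.radians(h_deg)), np.tan(np.radians(v_deg)), 1.0])
    return d / np.linalg.norm(d)

# Intersect the top-row rays (vertical angle 19.6875 deg) with a flat
# wall at z = 1 m, perpendicular to the optical axis.
wall_z = 1.0
radials = []
for h in centres_deg:
    d = ray_direction(h, centres_deg[-1])
    p = (wall_z / d[2]) * d            # intersection point (x, y, z)
    radials.append(np.linalg.norm(p))  # radial (line-of-sight) range

# The outer zones of the row see a longer radial range than the inner
# ones, even though the perpendicular distance is 1 m everywhere.
```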

Kind regards,

David

1 ACCEPTED SOLUTION

ST Employee

David -

The trouble is that there are no 'rays' as you call them. The flash of light goes out in a flood. All the light that comes back hits a number of single-photon avalanche diodes (SPADs) in a square array. Each zone is about 5x5 degrees. If an object has an edge in one of the zones, the distance in that zone will be the average of the near surface and the far surface. (This assumes the two distances are relatively close. If they are farther apart than 60cm, you will see both distances instead of one merged distance.) And by 'average' I mean an average weighted by the number of photons. Near objects return more photons than objects farther away.
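As a rough sketch of that photon-weighted averaging (the inverse-square return model and the helper name here are my assumptions for illustration, not the device's documented behaviour):

```python
# Assumed model: returned photon count falls off roughly as 1/d^2, so a
# zone straddling an edge reports a photon-weighted average biased toward
# the nearer (brighter) surface. merged_distance is a hypothetical helper.
def merged_distance(d_near, d_far, frac_near=0.5):
    # frac_near: fraction of the zone's light falling on the near surface
    w_near = frac_near / d_near ** 2
    w_far = (1.0 - frac_near) / d_far ** 2
    return (w_near * d_near + w_far * d_far) / (w_near + w_far)

# A zone half on a surface at 0.5 m and half on one at 0.9 m (closer
# together than 60 cm, so the two targets merge) reports a value nearer
# to 0.5 m than the simple midpoint of 0.7 m.
d = merged_distance(0.5, 0.9)
```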
But in simulation, if you point the sensor at a flat wall, all the distances should be the same. The 4 corners might not detect a distance where the others do. Corners cannot see as far.
Hope that helps.
And a note on accuracy... you have angles at -19.6875° and so on. Nothing about this chip is going to be accurate to that precision. At best maybe a quarter of a degree.

Even the 45x45 is suspect. At close distances it's more like 47x47, and at far distances 43x43. That 45x45 is an effective average.

Our community relies on fruitful exchanges and good quality content. You can thank and reward helpful and positive contributions by marking them as 'Accept as Solution'. When marking a solution, make sure it answers your original question or issue that you raised.

ST Employees that act as moderators have the right to accept the solution, judging by their expertise. This helps other community members identify useful discussions and refrain from raising the same question. If you notice any false behavior or abuse of the action, do not hesitate to 'Report Inappropriate Content'
3 REPLIES
ST Employee

Almost. We do the radial-to-perpendicular translation for you.

So if you pointed the sensor at a flat wall, exactly perpendicular, you should get the same number for all zones.
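As a rough illustration of why that works (assuming the translation is a simple cosine-style projection onto the optical axis; the driver's actual per-zone correction factors are not published here and may differ):

```python
import math

# Hedged sketch: model the radial-to-perpendicular translation as
# projecting the radial (line-of-sight) range onto the optical (z) axis.
def perpendicular_distance(radial_mm, h_angle_deg, v_angle_deg):
    ax = math.radians(h_angle_deg)
    ay = math.radians(v_angle_deg)
    # z-component of a unit vector along the (tan-parameterised) ray
    dz = 1.0 / math.sqrt(math.tan(ax) ** 2 + math.tan(ay) ** 2 + 1.0)
    return radial_mm * dz

# A corner zone looking at a flat wall 1000 mm away measures a longer
# radial range than the centre zone...
corner_angle = 19.6875
corner_radial = 1000.0 * math.sqrt(
    2.0 * math.tan(math.radians(corner_angle)) ** 2 + 1.0)

# ...but after the translation both report the same perpendicular distance.
corner_perp = perpendicular_distance(corner_radial, corner_angle, corner_angle)
centre_perp = perpendicular_distance(1000.0, 0.0, 0.0)
```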

Please note, however, that there was a mistake in the early drivers where these numbers were not perfect.

I'd check the version of code you have, and if there is a more recent version, take it.

The only change was to these numbers, so the upgrade should be quick.

- john

Associate II

Hi @John E KVAM,

I know about the radial-to-perpendicular translation. I'm trying to simulate (mimic) the operation of the sensor in MATLAB; this is not actual data from the sensor. Would this be a fair representation of what the rays would potentially look like before the radial-to-perpendicular translation occurs? Does this make sense? I can elaborate more if needed.

Kind regards,

David
