
VL53L5CX Distances Don't Match Theory - What am I missing?

Inq
Associate II

I am on a robotics forum where we have been studying the VL53L5CX for robotic vision, mapping, and SLAM for the last two months. We are on page 12 of this thread: "<link provided if requested and I'm permitted>"

The problem we have run into concerns the distance readings we're getting out of it. It is explained in detail, with pictures, in the second half of this post "<link provided if requested and I'm permitted>", and earlier posts in that thread give detailed output from the sensor.

Since I am new to this forum and can't provide the links, I'll summarize the problem here:

  1. We have the sensor square-on to a wall 1000 mm away.
  2. The wall is house-painted white.
  3. The ambient lighting is very low (barely enough to read by).
  4. Since the FoV angle from corner to corner is ~64°, standard trigonometry says the corner distances should be farther than the central pixels.

Theory says that if the center is 1000 mm away, the distance to the center of each of the four corner pixels should be ~1112 mm. The sensor is only returning values close to 1018 mm; in other words, nowhere near theory. We have tried many different settings for integration time (5 to 100 ms) and sharpness (0 to 99%).
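The trig we're applying can be sketched in Python. Two assumptions are ours, not ST's spec: the 45° per-axis FoV is split into eight equal-angle zones, and the sensor returns the line-of-sight (radial) distance.

```python
import math

FOV_PER_AXIS_DEG = 45.0       # assumed horizontal/vertical FoV
ZONES = 8                     # 8x8 grid
WALL_MM = 1000.0              # perpendicular distance to the wall

# Angular offset of the center of an outermost (corner) zone from the
# optical axis, assuming equal-angle zone spacing.
pitch = FOV_PER_AXIS_DEG / ZONES            # 5.625 deg per zone
corner_deg = pitch * (ZONES - 1) / 2.0      # 19.6875 deg in both x and y

t = math.tan(math.radians(corner_deg))
# Radial (line-of-sight) distance to the wall point seen by a corner zone:
expected_corner_mm = WALL_MM * math.sqrt(1.0 + 2.0 * t * t)
print(round(expected_corner_mm))  # → 1121
```

With this zone-spacing model the corner figure comes out to ~1121 mm rather than the ~1112 mm quoted above (a slightly different zone-angle model); either way, the reported ~1018 mm is far from both.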

Because it is SO far off, I feel like I'm missing something fundamental here. Maybe someone here can enlighten me. No one on our forum has found a problem with our logic or expectations.

Thank you for your time.

VBR,

Inq


12 REPLIES
Anne BIGOT
ST Employee

Hello,

Your question has been raised internally. We will come back to you soon.

Best regards

Anne


Inq
Associate II

Since I can't post links here yet, would it help if I sent links documenting our data and procedures to your technical support directly? We're up to 14 pages of posts in our VL53L5CX thread. As is common on forums, a lot of it is just noise: people talking, having fun, speculating. I can provide specific posts showing the trigonometry we're using and the actual data we're gathering.

Thank you for your time.

VBR,

Inq

Anne BIGOT
ST Employee

Hello

Our supposition is that you are expecting a radial distance, while the sensor returns the perpendicular distance. According to the datasheet, you should have ±5% accuracy for all zones (i.e., 5 cm at 1 m), and a corner reading of 1018 mm against a 1000 mm perpendicular distance is only a 1.8% error. Your measurements are within the datasheet limits.

Hope this answers your question

Anne


John E KVAM
ST Employee

The answer is simple: we know about the effect and we correct for it inside the sensor. When you are perpendicular to a wall at 1 m, you should get 64 zones all reporting about 1 m.

(There is a term for the correction, but I just cannot think of it at the moment.)

  • john

Inq
Associate II

@John E KVAM, @Anne BIGOT

I was afraid I had missed something very critical in your documentation. It turns out I did not have the latest documents, so I downloaded the newest and read every word of them. Not once do they describe this VERY CRITICAL CONCEPT you are describing!

I think any logical person, having taken a single-ray laser ToF sensor and waved it around within the VL53L5CX's 45° FoV, would expect it to return the absolute distance from the sensor to the object, i.e., the hypotenuse. We would then use simple trig to do the 3D transformations, recovering the adjacent and opposite sides to place the points in space. Further transforms would place them in a global point cloud... aka SLAM.

[Attached image: 0693W00000SwLLDQA3.png (right-triangle diagram)]
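The per-zone transform we expected to need looks like this (hypothetical Python; it assumes equal-angle zone spacing and a sensor that returns the hypotenuse):

```python
import math

def zone_angles_deg(row, col, zones=8, fov_per_axis_deg=45.0):
    """Off-axis angles (x, y) of a zone's center, assuming equal-angle spacing."""
    pitch = fov_per_axis_deg / zones
    return ((col - (zones - 1) / 2.0) * pitch,
            (row - (zones - 1) / 2.0) * pitch)

def radial_to_xyz(d_mm, row, col):
    """Place a radial (hypotenuse) reading in sensor-frame 3D coordinates (mm)."""
    ax_deg, ay_deg = zone_angles_deg(row, col)
    tx = math.tan(math.radians(ax_deg))
    ty = math.tan(math.radians(ay_deg))
    z = d_mm / math.sqrt(1.0 + tx * tx + ty * ty)   # adjacent side
    return tx * z, ty * z, z                         # (x, y, z)

# A corner-zone radial reading of ~1121 mm lands ~1000 mm straight ahead.
x, y, z = radial_to_xyz(1120.7, 0, 0)
```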

Why would this 8x8 sensor report its data any differently, in essence returning the adjacent side in the drawing? It means I have to do extra calculations to get ITS ACTUAL DISTANCE to an object in case the robot is about to run into a wall! And more importantly, why is this not documented ANYWHERE?

I have to admit I do not understand how this device actually measures ToF. I have not been able to find any documentation or white paper describing the concept. I did read that the integration time also sets how long the laser is on during a reading. Obviously, there is no way to tag a single photon as it leaves when the laser first turns on and time its return, as opposed to a photon that left later or one from ambient light.

So... is this trigonometric sleight of hand you are describing some artifact of the measuring concept, or was it explicitly added in the sensor's firmware? And if so... WHY? I have yet to discern any benefit. If anything, it flies in the face of logic and makes the sensor unusable (without knowledge of the trig). We on the robotics forum are about to chuck this sensor as being less accurate than even a simple ultrasonic ToF sensor costing less than $1.

What use case actually benefits from this concept of returning a value that has no real-world significance?

Apologies for being a little snippy, but your documents go into excruciating detail about voltages, startup times, and everything on the electronics side, yet totally gloss over any time-of-flight theory and practical usage cases.

VBR,

Inq

Inq
Associate II

So... if all the returned distances to a flat wall are the same, what is that distance: the closest one, the furthest one, the average? What would be the purpose of a multi-zone distance sensor that returns the same number everywhere in this situation? In the drawing below, what would the sensor return if it is at the center of a circular room (purple area)? I would expect the same distance for all cells (±5%). But if they are all the same, how do I know whether the sensor is looking at a flat wall, a circular wall... or a spherical wall?

[Attached image: 0693W00000SwMhoQAF.png (circular-room diagram)]

No... you should not get the same value. In the real world, the distances are different as you move away from the centerline. The sensor's artificial correction makes the data invalid and unusable for us. Please see the following posts for details of this failure.

John E KVAM
ST Employee

You are overthinking this a little bit. We had a choice to return the actual distance or do the angle compensation. We chose to compensate.

We test by putting our sensors perpendicular to a wall at some distance N, and the returned distances should all be N. If you want, you can undo the trig we used and get back to the outside zones being longer.
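Undoing the trig might look like this (a sketch only; it assumes equal-angle zone spacing, which approximates but may not exactly match the real optics):

```python
import math

def uncompensate(grid_mm, zones=8, fov_per_axis_deg=45.0):
    """Convert wall-normalized (perpendicular) distances back to radial
    (line-of-sight) distances, one value per zone of the 8x8 grid."""
    pitch = fov_per_axis_deg / zones
    out = []
    for row in range(zones):
        row_out = []
        for col in range(zones):
            ax = math.radians((col - (zones - 1) / 2.0) * pitch)
            ay = math.radians((row - (zones - 1) / 2.0) * pitch)
            scale = math.sqrt(1.0 + math.tan(ax) ** 2 + math.tan(ay) ** 2)
            row_out.append(grid_mm[row][col] * scale)
        out.append(row_out)
    return out

# A flat wall at 1000 mm reads ~1000 mm in every zone after compensation;
# undoing it restores the longer radial distances toward the corners.
flat = [[1000.0] * 8 for _ in range(8)]
radial = uncompensate(flat)
print(round(radial[0][0]))  # corner zone → 1121
```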

We added this bit simply because the cell-phone camera people (who buy really large numbers of sensors) wanted it this way. They only get one focal length.

And yes. In the extremely unlikely case you had a room with just the right radius curve, you would get shorter distances for the outside zones.

The VL53L7, which we are bringing out next week, has an even wider FoV. But we pull the same trick of ensuring all the zones report the same distance when viewing a flat wall.

Time of flight measures distance by timing the light as it goes out and back. The only real trick is that you need amazingly sensitive detectors (we use an array of Single-Photon Avalanche Diodes behind a lens) and, of course, you have to be quick.
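A toy picture of that timing trick (a simulation, not ST's actual implementation, with made-up parameters): fire many short pulses, bin single-photon arrival times into a histogram, and the peak bin above the flat ambient background marks the round trip. No individual photon is tagged; the statistics over thousands of pulses do the work.

```python
import random

C_MM_PER_NS = 299.7           # speed of light, mm per ns
TARGET_MM = 1000.0            # simulated target distance
BIN_NS = 0.25                 # hypothetical timing-bin width
WINDOW_NS = 20.0              # listening window per pulse
PULSES = 20000                # laser pulses per measurement

random.seed(1)
t_true = 2.0 * TARGET_MM / C_MM_PER_NS      # round-trip time, ~6.67 ns
bins = [0] * int(WINDOW_NS / BIN_NS)

for _ in range(PULSES):
    # Ambient light: a photon at a uniformly random time in the window.
    if random.random() < 0.5:
        bins[int(random.uniform(0.0, WINDOW_NS) / BIN_NS)] += 1
    # Return photon: detected on ~10% of pulses, with a little timing jitter.
    if random.random() < 0.1:
        b = int((t_true + random.gauss(0.0, 0.1)) / BIN_NS)
        if 0 <= b < len(bins):
            bins[b] += 1

# The peak bin over the flat ambient background marks the round trip.
peak = bins.index(max(bins))
measured_mm = (peak + 0.5) * BIN_NS * C_MM_PER_NS / 2.0
print(round(measured_mm))  # within a bin or two of 1000 mm
```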


In order to give better visibility on the answered topics, please click on 'Accept as Solution' on the reply which solved your issue or answered your question. It helps the next guy.

Inq
Associate II

Thank you for your quick reply. I can fully appreciate catering to the millions instead of the minority. Just for my edification...

  1. So... it is consistent in using that algorithm. Your earlier sentence could be read to mean it applies only when looking at a perpendicular wall, as if that were a special case. The numbers were looking reasonable (not great, but reasonable) when looking at walls obliquely; the % error calculated for those cases will surely improve. We can work with this and back it out.
  2. Would it not be possible to make it configurable, like so many other things? It wouldn't have to be exposed at the API level like integration time and sharpness, but it would be nice to have in the header files, like VL53L5CX_NB_TARGET_PER_ZONE or VL53L5CX_USE_RAW_FORMAT. For all I know, the latter may be it; I'm not quite clear on the difference between the "firmware format" and the "user format."
  3. Is there a detailed white paper on the ToF theory? Some on our forum understand physics very well (even at the Masters or Doctorate level). The simplistic wording, "Time of flight measures distances by timing the light as it goes out and back," isn't really sufficient. You can't tag a photon; you can't distinguish a photon that left at the beginning of the integration sequence from one that left at the end, or from ambient light of the same wavelength. We've been throwing around all kinds of theories, from diffraction patterns to interference of outgoing versus incoming photons/waves. The quantum-mechanics quandary.

Again, apologies for the snippiness. We've spent well over two months on this and were getting to the point of chucking it all.

VBR,

Inq