2022-09-17 09:32 AM
I am on a robotics forum where we have been studying the VL53L5CX for robotic vision, mapping, and SLAM for the last two months. We are on page 12 of this thread: "<link provided if requested and I'm permitted>"
The problem we have run into concerns the distance readings we're getting out of it. It is explained in detail in the second half of this post "<link provided if requested and I'm permitted>" with pictures, and previous posts in that thread give detailed output from the sensor.
Since I am new to this forum and can't provide the links, I'll summarize the problem here:
Theory says that if the center is 1000 mm away, the distance to the center of each of the four corner pixels should be ~1112 mm. The sensor is only returning values close to 1018 mm. IOW, not anywhere near theory. We have tried many different settings for Integration Time (5 to 100 ms) and Sharpness Percent (0 to 99%).
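For concreteness, here is a small Python sketch of the trig we are using. The even split of the 45° FoV into an 8x8 grid of square zones is our assumption about the zone geometry, which may be why it lands near, but not exactly on, the 1112 mm figure:

```python
import math

def expected_radial_mm(center_mm=1000.0, fov_deg=45.0, zones=8):
    """Expected radial (hypotenuse) distance to the center of a corner
    zone for a flat wall center_mm away, assuming the FoV is split
    evenly into zones x zones square zones."""
    half_zone = fov_deg / zones / 2.0                  # half a zone's angular width
    corner = math.radians(fov_deg / 2.0 - half_zone)   # corner-zone center angle, per axis
    tx = math.tan(corner)
    # Point on the wall: (d*tan_x, d*tan_y, d); radial distance is its norm.
    return center_mm * math.sqrt(1.0 + 2.0 * tx * tx)

print(round(expected_radial_mm()))  # ≈ 1121 mm under these assumptions
```

Either way, the theoretical corner reading is well over 1100 mm, nowhere near the ~1018 mm we measure.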
Because it is SO wrong, I feel like I'm missing something fundamental here. Maybe someone here can enlighten me. No one on our forum has seen a problem with our logic and expectations.
Thank you for your time.
VBR,
Inq
2022-09-22 01:24 AM
Hello,
Your question has been raised internally. We will come back to you soon.
Best regards
Anne
2022-09-22 05:45 AM
Since I can't post links here yet, would it help to send links documenting our data and procedures to your technical support directly? We're up to 14 pages of posts in our VL53L5CX thread. As is common on forums, a lot of it is just noise: people talking, having fun, speculating. I can provide the specific posts showing the trigonometry theory we're using and the actual data we're gathering.
Thank you for your time.
VBR,
Inq
2022-09-26 05:23 AM
Hello
Our supposition is that you are expecting a radial distance while the sensor returns the perpendicular distance. According to the datasheet, you should have +/-5% accuracy for all zones (meaning 5 cm at 1 meter). Your measurements are within the datasheet limitations.
Hope this answers your question
Anne
2022-09-26 08:29 AM
The answer is simple. We know the problem and we correct for it inside the sensor. When you are perpendicular to the wall at 1M, you should get 64 zones all saying about 1M.
(There is a term for the correction, but I just cannot think of it at the moment.)
2022-09-27 04:10 AM
@John E KVAM , @Anne BIGOT
I was afraid that I missed something very critical in reading your documentation. I found that in fact I did not have the latest documents. I downloaded the newest and read every word in them. Not once do they describe this VERY CRITICAL CONCEPT you are describing!
I think any logical person taking a single-point laser ToF sensor and waving it around within the VL53L5CX's 45° FoV would expect it to return the absolute distance from the sensor to the object. IOW, the hypotenuse. We would then use simple trig to do the 3D transformations to get the adjacent and opposite sides and place the points in space. Following that, more transforms would be applied to place them in a global point cloud... aka SLAM.
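The 3D placement we have in mind is just this (a minimal sketch; the even zone split and the radial-distance assumption are ours, not from ST's documentation):

```python
import math

def zone_to_point(radial_mm, row, col, fov_deg=45.0, zones=8):
    """Place one zone's reading in the sensor frame, assuming radial_mm
    is the hypotenuse (radial) distance and the FoV splits evenly into
    an 8x8 grid. Returns (x, y, z) in mm, with z along the optical axis."""
    step = fov_deg / zones
    # Angle of the zone center from the optical axis, per axis.
    ax = math.radians((col - (zones - 1) / 2.0) * step)
    ay = math.radians((row - (zones - 1) / 2.0) * step)
    # Unit ray direction, scaled by the radial distance.
    d = (math.tan(ax), math.tan(ay), 1.0)
    norm = math.sqrt(d[0] ** 2 + d[1] ** 2 + d[2] ** 2)
    return tuple(radial_mm * c / norm for c in d)
```

For example, a corner zone (row 0, col 0) reading ~1121 mm radial would place the point back on a plane ~1000 mm down the optical axis.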
Why would this 8x8 sensor report data any differently?... in essence returning the adjacent side in the drawing? Thus I have to do calculations to tell ITS ACTUAL DISTANCE in case it is about to run into the wall! And more importantly, why would it not be documented ANYWHERE?
I have to admit I do not understand how this device actually measures ToF. I have not been able to find any documentation or white paper describing the concept. I did read that the Integration time also describes how long the laser is on during a reading. Obviously, there is no way to tag a single photon leaving when the laser first turns on and actually time its return as compared to some other photon leaving later or one from ambient light.
So... is this trigonometric sleight of hand you are describing some artifact of the measuring concept, or was it explicitly added in the firmware of the sensor? And if so... WHY? I have yet to discern any benefit. If anything, it flies in the face of logic and makes the sensor unusable (without this knowledge of the trig). We on the robotics forum are about to chuck this sensor as less accurate than even a simple ultrasonic sensor costing less than $1.
What use case actually benefits from this concept of returning a value that has no real-world significance?
Apologies for being a little snippy, but your documents go into excruciating detail about voltages, startup times, and everything on the electronics side, yet totally gloss over any Time of Flight theory and/or practical usage cases.
VBR,
Inq
2022-09-27 08:06 AM
So... if all the returned distances to a flat wall are the same, what is that distance: the closest one, the furthest one, the average one? What would be the purpose of a multi-zone distance sensor that returns the same number everywhere in this situation? In the drawing below, what would the sensor return if it is at the center of a circular room (purple area)? I would expect the same distance for all cells (+/-5%). If they are all the same, how do I know whether the sensor is looking at a flat wall or a circular wall... or a spherical wall?
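To make the circular-room scenario concrete, here is a toy Python model of what an angle-compensating sensor would report. Modeling the compensation as a projection onto the optical axis is our assumption, not ST's actual firmware:

```python
import math

def compensated(radial_mm, theta_deg):
    """What an angle-compensating sensor would report: the radial
    measurement projected onto the optical axis (our model of the
    compensation, not ST's firmware)."""
    return radial_mm * math.cos(math.radians(theta_deg))

theta = 19.7  # roughly a corner zone's angle from the optical axis, in degrees

# Flat wall 1000 mm away: true radial distance to the corner is longer,
# but the compensation projects it back to ~1000 mm.
wall = compensated(1000.0 / math.cos(math.radians(theta)), theta)

# Circular room of radius 1000 mm: every true radial distance is 1000 mm,
# so the compensation makes the corner zones read SHORTER than 1000 mm.
room = compensated(1000.0, theta)

print(round(wall), round(room))  # wall zones all ~1000; circular-room corners shorter
```

Under this model the two rooms are distinguishable after all: the flat wall gives a uniform 1000 mm, while the circular room gives shorter values toward the edges.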
2022-09-27 08:21 AM
No... you should not get the same value. In the real world, the distances are different as you move away from the centerline. The sensor's artificial correction makes the data invalid and unusable. Please see the following posts for details of this failure.
2022-09-27 08:29 AM
You are overthinking this a little bit. We had a choice to return the actual distance or do the angle compensation. We chose to compensate.
We test by putting our sensors perpendicular to a wall at some distance N, and the returned distances should all be N. If you want, you can undo the trig we used and get back to the outside zones being longer.
We added this bit simply because the cell phone camera guys (who buy really large numbers of sensors) wanted it this way. They only get one focal length.
And yes, in the extremely unlikely case you had a room with just the right radius curve, you would get shorter distances for the outside zones.
The VL53L7 we are coming out with next week has an even wider FoV. But we pull the same trick of ensuring all the zones will have the same distance when viewing a wall.
Time of flight measures distances by timing the light as it goes out and back. The only real trick is one needs amazingly sensitive detectors (we use an array of Single Photon Avalanche Diodes behind a lens) and of course one has to be quick.
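Given that description, undoing the compensation to recover the radial (hypotenuse) distance per zone might look like the following sketch. The even 45°/8 zone split is an assumption about the optics, not documented ST geometry:

```python
import math

def undo_compensation(zone_mm, row, col, fov_deg=45.0, zones=8):
    """Convert a compensated (perpendicular) zone reading back to a
    radial distance by dividing by the cosine of the zone's off-axis
    angle. Assumes the FoV splits evenly into an 8x8 grid of zones."""
    step = fov_deg / zones
    ax = math.radians((col - (zones - 1) / 2.0) * step)
    ay = math.radians((row - (zones - 1) / 2.0) * step)
    # Cosine of the total off-axis angle for a ray with per-axis angles ax, ay.
    cos_theta = 1.0 / math.sqrt(1.0 + math.tan(ax) ** 2 + math.tan(ay) ** 2)
    return zone_mm / cos_theta
```

For a flat wall at 1000 mm, this inflates a corner zone's 1000 mm reading back to roughly 1120 mm, recovering "the outside zones being longer" as described above.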
2022-09-27 02:57 PM
Thank you for your quick reply. I can fully appreciate catering to the millions instead of the minority. Just for my edification...
Again, apologies for the snip. We've spent well over two months on this and were getting to the point of chucking it all.
VBR,
Inq