
the detection rate of SPAD products (VL53L3X)

Associate II

I'm doing tests on the VL53L3X evaluation board, trying to reproduce the quoted detection rates.

In fact, it is hard to get the 50% and 94% valid outcomes out of a certain large number of tests using the quoted reflective material, ambient light conditions, etc.

If I understood correctly, the detection rate is defined as the ratio between the number of 0-RangeStatus tests and the total number, right?

Is there any algorithm to predict the detection rate based on some experimental parameters? For example, the signal/ambient rate, or any others?

Hoping for a response, thanks!

ST Employee

Yes - the detection rate is defined by the ratio between the 0-RangeStatus tests and the total number.
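Per that definition, a minimal sketch of computing the detection rate from a log of RangeStatus codes (Python, purely illustrative; `detection_rate` and the sample data are hypothetical, not part of ST's software):

```python
def detection_rate(range_statuses):
    """Fraction of samples with RangeStatus == 0 (valid range)."""
    if not range_statuses:
        return 0.0
    valid = sum(1 for s in range_statuses if s == 0)
    return valid / len(range_statuses)

# Example: 188 valid results out of 200 measurements.
# (Status 2 here stands in for any non-zero status code.)
samples = [0] * 188 + [2] * 12
print(detection_rate(samples))  # 0.94
```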

There are several conditions that affect detection rates:

1) Target size. In our tests the target was large enough to cover the full field of view. At distance M, the diameter of the circle of illumination is M/2.

(This is most likely the culprit in your testing. But that's just a guess.)

2) Reflectivity of the target - this is specified in our tests. We also use a matte finish. Specular (mirror-like) gloss finishes work better, but only if they are exactly perpendicular.

3) Ambient light - sunlight (and other 940 nm interferers) can seriously affect the range. In our tests there was none.

4) Timing budget - with more time we can receive more light, and thus do a better job of detection.
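As a quick sanity check on point 1: if the illumination circle's diameter is half the distance (the ratio quoted above), the minimum target width at each distance follows directly. A hypothetical helper:

```python
def covers_full_fov(target_diameter_mm, distance_mm):
    """True if the target is at least as wide as the illumination
    circle, assuming diameter = distance / 2 as stated above."""
    return target_diameter_mm >= distance_mm / 2

print(covers_full_fov(2500, 5000))  # True: a 2.5 m target suffices at 5 m
print(covers_full_fov(500, 2000))   # False: a 0.5 m target is too small at 2 m
```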

You should be able to reproduce our results.

But there are a few things you can do to affect the rate.

Signal rate limit - demanding a higher signal rate will reduce the number of 0-RangeStatus results: you get a warning that your signal rate is low and the range might not be accurate.

(Decreasing this limit will return more 0-status results.)

Sigma limit - increasing this will reduce the number of Sigma-fail status errors, because you are allowing a wider standard deviation in the range results.

You can treat the Sigma status and the Signal status results as warnings and get a higher detection score, without actually changing the values.

This gives you the data, and some extra information about the quality of the data.
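A sketch of the "treat as warnings" idea. The status numbering here follows the VL53L1-series convention (0 = valid, 1 = sigma fail, 2 = signal fail) — an assumption; check your driver's headers to confirm the codes for the VL53L3X:

```python
SIGMA_FAIL = 1   # assumed code: standard deviation above the sigma limit
SIGNAL_FAIL = 2  # assumed code: return signal below the signal rate limit

def detection_rate_with_warnings(range_statuses, accept_warnings=True):
    """Detection rate, optionally counting sigma/signal fails as
    warnings (i.e. still usable ranges) rather than misses."""
    accepted = {0, SIGMA_FAIL, SIGNAL_FAIL} if accept_warnings else {0}
    hits = sum(1 for s in range_statuses if s in accepted)
    return hits / len(range_statuses) if range_statuses else 0.0
```

Comparing the two modes on the same log shows how much of the "missed" fraction is really just low-confidence data rather than no data.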

  • john

In order to give better visibility on the answered topics, please click on 'Accept as Solution' on the reply which solved your issue or answered your question. It helps the next guy.
Associate II

Many thanks for your kind reply.

However, our experimental results do not reproduce the detection rates quoted in the datasheet, shown below.

0693W00000GWsnKQAT.png

We did our tests in a dark-room environment (50 lux on average, under normal fluorescent lamps), with a white wall as the target. The width and height of the wall are both larger than 2.5 m for our 5 m range tests, which should ensure the target qualifies. The distance between the evaluation board and the target was varied from 70 mm to 5000 mm with different step lengths, and we recorded 200 report values provided by your software at each distance, including the RangeStatus and the signal/ambient rates. From these, the detection rate as a function of target distance is plotted below, along with the mean signal/ambient rate.

0693W00000GWsr2QAD.png

The ratio between 0-RangeStatus results and the total test count at each distance stays at almost 100%! That's why I said the 50% and 94% detection rates are hard for me to reproduce. Yes, your datasheet declares those two values for the worst case, but an almost-100% detection rate at every distance makes me doubt my experimental procedure.

As for the outdoor overcast scenario, we did experiments there as well. But as one can expect, sunlight cannot be controlled at will, so we just took data under ~7 kLux and ~25 kLux conditions. The experimental detection rate did change from 100% down to below 50% or even 0, but the 50% and 94% distance points deduced from linear interpolation do not match the table, not to mention the impact of the uncontrollable sunlight strength.
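The 50%/94% distance points mentioned above can be deduced by linear interpolation over the measured detection-rate-vs-distance curve. A sketch with made-up numbers (`crossing_distance` and the data are illustrative, not ST code or real measurements):

```python
def crossing_distance(distances_mm, rates, target):
    """Assuming the detection rate decreases with distance, return the
    interpolated distance where it first drops to `target`, else None."""
    pairs = list(zip(distances_mm, rates))
    for (d0, r0), (d1, r1) in zip(pairs, pairs[1:]):
        if r0 >= target >= r1 and r0 != r1:
            return d0 + (r0 - target) * (d1 - d0) / (r0 - r1)
    return None

# Hypothetical outdoor data: rate falls from 100% toward 0 with distance.
dists = [1000, 2000, 3000, 4000]
rates = [1.00, 0.96, 0.40, 0.00]
print(crossing_distance(dists, rates, 0.94))  # 94% crossing point, in mm
print(crossing_distance(dists, rates, 0.50))  # 50% crossing point, in mm
```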

Intuitively speaking, the detection rate must have some relation to the signal and ambient rates, and to some other parameters which are not provided by your software. So I'm wondering if there is an algorithm or function that predicts the final detection rate from the experimental values, which you used to set the limits in the datasheet, as illustrated in the quoted table.

Many thanks!

ST Employee

Wow, that's some serious work.

Lux does not matter: Lux is a measure of visible light, and it does not include 940 nm. However, in sunlight, knowing the Lux level, one can estimate the 940 nm content using a graph of the solar spectrum. (There is less 940 nm in sunlight because a lot of it is absorbed by water vapor; that's one reason to choose 940 nm.)

If you didn't get our results in your lab, then your fluorescent tube is generating 940 nm. And when you are pushing the limits, that bit of ambient makes a difference.

Switch to LED lighting and your ambient will go to 0.

Getting outside data is almost impossible. Not only is brightness an issue, but direction is too. You get different answers based on the direction the sun is to the sensor and the target. I couldn't even begin to design the experiment.

Associate II

Thanks for your advice!

In fact, we are concerned about why a LOW detection rate cannot be obtained in our indoor lab. It's critical for us to reproduce the 94%, 50% and even lower values, rather than an almost-100% 0-RangeStatus every time.

As described above, within the 5 m range in our indoor lab we got detection rates always near 100%, so we are confused about the meaning of "typical: 310cm @94% min; minimum: 310cm @50% min" quoted from the datasheet. If these rate values are overestimated, then why publish the corresponding distances?

Is it possible for us to know under what experimental conditions you obtained the 50% and 94% values?

Sorry to bother you time after time; your replies mean a lot to me!