Effect of "timing budget" using VL53L1X

MKlag.1
Associate

I use "short mode" and set the timing budget to 100 ms. Every second I start a new measurement with VL53L1X_StartRanging. With VL53L1X_GetRangeStatus() I poll for a new measurement. But when I measure the time needed for a ranging cycle, it is only 20 ms. When I check this with the ULD example I see the same behavior, except there the time is 50 ms. Have I misunderstood something?

John E KVAM
ST Employee

20 ms is more or less the default, so something went wrong with your settings.

The intended sequence is: do the 'init', change the settings, and then issue the 'go' command.

Did you check the status return on all your calls? (not everyone does.)

But the timing budget does not depend on the distance mode, so that's not it.

Compare your code to the example code and try to figure out what is different about yours.

But if you want to range once a second for 100ms, you set the Timing Budget to 100ms and the intermeasurement period to 1 second.
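The init-configure-start sequence above can be sketched with the ST ULD API roughly as follows. This is a minimal sketch, not a complete program: it assumes the ULD driver is already integrated, `dev` is whatever device handle your I2C platform layer uses, and real code would report each failing call rather than only the OR of all statuses.

```c
#include <stdint.h>
#include "VL53L1X_api.h"   /* ST VL53L1X ultra-lite driver header */

/* Sketch: init first, then change settings, then issue the 'go' command. */
void vl53l1x_config_sketch(uint16_t dev)
{
    int8_t status = 0;

    status |= VL53L1X_SensorInit(dev);                    /* the 'init'              */
    status |= VL53L1X_SetDistanceMode(dev, 1);            /* 1 = short mode          */
    status |= VL53L1X_SetTimingBudgetInMs(dev, 100);      /* 100 ms per measurement  */
    status |= VL53L1X_SetInterMeasurementInMs(dev, 1000); /* one ranging per second  */
    status |= VL53L1X_StartRanging(dev);                  /* the 'go' - start once   */

    /* Check the status return on every call (accumulated here for brevity);
     * if any call failed, do not assume the settings took effect. */
    if (status != 0) {
        /* handle/report the error */
    }
}
```

Note that the intermeasurement period must be at least as long as the timing budget, which is satisfied here (1000 ms vs. 100 ms).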

You only need to start it once; it will run until you send a stop command - days or weeks later, or never.


MKlag.1
Associate

Thank you for the fast response. You are right, changing the intermeasurement period to 1 second makes sense. I changed the code so that I now poll the GPIO interrupt pin instead of using VL53L1X_GetRangeStatus(), and I see the correct timing.
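The interrupt-pin polling described above could look roughly like this. A sketch under assumptions: `read_int_pin()` is a hypothetical platform helper for the MCU pin wired to the sensor's GPIO1 output, and an active-low interrupt polarity is assumed (the polarity can be set with VL53L1X_SetInterruptPolarity()).

```c
#include <stdint.h>
#include "VL53L1X_api.h"

/* Hypothetical platform helper: returns the level of the MCU pin
 * wired to the sensor's GPIO1 (interrupt) output. */
extern int read_int_pin(void);

void ranging_loop_sketch(uint16_t dev)
{
    uint16_t distance_mm;

    for (;;) {
        /* Assuming active-low polarity: GPIO1 goes low when a result is
         * ready. With a 100 ms timing budget and a 1 s intermeasurement
         * period, this fires about once per second. */
        while (read_int_pin() != 0) {
            /* wait (or sleep) */
        }
        VL53L1X_GetDistance(dev, &distance_mm);
        VL53L1X_ClearInterrupt(dev);  /* re-arm GPIO1 for the next result */
    }
}
```

Clearing the interrupt after each read is what re-arms GPIO1; without it the pin stays asserted and the loop would read the same result repeatedly.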