The sensor's specifications state an "FOV of 25 degrees", and I'm not really sure what FOV means here. The beam is reflected back to the detector and filtered by a pinhole. Since there is a pinhole, what would be the point of such a high divergence, if divergence is what FOV means?
If "FOV" is the divergence, and the divergence is deliberately increased so the beam can reach the pinhole (which is offset from the emitter), 25 degrees still seems excessive even for distances of a few centimetres, right?
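To put a rough number on that intuition, here is a small sketch of how wide a 25-degree full-angle cone becomes at short range. This assumes "FOV" means the full divergence angle of the emitted beam, which is exactly the interpretation in question:

```python
import math

# Assumption: "FOV of 25 degrees" is the full divergence angle of the beam.
FULL_ANGLE_DEG = 25.0
half_angle = math.radians(FULL_ANGLE_DEG / 2)  # half-angle of the cone

for distance_cm in (2, 5, 10, 20):
    spot_radius_cm = distance_cm * math.tan(half_angle)
    print(f"at {distance_cm} cm: spot radius ~ {spot_radius_cm:.2f} cm "
          f"(diameter ~ {2 * spot_radius_cm:.2f} cm)")
```

Even at 2 cm the spot radius is already about 0.44 cm, so a typical emitter-to-pinhole offset of a few millimetres would be covered well within the first centimetre or two; the full 25 degrees looks like far more spread than the offset alone would require.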
I also don't see how a high divergence would help detect objects at other angles (assuming that is its purpose) besides the one the pinhole points at, since any reflected IR light arriving at a slightly different angle would be physically blocked and never reach the sensor behind the pinhole. Even without a pinhole, in my understanding a non-retroreflective surface would only work if it sat at the centre of that "FOV".
My only explanation for the large divergence is that a collimating lens in front of the laser simply wasn't needed: the laser was bright enough to work without one, and a lens would only add cost. But then the question arises why there isn't a pinhole in front of the laser emitter in the first place, and why specify an "FOV of 25 degrees" at all if the number doesn't mean anything. On top of that, a wide beam could still give wrong readings on retroreflective surfaces, since beams reflected from very different distances and angles within the "25 degree FOV" might make it back to the pinhole.
In short, I don't understand what this "25 degree FOV" is or what purpose it serves.