Three Industrial LiDAR Implementations
Published by Box Optronics · 2022-11-28

1. Direct Time of Flight (dToF):

In the time-of-flight approach, system manufacturers exploit the known speed of light to generate depth information. Short, directed light pulses are fired into the environment; when a pulse hits an object, it is reflected back and recorded by a detector near the light source. By measuring the time the pulse takes to reach the object and return, the object's distance can be determined, and in the dToF method this distance can be resolved for each individual pixel. The received signals are then processed to trigger corresponding actions, such as a vehicle's evasive maneuver to avoid a collision with a pedestrian or obstacle. The method is called direct time-of-flight (dToF) because it measures the exact "time of flight" of the light pulse itself. LiDAR systems for autonomous vehicles are a typical dToF application.
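The underlying arithmetic is simple: the measured round-trip time, multiplied by the speed of light and halved (the pulse travels out and back), gives the distance. A minimal sketch in Python; the function name and the 66.7 ns example value are illustrative assumptions, not taken from any particular sensor API:

    C = 299_792_458.0  # speed of light in m/s

    def dtof_distance_m(round_trip_time_s: float) -> float:
        """Convert a pulse's measured round-trip time to object distance.

        The pulse travels to the object and back, so the one-way
        distance is half the total path: d = c * t / 2.
        """
        return C * round_trip_time_s / 2.0

    # Example: a pulse returning after ~66.7 ns corresponds to ~10 m.
    print(dtof_distance_m(66.7e-9))  # ~10.0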

2. Indirect Time of Flight (iToF):

The indirect time-of-flight (iToF) approach is similar, but with one notable difference. Illumination from a light source (usually an infrared VCSEL) is spread by a diffuser, and pulses (50% duty cycle) are emitted into a defined field of view.

In the downstream system, a stored "standard signal" gates the detector for a set period when the light encounters no obstacle. If an object interrupts this standard signal, the system can determine the depth information for each defined pixel of the detector from the resulting phase shift and the delay of the pulse train.
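The per-pixel depth follows from the phase shift between the emitted and received signal. A minimal sketch of that standard relationship, d = c·Δφ / (4π·f_mod), assuming periodic (continuous-wave style) modulation; the 20 MHz modulation frequency below is an illustrative choice, not a value from the text:

    import math

    C = 299_792_458.0  # speed of light in m/s

    def itof_depth_m(phase_shift_rad: float, mod_freq_hz: float) -> float:
        """Convert the measured phase shift between emitted and received
        modulated light into depth: d = c * phase / (4 * pi * f_mod)."""
        return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

    def unambiguous_range_m(mod_freq_hz: float) -> float:
        """Beyond this range the phase wraps around, so one phase shift
        maps to several possible depths (range aliasing)."""
        return C / (2.0 * mod_freq_hz)

    # Example: at 20 MHz, a phase shift of pi/2 is ~1.87 m, and depth
    # readings are unambiguous up to ~7.5 m.
    print(itof_depth_m(math.pi / 2, 20e6))  # ~1.87
    print(unambiguous_range_m(20e6))        # ~7.49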

3. Active Stereo Vision (ASV):

In the "active stereo vision" method, an infrared light source (usually a VCSEL or IRED) illuminates the scene with a pattern, and two infrared cameras record the image in stereo.
By comparing the two images, downstream software can calculate the required depth information. Lights support depth calculations by projecting a pattern, even on objects with little texture such as walls, floors, and tables. This approach is ideal for close-range, high-resolution 3D sensing on robots and automated guided vehicles (AGVs) for obstacle avoidance.
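Once the same point is matched in both camera images, depth follows from the classic stereo triangulation relation Z = f·B / d. A minimal sketch; the focal length, baseline, and disparity values are illustrative assumptions, not specifications of any particular system:

    def stereo_depth_m(focal_length_px: float, baseline_m: float,
                       disparity_px: float) -> float:
        """Pinhole stereo relation Z = f * B / d, where f is the focal
        length in pixels, B the distance between the two cameras in
        meters, and d the disparity (pixel offset of the same scene
        point between the left and right images)."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive for a finite depth")
        return focal_length_px * baseline_m / disparity_px

    # Example: f = 800 px, 6 cm baseline, 40 px disparity -> 1.2 m.
    print(stereo_depth_m(800.0, 0.06, 40.0))  # 1.2

Nearby objects produce large disparities and distant objects small ones, which is why this technique resolves fine depth detail at close range but loses precision with distance.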