
Demonstration of real-time structured-light depth sensing based on a solid-state VCSEL beam scanner


Abstract

We demonstrated a real-time scanning structured-light depth sensing system based on a solid-state vertical-cavity surface-emitting laser (VCSEL) beam scanner integrated with an electro-thermally tunable VCSEL. By applying a swept voltage to the tunable VCSEL, a field of view of 6°×12° could be scanned by the beam scanner at a scanning speed of 100 kHz. Using this beam scanner, a real-time depth image with a lateral resolution of 10,000 points (20×500) was obtained by measuring a step target placed at 35 cm. The frame rate remains >10 Hz even when sunlight shot noise is artificially added to the experimental data. With a higher-speed camera, a lateral resolution of 50,000 points (100×500) could potentially be reached at a frame rate of >20 Hz. Using flat optics, a compact scanning module offering a line pattern with an FoV of >40°×20° was also demonstrated. These results could help realize high-resolution, high-accuracy structured-light sensing in a compact module.

© 2021 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Structured-light sensing [1] has been widely applied to face identification, household robots, and industrial manufacturing robots thanks to its ultrahigh depth accuracy at short distances. Research on high-performance light sources may bring structured-light sensing to the next generation. Flash VCSELs and VCSEL arrays [2,3] equipped with diffractive optical elements (DOEs) [4–6] are mostly adopted to generate randomly coded dot-pattern structured light. Power consumption is a problem for such flash structured-light sensing, and it is especially critical for mobile devices. Besides, the ideal lateral resolution of a depth image should equal the pixel number of the camera; however, random dot patterns can hardly achieve this ideal resolution because their encoded unique features are much larger than the pixel size [7]. Binary-coded [8] or phase-coded [9] line patterns could simplify the coding complexity and possibly realize pixel-limited lateral resolution, but they are difficult to manufacture as a compact light projector.

A beam-scanner-based structured-light sensing system cooperating with a high-speed camera provides an alternative for solving these problems. Compared to a flashlight projector, a beam scanner delivers higher light intensity, enabling lower power consumption at the same depth accuracy. Because only a single beam is captured per shot, in a designed order, pattern coding can be significantly simplified, and pixel-limited lateral resolution becomes achievable when the shutter speed of the camera is high enough. Beam scanners based on mechanical or MEMS mirrors have been commercially used in time-of-flight (ToF) sensing [10–12], but their large size and concerns about the long-term reliability of moving parts preclude their application in structured-light sensing. Compared to MEMS mirror beam scanners, solid-state non-mechanical beam scanners have potentially better stability and long-term reliability under strong shock or vibration. Beam scanners based on optical phased arrays [13–15], focal-plane switch arrays [16,17], and photonic crystals [18] have been demonstrated; however, their numbers of resolution points are not large enough for practical applications, and the requirement of an external light source is another barrier to compact packaging.

Previously, we reported a solid-state beam scanner based on a VCSEL waveguide/amplifier [19–21]. It offers a narrow line beam with a divergence of <0.1°, a small device size of <0.05 mm², and a high output power of >3 W [22]. It can also achieve high-speed (>100 kHz) electrically driven beam steering when integrated with a tunable VCSEL [23]. It also has potential for large-scale manufacturability at low cost thanks to well-established 6-inch-wafer VCSEL production platforms. Thus, it provides a cost-effective way to realize real-time beam-scanning structured-light sensing with pixel-limited lateral resolution and low power consumption, which is demonstrated in this paper. Instead of a single line per shot, multi-line-per-shot beam scanning is also proposed in Sec. 6; it greatly increases the field of view (FoV) and relaxes the camera shutter-speed requirement for higher lateral resolutions.

2. Characteristics of beam scanner integrated with a tunable VCSEL

The schematic of the beam scanner integrated with a tunable VCSEL is illustrated in Fig. 1(a). The device was fabricated on a half-cavity VCSEL wafer with a detuning layer. The beam scanner has a structure similar to that of conventional 850 nm VCSELs, where DBRs sandwich the active region to form an active slow-light waveguide. The slow-light beam is emitted from the emission window of the top DBR, which consists of 4 pairs of semiconductor DBRs and 7.5 pairs of Ta2O5/SiO2 dielectric DBRs [23]. Thanks to the large angular dispersion for incident light of different wavelengths, the output slow-light beam can be steered. In detail, the relation between the wavelength of the incident light $\lambda$, the resonant wavelength ${\lambda _c}$, the refractive index of the top DBR ${n_{wg}}$, and the deflection angle of the output beam ${\theta _r}$ is expressed by Eq. (1) [19],

$$\textrm{sin}{\theta _r} = {n_{wg}} \times \sqrt {1 - {{\left( {\frac{\lambda }{{{\lambda_c}}}} \right)}^2}} $$
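As a numerical check of Eq. (1), the sketch below sweeps the wavelength over a few nanometers. The values of $n_{wg}$ and $\lambda_c$ are assumptions (they are not reported explicitly in the text), chosen so that the local scanning efficiency near 850 nm is of the order of the ~2°/nm quoted later in this section.

```python
import numpy as np

# Eq. (1): sin(theta_r) = n_wg * sqrt(1 - (lambda / lambda_c)^2)
n_wg = 3.2         # assumed effective index of the top-DBR waveguide
lam_c = 858e-9     # assumed resonant (cutoff) wavelength [m]

lam = np.linspace(850e-9, 853e-9, 7)   # a 3 nm sweep below lambda_c
theta_r = np.degrees(np.arcsin(n_wg * np.sqrt(1.0 - (lam / lam_c) ** 2)))

for l, th in zip(lam, theta_r):
    print(f"lambda = {l * 1e9:7.2f} nm -> theta_r = {th:5.2f} deg")
# With these assumed values, a 3 nm tuning range deflects the beam by ~5-6 deg,
# i.e., roughly the ~2 deg/nm scanning efficiency cited in the text.
```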

Because the beam scanner also contains the active region, injecting current into the beam scanner compensates the radiation loss and absorption loss, resulting in a uniform intensity distribution along the propagation direction. The output power can be increased in proportion to the scanner length, and the VCSEL scanner functions as an optical amplifier [20]. Isolated by proton implantation, a tunable VCSEL was laterally integrated with the beam scanner [24]. By varying the current injected into the VCSEL, ${I_{VCSEL}}$, the wavelength of the tunable VCSEL can be tuned through self-heating. Because the beam scanner works as a slow-light waveguide, only light with a wavelength smaller than the resonant wavelength of the beam scanner can propagate in it as a slow-light mode. Therefore, we wet-etched about 30–50 nm at 4 pairs above the active region on the VCSEL side to blue-shift the lasing wavelength of the VCSEL by 8–10 nm, and then deposited the dielectric DBRs to obtain sufficient reflectivity.


Fig. 1. (a) Schematic of the beam scanner integrated with a tunable VCSEL. (b) Reflected pattern of the device. (c) Output power of the device. (d) Beam-scanning frequency response.


The length of the integrated device is only 1 mm, as shown in the top-view photo in Fig. 1, where the golden regions are the electrodes of the tunable VCSEL and the beam scanner. The output pattern reflected by a planar object placed 35 cm from the device and the output power of the device were measured as shown in Figs. 1(b) and 1(c), respectively. In Fig. 1(b), the injection current to the beam scanner was fixed at 40 mA, and by tuning ${I_{VCSEL}}$ from 1.5 mA to 5.5 mA, the wavelength of the VCSEL was tuned by 3 nm. Because the scanning efficiency of our beam scanner is around 2°/nm [25], a beam-scanning range of >6° was obtained in the $\theta$ direction; in the $\phi$ direction, an FoV of 12° was covered. A non-uniform light intensity distribution can also be observed in the image due to the speckle pattern resulting from reflection off the rough surface of the object. Figure 1(c) shows the scanner current-output characteristic measured with ${I_{VCSEL}}$ = 3.5 mA. Though the power saturated at 60 mA, the maximum current for structured-light sensing was chosen to be 40 mA, because ASE and self-lasing of the scanner degrade the depth accuracy at scanner currents above 40 mA. It is noted that the effect of ASE could be reduced by increasing the power coupled from the VCSEL, so that the output power from the scanner could be increased up to 3 W with high beam quality [22].

3. Setup and theory of real-time structured-light depth sensing

The schematic of the proposed real-time structured-light sensing system is illustrated in Fig. 2. It consists of a beam scanner that projects a line pattern onto a target, a CMOS camera that captures the reflected pattern, and a target with a step formed by two white boards. When the line pattern generated by the beam scanner strikes the target, it is reflected and received by the camera. From the way in which the line pattern distorts, the depth information of the target object can be obtained [26,27].


Fig. 2. Schematic of real-time structured-light sensing system


The part outside the red frame controls the synchronization of the currents injected into the tunable VCSEL and the beam scanner, and the shutter exposure of the CMOS camera. Pulse generator 1 produces a clock signal that is split into two paths. In one path, the clock signal triggers a function generator to generate a staircase voltage that drives the wavelength tuning of the VCSEL. In the other path, the clock signal is sent to pulse generator 3, making it produce a pulse train with a repetition rate N times that of the clock signal, where N is the number of scanning lines in a single frame. This signal is also split to pump the beam scanner and to trigger the shutter exposure, where the pulse width of the injection current into the scanner should equal the shutter exposure time. It is noted that, to make use of the scanning function, each line should be exposed in an independent shutter. Thus, the number of lines per second ${N_{total}}$ is determined by the shutter speed of the camera ${f_{camera}}$, the number of lines per frame N, and the beam-scan speed ${f_{scan}}$ as in Eq. (2).

$${N_{total}} ={f_{camera}} = N{f_{scan}}$$
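A minimal sketch of the timing relation in Eq. (2): each line needs its own shutter, so the camera frame rate fixes $N_{total}$, and dividing by the lines per frame gives the depth-frame rate. The 200 fps / 20 lines example below matches the Section 4 experiment (20 lines per frame at a 10 Hz frame rate).

```python
# Eq. (2): N_total = f_camera = N * f_scan
def lines_per_second(f_camera_fps: float) -> float:
    """Each line is exposed in its own shutter, so N_total equals the camera fps."""
    return f_camera_fps

def scan_rate(f_camera_fps: float, lines_per_frame: int) -> float:
    """Depth-frame rate f_scan obtained for a given camera speed."""
    return f_camera_fps / lines_per_frame

print(scan_rate(200, 20))    # 10.0 Hz, as in the Section 4 experiment
print(scan_rate(2000, 100))  # 20.0 Hz, the high-speed case discussed in Section 6
```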

In our measurement, we must set a reference plane (as shown in Fig. 2) to avoid the effect of bending of the line patterns. The reference plane should be placed close to the target depth to improve the accuracy, or the calibration should be completed over a larger range. The reference plane was used to record the original pattern illuminated by the proposed beam scanner.

Once the original pattern reflected by the reference plane is recorded and saved during calibration, the reference plane is no longer needed in practical measurements. We can obtain the relative depth $d$ between the target and the reference plane from the displacement, at a corresponding wavelength ${\lambda _n}$, between the received line pattern reflected by the target and that reflected by the reference plane. The depth information of the target is then obtained by Eq. (3),

$$d = \frac{{\Delta y{D^2}}}{{fb \pm \Delta yD}}$$
where $\Delta y$ is the displacement in $y$ of the line beam reflected by the target from that reflected by the reference plane, $f$ is the focal length of the CMOS camera, $b$ is the distance between the scanner and the CMOS camera (the baseline length), and $D$ is the distance from the baseline to the reference plane at its original position, which can be calibrated by moving the reference plane step by step as shown in [26]. As for the sign in the denominator, if the object is farther from the CMOS camera than the reference plane, the sign should be minus; otherwise it should be plus. If the object has a step, as shown in Fig. 2, and is set perpendicular to the CMOS camera, the depth of the step $\Delta d$ for wavelength ${\lambda _n}$ can be obtained by Eq. (4),
$$\Delta d = \frac{\Delta y_{2,\lambda_n}\,D^2}{fb \pm \Delta y_{2,\lambda_n}D} - \frac{\Delta y_{1,\lambda_n}\,D^2}{fb \pm \Delta y_{1,\lambda_n}D}$$
where $\Delta y_{2,\lambda_n}$ is the displacement of the rear plane and $\Delta y_{1,\lambda_n}$ is the displacement of the front plane from the reference plane when the operating wavelength is ${\lambda _n}$.
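A direct implementation sketch of Eqs. (3) and (4); the focal length, baseline, pixel pitch, and displacements below are illustrative stand-ins, not the calibrated values of Table 1.

```python
def depth_from_displacement(dy_px, pixel_pitch, f, b, D, target_behind_ref=True):
    """Eq. (3): relative depth d between target and reference plane.

    dy_px       : line displacement on the sensor [pixels]
    pixel_pitch : sensor pixel size [m]
    f, b, D     : focal length, baseline, baseline-to-reference distance [m]
    The denominator sign follows the text: minus when the target is farther
    from the camera than the reference plane, plus otherwise.
    """
    dy = dy_px * pixel_pitch
    sign = -1.0 if target_behind_ref else 1.0
    return dy * D**2 / (f * b + sign * dy * D)

# Eq. (4): step height as the difference between the two boards' relative
# depths at the same operating wavelength (illustrative numbers only).
f, b, D, pitch = 8e-3, 60e-3, 0.35, 3e-6
d_rear  = depth_from_displacement(12.0, pitch, f, b, D, target_behind_ref=False)
d_front = depth_from_displacement(15.0, pitch, f, b, D, target_behind_ref=False)
print(f"step height = {(d_front - d_rear) * 1e3:.2f} mm")
```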

4. Experimental result of real-time structured-light depth sensing

Because the purpose of this section is to show the compatibility of the VCSEL beam scanner with structured-light sensing, a simple calibration method and algorithm were adopted. The real-time structured-light sensing was carried out with the parameters shown in Table 1. The target was placed 35 cm away from the camera, and the thickness of the front white board was 3 mm, i.e., the step height was 3 mm. The parameters $b,f,D$ were calibrated by moving the reference plane from 32 cm (original position) to 38 cm in steps of 1 mm in the z direction. However, this simple calibration method may introduce some systematic error. The error-correction procedure is as follows: 1) the reference plane was moved in steps of 0.5 mm from 33 cm to 37 cm in the z direction; 2) the depth at each position was calculated using the calibrated $b,f,D$; 3) a training curve was made from the calculated depths and the real depths [26], as sketched below.
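One plausible realization of the training-curve step, shown here on synthetic stand-in data; the paper follows [26] and does not specify the exact fit form, so a low-order polynomial mapping calculated depth to real depth is an assumption.

```python
import numpy as np

# Stand-in "measured" relationship: known reference-plane positions (real
# depth) vs. depths computed with the calibrated b, f, D, including a small
# synthetic systematic error.
real_mm = np.arange(330.0, 370.5, 0.5)              # 33 cm to 37 cm, 0.5 mm steps
calc_mm = real_mm + 2e-4 * (real_mm - 350.0) ** 2   # synthetic systematic error

# Training curve: map calculated depth back to real depth.
correct = np.poly1d(np.polyfit(calc_mm, real_mm, deg=2))

residual = np.max(np.abs(correct(calc_mm) - real_mm))
print(f"max residual after correction: {residual:.4f} mm")
```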


Table 1. Basic experimental conditions

The single-frame measurement result is illustrated in Fig. 3. The overlapped line pattern of 20 shutters reflected by the target is shown in Fig. 3(a), in which the breakage due to the boundary between the two boards can be clearly observed. As shown in Fig. 3(a), the number of pixels covered by light in the x direction is 500, so the resolution in the x direction is 500. The resolution in the y direction, however, is limited by the number of line beams, because only a limited number of line beams illuminates the object. The spaces between lines, where no depth data are obtained, are also visible. Besides, depth data are obtained at only one pixel along the y direction at each x of each line beam, even though the line beam may cover several pixels in the y direction. Therefore, the total resolution is the number of line beams per image N times the number of pixels in the x direction. In this experiment, N is 20 and the number of pixels covered by the line in the x direction is 500, so the lateral resolution of an image is 20 × 500 = 10,000.

From the observed image, the depth information was obtained as shown in Fig. 3(b) by using Eqs. (3) and (4) and the error correction, where different colors stand for the depth step between the front and back boards. The depth step can be estimated by subtracting the depth of the yellow part from that of the blue part. In the x direction, there is a small breakage at about x = 380 because of the boundary between the rear and front boards. In the y direction at the boundary, there is also a small misalignment due to the depth difference between the front and rear boards; theoretically, a larger depth difference causes a larger misalignment in the y direction.

Because the measurement was completed indoors and an optical filter (passband of 800–950 nm) was applied, the inaccuracy due to shot noise could be neglected. However, the speckle pattern generated by a rough surface may strongly affect the measured depth, because it greatly changes the light intensity distribution on the pixel scale [28–30]. The speckle pattern originates from interference at the CMOS camera when coherent light is reflected by a rough surface. It depends strongly on the wavelength-scale structure of the target surface, so variation of the surface roughness leads to variation of the measured depths. To evaluate the variation due to speckle patterns, the target was moved laterally (in the x direction) between measurements to simulate roughness variation. A fixed position A on the depth image (x = 300 of the 14th line beam), shown in Fig. 3(a), was chosen for the accuracy analysis. We first moved the target slightly in the x direction to simulate surface-roughness variation and then recorded the measured depth of position A. The same procedure was repeated 20 times, yielding 20 depth data for position A. Finally, the accuracy was estimated as the standard deviation of these 20 data, 270 $\mu$m. We also measured the depth accuracy at another three positions with different light intensities: position B (x = 80 of the 12th line), position C (x = 80 of the 20th line), and position D (x = 300 of the 20th line). The depth accuracies are 266 $\mu$m, 311 $\mu$m, and 276 $\mu$m, respectively. The variation among these positions is not large because the background noise is weak and speckle (which is independent of power) dominantly affects the accuracy.
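A minimal sketch of this repeated-measurement accuracy estimate, using synthetic stand-in data rather than the recorded frames:

```python
import numpy as np

# Synthetic stand-in for the 20 repeated depth readings at position A
# (x = 300 on the 14th line); in the experiment each reading follows a small
# lateral shift of the target to re-randomize the speckle.
rng = np.random.default_rng(1)
depths_mm = 350.0 + rng.normal(0.0, 0.27, size=20)

accuracy_um = np.std(depths_mm, ddof=1) * 1e3   # sample standard deviation
print(f"estimated depth accuracy ~ {accuracy_um:.0f} um")
```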


Fig. 3. (a) Overlapped reflected pattern of the target. (b) Depth information of the target. (c) Depth information of a specific line (the 14th line).


Continuous multi-frame depth sensing was measured under the same conditions as the single-frame depth sensing. The target was moved slightly in the lateral direction over 1 second, and because the frame rate was 10 Hz, 10 continuous frames were captured to obtain the depth information shown in Fig. 4. The figure shows that the depth-sensing performance remains stable in real-time measurements.


Fig. 4. Real-time measurements of 10 frames of depth data measured in 1 second


5. Accuracy analysis under different measurement conditions

Using the same beam scanner, CMOS camera, and a planar target, we investigated the dependence of the depth accuracy on the output power of the beam scanner and on the distance from the target to the camera. The injection current to the VCSEL was fixed at 3.5 mA. The accuracy (standard deviation) was estimated by measuring a fixed data point over 50 continuous frames. As in Section 4, the target was moved laterally between frames to simulate variation of the target surface and thus include the effect of speckle patterns. Because the measurements were carried out indoors, the background light was too weak to affect the depth-sensing performance. We varied the current injected into the beam scanner from 20 mA to 40 mA, corresponding to scanner outputs of 0.3 mW to 1 mW, respectively. When the target was placed at 35 cm, 70 cm, and 100 cm, the depth accuracy was almost unchanged, as shown in Fig. 5(a).


Fig. 5. (a) Accuracy dependence on power (b) Accuracy dependence on distance


The main noise source is the speckle pattern, which distorts the intensity distribution of the line-beam profile so severely that the depth-sensing result fluctuates [28,29,31]. Figure 5(b) shows the measured depth accuracy for different target distances when a 40 mA current was injected into the beam scanner. The relative accuracy (depth accuracy/distance) depends strongly on distance because of the triangulation geometry of the imaging system [28], which is why structured-light sensing is applied only to short-range detection.

In practical outdoor environments, the effect of sunlight is difficult to avoid. The number of electrons generated in a single pixel of the CMOS camera by sunlight reflected from the target, ${B_{solar}}$, was estimated by Eq. (5) [32,33].

$${B_{solar}} = \frac{{\eta pt\rho {s^2}}}{{4{F^2}hc}}\mathop \int \nolimits_{800}^{950} \lambda I(\lambda )d\lambda $$

The parameters used in Eq. (5) and our assumed values are shown in Table 2. Because our camera had an 800–950 nm band-pass filter in the experiment, the same filter was assumed in the calculation. For practical applications, a 10 nm band-pass filter would be preferable thanks to the good wavelength stability of the VCSEL, and it would also greatly suppress the effect of sunlight. Based on the data in Table 2, the background sunlight was calculated as ${B_{solar}} = $ 15,700 ${e^ - }$; however, because constant background light is easily subtracted by image processing, only the shot noise of sunlight affects the depth accuracy. The shot noise of sunlight ${n_{shot}}$ could be estimated by Eq. (6) [34], giving ${n_{shot}}$ = 2.5. In our experiment, Gaussian shot noise with a standard deviation of ${n_{shot}}$, generated in MATLAB, was numerically added to the previously received images.

$${n_{shot}} = \tau \sqrt {{B_{solar}}} $$
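The sketch below reproduces this estimate numerically. All parameter values are illustrative assumptions standing in for Table 2, including a flat solar spectral irradiance and the conversion factor $\tau$ in Eq. (6); the paper's MATLAB step of adding Gaussian shot noise to the captured frames is shown with NumPy.

```python
import numpy as np
from scipy.constants import h, c

# Assumed stand-ins for the Table 2 parameters (illustrative only):
eta, p, rho = 0.3, 0.9, 0.5   # quantum efficiency, lens transmittance, reflectivity
t = 0.72 / 500                # exposure time [s] for N_total = 500 (Eq. (7))
s, F = 3e-6, 2.0              # pixel size [m], lens f-number
tau = 0.02                    # assumed electron-to-count conversion in Eq. (6)

# Flat ~0.8 W m^-2 nm^-1 solar irradiance over the 800-950 nm passband
# (rough AM1.5 level; replace with tabulated spectral data).
lam = np.linspace(800e-9, 950e-9, 301)
I_lam = np.full_like(lam, 0.8e9)            # W m^-2 per meter of wavelength

integral = np.trapz(lam * I_lam, lam)       # the integral in Eq. (5)
B_solar = eta * p * t * rho * s**2 / (4 * F**2 * h * c) * integral
n_shot = tau * np.sqrt(B_solar)             # Eq. (6)
print(f"B_solar ~ {B_solar:.3g} e-, n_shot ~ {n_shot:.2f}")

# Numerically adding the sunlight shot noise to a captured frame, as in the
# paper (done there in MATLAB):
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(480, 640)).astype(float)  # stand-in image
noisy = frame + rng.normal(0.0, n_shot, size=frame.shape)
```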


Table 2. Parameters for estimation of sunlight received by camera

After adding the sunlight shot noise to the 50 captured frames, we estimated the accuracy dependence on power and distance as shown in Figs. 6(a) and 6(b), respectively. As the scanner output power decreased, the depth accuracy gradually got worse, especially when the target was placed 100 cm away from the camera. Such rapid deterioration in the depth accuracy indicates that sunlight shot noise, rather than the speckle pattern, becomes dominant when the power is lower than 0.3 mW. Besides, the power variation of our beam scanner across wavelengths may also cause stronger accuracy variation. There are three methods to reduce this variation. First, the power variation across wavelengths could be compensated by injecting different currents on the scanner side to equalize the power [30]. Second, the power degradation always happens at the edges of the scanning range; we could give up this range and recover the FoV by using a DOE [35]. Third, a heater could be integrated with the tunable VCSEL to reduce the power variation of the VCSEL caused by large changes in injection current.

From Fig. 6(b), we find that the relative accuracy is no longer strictly linear in distance, especially at long distances, because not only the triangulation geometry of the imaging system but also the sunlight shot noise affects the depth accuracy. To pursue better accuracy at long range, we first need to overcome the effect of shot noise. The easiest method is to increase the beam scanner power. In this paper, the maximum power of the beam scanner is only 1 mW, but devices with the same structure have reached 8 mW [23], and a similar-structure device using a slow-light VCSEL has reached 3 W [22]. Also, the passband of the band-pass filter used in this paper is 150 nm, while in real applications a passband of <10 nm would significantly improve the signal-to-noise ratio. Although speckle noise remains, it may not be dominant in strong-background-light environments and long-distance measurements. Further suppression of speckle noise could be expected from a spatial averaging process [36].


Fig. 6. (a) Accuracy dependence on power and (b) accuracy dependence on distance after adding sunlight shot noise


6. Discussion

6.1 Potential frame rate and lateral resolution

First, we studied the maximum framerate and the potential number of lines per second ${N_{total}}$ of our system. For larger ${N_{total}}$, the pulse width, i.e., the shutter exposure t for a single stripe, becomes smaller, which reduces the energy received by each pixel of the camera. In detail, the relation between the pulse width t and the total number of lines per second ${N_{total}}$ is given by Eq. (7), if the duty ratio remains 72% as in our experiment.

$$t = 0.72/{N_{total}}$$
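A small sketch of the trade-off in Eq. (7), with the duty ratio fixed at 72% as in the experiment:

```python
# Eq. (7): at a fixed 72% duty ratio, the per-line pulse width (and thus the
# energy collected per pixel) falls inversely with the lines per second.
def pulse_width_s(n_total: float, duty: float = 0.72) -> float:
    return duty / n_total

for n_total in (200, 500, 2000):
    print(f"N_total = {n_total:5d} -> t = {pulse_width_s(n_total) * 1e3:.2f} ms")
# N_total = 2000 still leaves t = 0.36 ms per line; by Eq. (2) this supports
# 100 lines per frame at 20 fps with a >2000 fps camera.
```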

In our experiment, it was difficult to measure real-time depth information with ${N_{total}}$ larger than 500 due to the limitation of the camera shutter speed. To further explore the highest potential ${N_{total}}$, we adjusted the pulse width and shutter exposure time according to Eq. (7) for depth sensing with different ${N_{total}}$ and estimated the depth accuracy by measuring 20 frames. The injection currents to the VCSEL and beam scanner were 3.5 mA and 40 mA, respectively. Conditions with and without sunlight are both shown in Fig. 7, where the sunlight shot noise was again estimated by Eqs. (5) and (6); the dashed line and the solid line correspond to the depth accuracy with and without sunlight, respectively. The target distance is 35 cm. Even if ${N_{total}}$ is increased to 2,000, the accuracy remains <0.5 mm both with and without sunlight. This means the number of lines per frame could be 100 at a framerate of 20 fps, as estimated by Eq. (2). Therefore, by using a high-speed camera with a shutter speed of >2,000 fps, which some mobile phones already carry, the lateral resolution could easily be increased to 50,000 (100 × 500) with a framerate of 20 fps and an optical power of 1 mW. Besides, if a 10-spot DOE is added to the scanner, the resolution could be increased by another factor of 10, as proposed in Section 6.2, reaching 500,000 at the same framerate. Because the optical power would also be split into 10 lines by the DOE, the power would need to be increased 10-fold to 10 mW to keep the accuracy unchanged. Compared with mature products from Microsoft, Apple, and Intel, the highest lateral resolution could be realized with a commonly acceptable framerate and a rather low optical power requirement, as shown in Table 3.


Fig. 7. Depth accuracy as a function of total number of lines per second.



Table 3. Comparison of different structured-light technologies

A sine-wave voltage (injection current) at different frequencies was applied to the VCSEL to confirm the available scanning frequency of the beam scanner. The 3 dB cutoff frequency of the scanning speed was found to be over 100 kHz, limited by the tuning speed of the electro-thermally tunable VCSEL. This is high enough for the 1D beam scanning of structured-light sensing.

6.2 Compact beam scanning module with larger FoV and higher resolution

For most applications of structured-light sensing, such as face identification and household robots, an FoV of at least 30°×30° may be needed. However, the FoV of a single VCSEL beam scanner is only 6°×12°, limited by the wavelength tuning range of the VCSEL. With an improved fabrication process, we realized a beam scanner of the same structure with a larger tuning range of >7 nm, which provides a scanning range of >16° [23]. Besides, a counter-propagating VCSEL beam scanner could be integrated to cover a symmetrical FoV about the vertical direction, doubling the FoV to 32°. Recently, we proposed the use of a diffractive optical element (DOE) to improve the FoV and lateral resolution of a VCSEL beam scanner [35]. The module structure is illustrated in the top figure of Fig. 8(a). It is composed of a VCSEL scanner, a prism mirror, and a 1D DOE, simply stacked on the scanner chip. The prism mirror directs the light to a proper emission angle, and the DOE splits the original beam into 10 beams with a separation angle of around 4°. When the beam is scanned by 4°, the separations between the beams are just filled, and the FoV is therefore expanded to 40°. This beam-scanning module has great potential for structured-light sensing with a larger FoV and higher resolution. We demonstrated its application by capturing the beams reflected by the target shown in Fig. 8(b). The target consists of a fixed black diffuser (back) with a reflectivity of 10% and a grey diffuser (front) with a reflectivity of 30% held by hand. The black and grey diffusers were placed at 66 cm and around 52 cm, respectively, from the scanner and camera. With the line beam fixed, the reflected image was captured in Fig. 8(c); because the FoV of the camera is only 27°×35°, 7 of the 10 beams can be observed. The overlapped beams of a 10-step scan are shown in Fig. 8(d), captured in a real-time measurement with a framerate of 20 fps. The structured light deformed by the target can be clearly observed, and the actual FoV was greatly increased to 40°×20° with 100 beams. In this case, the lateral resolution of structured-light sensing could easily reach 50,000 (100×500) using a camera with a speed of only 200 fps for real-time sensing at a framerate of 20 fps. Counter-propagating integration of beam scanners could further double the FoV and resolution. We also demonstrated a 1D scanner exhibiting an FoV of >100°×20° and a resolution of >1,400, which could increase the lateral resolution of structured-light sensing to 700,000 if the power is increased correspondingly [35].
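A back-of-envelope sketch of this DOE tiling, using the beam-copy count and separation quoted above:

```python
# The DOE splits the beam into n copies separated by sep_deg; scanning each
# copy over (at most) one separation tiles the field without gaps.
def doe_fov_deg(n_beams: int, sep_deg: float, scan_deg: float) -> float:
    return n_beams * min(scan_deg, sep_deg)

print(doe_fov_deg(n_beams=10, sep_deg=4.0, scan_deg=4.0))   # 40.0 deg
# 10 DOE copies x 10 scan steps also yields the 100 lines per frame used in
# the Fig. 8(d) demonstration.
```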


Fig. 8. (a) Modules for line (top) and spot (bottom) beam scanning. (b) Target used to reflect the beams. (c) Fixed line beam and (d) superimposed scanning beams reflected by the target.


Compared to a line beam, a spot beam has much higher beam intensity (a >10-fold improvement). It offers lower power consumption, which is especially important for mobile devices. A cylindrical lens was used to collimate the line beam into a spot beam and realize 2D beam scanning [37,38]. The FoV could also be doubled by counter-propagating integration of scanners. The module sizes of both the line and spot scanning modules mentioned above can be kept within the millimeter scale, and further miniaturization could be expected by using a flat reflector instead of the prism mirror [39,40].

7. Summary

Real-time structured-light depth sensing was realized based on our solid-state beam scanner integrated with a tunable VCSEL. Compared with flash sensing, the scanning function gives an improved signal-to-noise ratio in outdoor applications at lower power consumption. In our experiment, the depth accuracy stayed below 300 $\mu$m for a 35 cm target distance with a low output power of less than 1 mW, even when sunlight shot noise was added to the system. The accuracy also remains stable as the number of lateral resolution points increases to 50,000 (100×500) at a framerate of 20 fps. For high-speed applications, a framerate of 200 fps is possible if the lateral resolution is compromised to 5,000 (10×500). Currently, the system is dominated by the speckle pattern thanks to the high signal-to-noise ratio, but this could be improved by spatial and temporal averaging [41,42], leaving more headroom for extremely accurate depth sensing. Besides, a line-beam scanning module with a DOE was also demonstrated; the lateral resolution and FoV could be improved to >50,000 and 40°×20°, respectively, by using the proposed module. Furthermore, the small on-chip size and low manufacturing cost give the system more possibilities for application in consumer devices.

Funding

Japan Science and Technology Agency, Adaptable and Seamless Technology Transfer Program through Target-Driven R&D (JPMJTR211A).

Disclosures

The authors declare no conflicts of interest.

Data availability

No data were generated or analyzed in the presented research.

References

1. J. Geng, “Structured-light 3D surface imaging: a tutorial,” Adv. Opt. Photonics 3(2), 128 (2011). [CrossRef]  

2. K. Iga, “Forty years of vertical-cavity surface-emitting laser: Invention and innovation,” Jpn. J. Appl. Phys. 57(8), 08PA01 (2018). [CrossRef]  

3. K. J. Ebeling, R. Michalzik, and H. Moench, “Vertical-cavity surface-emitting laser technology applications with focus on sensors and three-dimensional imaging,” Jpn. J. Appl. Phys. 57(8), 08PA02 (2018). [CrossRef]  

4. Z. Mor and B. Morgenstein, “Overlapping pattern projector,” (2018).

5. S. McEldowney, “Depth projector system with integrated VCSEL array,” (2012).

6. B. Pesach and Z. Mor, “Projectors of structured light,” (2014).

7. S. Zhang, “High-speed 3D shape measurement with structured light methods: A review,” Opt. Lasers Eng. 106, 119–131 (2018). [CrossRef]  

8. O. Hall-Holt and S. Rusinkiewicz, “Stripe boundary codes for real-time structured-light range scanning of moving objects,” Proc. IEEE Int. Conf. Comput. Vis. 2, 359–366 (2001).

9. Y. An, J.-S. Hyun, and S. Zhang, “Pixel-wise absolute phase unwrapping using geometric constraints of structured light system,” Opt. Express 24(16), 18445–18459 (2016). [CrossRef]  

10. S. T. S. Holmström, U. Baran, and H. Urey, “MEMS laser scanners: A review,” J. Microelectromech. Syst. 23(2), 259–275 (2014). [CrossRef]  

11. C. Winter, L. Fabre, F. Lo Conte, L. Kilcher, F. Kechana, N. Abelé, and M. Kayal, “Micro-beamer based on MEMS micro-mirrors and laser light source,” Procedia Chem. 1(1), 1311–1314 (2009). [CrossRef]  

12. H. Schenk, J. Grahmann, T. Sandner, M. Wagner, U. Dauderstädt, and J. U. Schmidt, “Micro mirrors for high-speed laser deflection and patterning,” Phys. Procedia 56(C), 7–18 (2014). [CrossRef]  

13. Y. Wang, G. Zhou, X. Zhang, K. Kwon, P.-A. Blanche, N. Triesault, K. Yu, and M. C. Wu, “2D broadband beamsteering with large-scale MEMS optical phased array,” Optica 6(5), 557 (2019). [CrossRef]  

14. J. K. Doylend, M. J. R. Heck, J. T. Bovington, J. D. Peters, L. A. Coldren, and J. E. Bowers, “Two-dimensional free-space beam steering with an optical phased array on silicon-on-insulator,” Opt. Express 19(22), 21595–21604 (2011). [CrossRef]  

15. J. Sun, E. Timurdogan, A. Yaacobi, E. S. Hosseini, and M. R. Watts, “Large-scale nanophotonic phased array,” Nature 493(7431), 195–199 (2013). [CrossRef]  

16. X. Zhang, K. Kwon, J. Henriksson, J. Luo, and M. C. Wu, “Large-scale Silicon Photonics Focal Plane Switch Array for Optical Beam Steering,” in Optical Fiber Communication Conference (OFC) 2021 (2021), pp. 5–7.

17. D. Inoue, T. Ichikawa, A. Kawasaki, and T. Yamashita, “Demonstration of a new optical scanner using silicon photonics integrated circuit,” Opt. Express 27(3), 2499 (2019). [CrossRef]  

18. H. Ito, Y. Kusunoki, J. Maeda, D. Akiyama, N. Kodama, H. Abe, R. Tetsuya, and T. Baba, “Wide beam steering by slow-light waveguide gratings and a prism lens,” Optica 7(1), 47 (2020). [CrossRef]  

19. X. Gu, T. Shimada, and F. Koyama, “Giant and high-resolution beam steering using slow-light waveguide amplifier,” Opt. Express 19(23), 22675 (2011). [CrossRef]  

20. M. Nakahama, X. Gu, A. Matsutani, T. Sakaguchi, and F. Koyama, “High power non-mechanical beam scanner based on VCSEL amplifier,” 2016 21st Optoelectron. Commun. Conf. held jointly with 2016 Int. Conf. Photonics Switch. 1, 4–6 (2016).

21. X. Gu, T. Shimada, A. Matsutani, and F. Koyama, “Miniature nonmechanical beam deflector based on bragg reflector waveguide with a number of resolution points larger than 1000,” IEEE Photonics J. 4(5), 1712–1719 (2012). [CrossRef]  

22. S. Hu, A. Hassan, X. Gu, M. Nakahama, S. Shinada, and F. Koyama, “Surface grating VCSEL-integrated amplifier/beam scanner with high power and single mode operation,” Appl. Phys. Express 14(6), 062005 (2021). [CrossRef]  

23. S. Hu, X. Gu, M. Nakahama, and F. Koyama, “Non-mechanical beam scanner based on VCSEL integrated amplifier with resonant wavelength detuning design,” Chin. Opt. Lett. 19(12), 121403 (2021). [CrossRef]  

24. S. Hu, M. Takanohashi, X. Gu, K. Shimura, and F. Koyama, “Lateral Integration of VCSEL and Amplifier with Resonant Wavelength Detuning Design,” in Conference on Lasers and Electro-Optics (OSA, 2019), p. SM4N.3.

25. X. Gu, T. Shimada, A. Fuchida, A. Matsutani, A. Imamura, and F. Koyama, “Beam steering in GaInAs/GaAs slow-light Bragg reflector waveguide amplifier,” Appl. Phys. Lett. 99(21), 211107 (2011). [CrossRef]  

26. T. Jia, Z. Zhou, and H. Gao, “Depth measurement based on infrared coded structured light,” J. Sensors 44(5), 1628–1632 (2014). [CrossRef]  

27. R. Li, Z. Ho, X. Gu, and F. Koyama, “Wide-range Structured-Light Sensing Based on Non-mechanical VCSEL Beam Scanner,” in MOC2019 (2019), pp. 10–11.

28. R. G. Dorsch, G. Häusler, and J. M. Herrmann, “Laser triangulation: fundamental uncertainty in distance measurement,” Appl. Opt. 33(7), 1306 (1994). [CrossRef]  

29. R. Baribeau and M. Rioux, “Centroid fluctuations of speckled targets,” Appl. Opt. 30(26), 3752 (1991). [CrossRef]  

30. D. D. Duncan and S. J. Kirkpatrick, “Algorithms for simulation of speckle (laser and otherwise),” Complex Dyn. Fluctuations Biomed. Photonics V 6855, 685505 (2008). [CrossRef]  

31. J. W. Goodman, “Statistical properties of laser speckle patterns,” in Laser Speckle and Related Phenomena (Springer, 1975), pp. 9–75.

32. D. Ilstrup and R. Manduchi, “Active triangulation in the outdoors: A photometric analysis,” Fifth Int. Symp. 3D Data Process. Vis. Transm., 1–8 (2010).

33. H. Liu, J. Gao, V. P. Bui, Z. Liu, K. E. K. Lee, L.-S. Peh, and C. E. Png, “Detectability of active triangulation range finder: a solar irradiance approach,” Opt. Express 24(13), 14851 (2016). [CrossRef]  

34. X. Ma, C. Rao, and H. Zheng, “Error analysis of CCD-based point source centroid computation under the background light,” Opt. Express 17(10), 8525 (2009). [CrossRef]  

35. R. Li, S. Hu, X. Gu, and F. Koyama, “Compact solid-state vertical-cavity-surface-emitting-laser beam scanning module with ultra-large field of view,” Appl. Phys. Express 14(11), 112005 (2021). [CrossRef]  

36. J. Y. Lee, T. H. Kim, J. U. Bu, and Y. J. Kim, “Speckle reduction using twin green laser diodes and oscillation of MEMS scanning mirror for pico-projector,” in MOC 2015 - Technical Digest of 20th Microoptics Conference (2016).

37. K. Kondo, X. Gu, Z. Ho, A. Matsutani, and F. Koyama, “Two-Dimensional Beam Steering Device Based on VCSEL Slow-Light Waveguide Array with Amplifier Function,” in 2019 Optical Fiber Communications Conference and Exhibition (OFC) (2019), pp. 1–3.

38. J. Maeda, D. Akiyama, H. Ito, H. Abe, and T. Baba, “Prism lens for beam collimation in a silicon photonic crystal beam-steering device,” Opt. Lett. 44(23), 5780 (2019). [CrossRef]  

39. D. Wang, Q. Fan, J. Wang, Z. Zhang, Y. Liang, and T. Xu, “All-dielectric metasurface beam deflector at the visible frequencies,” Guangdian Gongcheng/Opto-Electronic Eng. 44(1), 103–107 (2017). [CrossRef]  

40. T. Aoyagi, Y. Aoyagi, and S. Namba, “High-efficiency blazed grating couplers,” Appl. Phys. Lett. 29(5), 303–304 (1976). [CrossRef]  

41. Y. Kuratomi, K. Sekiya, H. Satoh, T. Tomiyama, T. Kawakami, B. Katagiri, Y. Suzuki, and T. Uchida, “Speckle reduction mechanism in laser rear projection displays using a small moving diffuser,” J. Opt. Soc. Am. A 27(8), 1812 (2010). [CrossRef]  

42. J. Pauwels and G. Verschaffelt, “Laser speckle reduction based on partial spatial coherence and microlens-array screens,” Proc. SPIE (May 2018), 22 (2018).


