Abstract
Fringe projection profilometry is widely used in high-speed three-dimensional (3D) shape measurement. To improve speed without loss of accuracy, we present a novel single-shot 3D shape measuring system that combines a coaxial fringe projection system with a 2CCD camera. The coaxial fringe projection system, comprising a visible light (red, green, and blue) projector and an infrared (IR) light projector, can simultaneously project red, green, blue, and IR fringe patterns. The 2CCD camera, as the name suggests, has two CCD chips that acquire the visible and IR fringe patterns at the same time. By combining the two-step phase-shifting algorithm, Fourier transform profilometry, and the optimum three-frequency selection method, the 3D shape of complex surfaces, such as objects with large slopes or discontinuities, can be recovered from a single-shot acquisition. A virtual fringe projection measurement system is established to generate pre-deformed fringe patterns that correct positional deviations of the coaxial fringe projection system. The method has been applied in simulations and in experiments on static and dynamic objects with promising results.
Published by The Optical Society under the terms of the Creative Commons Attribution 4.0 License. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.
1. INTRODUCTION
During recent years, high-speed three-dimensional (3D) shape measurement has been utilized in various applications, including industrial on-line detection, virtual reality, biometric identification, cultural heritage preservation, reverse engineering, and security systems [1]. Fringe projection profilometry (FPP) is one of the principal techniques, as it provides non-contact, full-field, accurate, and high-speed 3D shape measurement [2,3]. Based on the phase-retrieval procedure, traditional FPP can be classified into two categories: Fourier transform profilometry (FTP) [4,5] and phase-shifting profilometry (PSP) [6–8]. FTP determines the phase from a single fringe pattern at high speed, although overcoming the band-overlapping problem remains a challenge. PSP calculates the phase from the intensities of multiple phase-shifted fringe patterns with high accuracy and resolution, but it needs at least three patterns to retrieve the phase. When complex surfaces are involved, several additional fringe patterns are needed to obtain the absolute phase with a temporal phase-unwrapping algorithm [9,10]. Consequently, it remains a challenge for FPP to use fewer fringe patterns, obtain a higher-quality phase, and retrieve the absolute phase. Many high-speed techniques have been proposed, including a dual-frequency phase-shifting algorithm [11], a modified two-plus-one phase-shifting algorithm [12], methods utilizing the RGB color channels of a projector [6,13,14], and color-encoded approaches [15,16]. Binary patterns have also been demonstrated to achieve high projection speed [17–19]. A secondary color camera combined with an infrared (IR) fringe projection system has also been proposed to simultaneously capture natural 2D color texture and 3D shape in real time [20,21].
In this paper, we propose a single-shot measurement approach that increases measurement speed by using fewer fringe patterns. It can obtain a high-quality absolute phase and overcome the difficulty of retrieving the absolute phase of objects that are discontinuous or have large slopes. First, we present the coaxial projection measurement system, comprising an IR light projector (ILP), a visible light projector (VLP), a beam splitter (BS), and a dual-sensor charge-coupled device (2CCD) camera. The coaxial fringe projection system projects two fringe patterns (one composite color fringe pattern and one IR fringe pattern) onto the object, and the 2CCD camera captures the two deformed fringe patterns synchronously. Then we propose a phase-extraction algorithm that combines the two-step phase-shifting (TPS) algorithm [22], FTP, and the optimum three-frequency selection method [23], allowing the 3D shape of objects to be recovered from a single-shot measurement. In addition, a virtual fringe projection system is established to obtain pre-deformed fringe patterns that correct positional deviations of the coaxial fringe projection measurement system. Finally, several complex test surfaces are used to validate the proposed method, including a mask, a set of steps, and a moving human hand. The remainder of this paper is organized as follows: Section 2 describes the principle of the proposed single-shot measurement; Section 3 presents the technique to correct deviations of the coaxial projection system; simulations and actual experiments are reported in Section 4; and Section 5 presents the conclusions.
2. PRINCIPLE
In this section, we describe the principle of FPP, the coaxial projection measurement system, and phase calculation of the proposed single-shot measurement.
A. Fringe Projection Profilometry
Figure 1 shows the geometry of FPP, in which the optical axis of the projector lens intersects the optical axis of the camera lens at a point on the reference plane, the plane from which object height is measured. The unwrapped-phase difference between a point on the measured object and the corresponding point on the reference plane encodes the height. The two optical axes subtend a fixed angle, the camera and projector centers lie at a known distance from the reference plane and at a known baseline distance from each other, and the projected fringe pattern has a known period on a virtual plane perpendicular to the projection axis.
The relationship between the unwrapped phase and the height is a complicated function of the pixel coordinates and the system geometry, and can be expressed in closed form in terms of these parameters [24].
In a practical system, however, it is difficult to determine the height from the parameters of the optical setup, since the geometric parameters (the camera-projector baseline, the distance to the reference plane, the angle between the optical axes, and the fringe period) cannot be measured accurately. Instead, we use a polynomial-based calibration method to build the relationship between the unwrapped phase and the height, in which a coefficient set containing the systematic parameters is fitted for each pixel.
B. Coaxial Projection Measurement System
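The polynomial-based calibration can be sketched as follows: unwrapped phase maps of calibration planes at several known heights are used to fit, per pixel, a polynomial from phase to height, which is then evaluated on a measured phase map. This is a minimal NumPy sketch; the function names, array shapes, and polynomial order are our own illustrative choices, not the authors' implementation.

```python
import numpy as np

def fit_phase_to_height(phases, heights, order=3):
    """Fit, per pixel, a polynomial h = sum_n a_n * phi**n.

    phases  : (K, H, W) unwrapped phase maps of K calibration planes
    heights : (K,) known heights of those planes
    returns : (order+1, H, W) coefficient maps, highest power first
    """
    K, H, W = phases.shape
    coeffs = np.empty((order + 1, H, W))
    for i in range(H):
        for j in range(W):
            coeffs[:, i, j] = np.polyfit(phases[:, i, j], heights, order)
    return coeffs

def phase_to_height(coeffs, phase):
    """Evaluate the fitted per-pixel polynomial (Horner's scheme)."""
    h = np.zeros_like(phase)
    for a in coeffs:          # coefficients, highest power first
        h = h * phase + a
    return h
```

In practice a lower polynomial order with more calibration planes gives a better-conditioned fit than exact interpolation.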
The schematic diagram of the coaxial projection system is shown in Fig. 2. It comprises an ILP, a VLP, a BS, and a 2CCD camera. The BS splits incident light equally, with 50% transmittance and 50% reflectance. The visible light projected by the VLP is transmitted through the BS to the object, while the IR light projected by the ILP is reflected by the BS. The virtual image of the ILP thus overlaps the VLP, so the two projectors occupy identical equivalent spatial positions and can together project red, green, blue, and IR fringe patterns. The 2CCD camera has two chip sensors that acquire the visible color image and the IR image synchronously.
Four sinusoidal fringe patterns are encoded into four color channels, forming one composite color fringe pattern and one IR fringe pattern. The two fringe patterns are simultaneously projected onto the surface of the object by the coaxial projection system, and the 2CCD camera synchronously captures the deformed composite and IR fringe patterns. With the proposed phase-calculation algorithm, we can then recover the absolute phase of the object from a single-shot measurement.
C. Phase Calculation
In the current approach, the composite color fringe pattern carries two sinusoidal fringe patterns in its red and blue channels, which establish TPS for wrapping the primary phase, and one sinusoidal fringe pattern in its green channel, which is processed by FTP. Additionally, one IR sinusoidal fringe pattern is also processed by FTP. The four ideal independent patterns for the four channels are shown in Fig. 3. The intensity of each channel can be described as
I_i(x, y) = A_i(x, y) + B_i(x, y) cos[2π f_i x + φ(x, y) + δ_i],   i ∈ {R, B, G, IR},
where I_i, A_i, and B_i denote the intensity, background intensity, and intensity modulation of the red, blue, green, and IR components, f_i denotes the fundamental frequency of the sinusoidal fringe in channel i, δ_i is the assigned phase shift of that channel, and φ(x, y) is the phase to be solved.
1. Phase Calculation by TPS
After removing the background intensity using the mean envelope method [25], the two sinusoidal fringes in the red and blue components of the composite fringe pattern reduce to pure cosine terms that differ only by the assigned phase shift.
The wrapped phase can then be computed from the arctangent of the ratio of these two background-free signals.
2. Phase Calculation by FTP
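As a sketch of the TPS step, assume an assigned shift of π/2 between the background-free red and blue signals, so that I_r = B cos φ and I_b = −B sin φ; the wrapped phase then follows from a four-quadrant arctangent. The π/2 value is our assumption for illustration (the paper states only that a phase shift is assigned between the two channels).

```python
import numpy as np

def tps_wrapped_phase(i_red, i_blue):
    """Two-step phase-shifting with an assumed pi/2 shift:
    i_red = B*cos(phi), i_blue = B*cos(phi + pi/2) = -B*sin(phi),
    both already background-free.  Returns phi wrapped to (-pi, pi]."""
    return np.arctan2(-i_blue, i_red)

# demo: recover a known 36-fringe phase ramp
x = np.linspace(0.0, 1.0, 512)
phi = 2 * np.pi * 36 * x
wrapped = tps_wrapped_phase(np.cos(phi), np.cos(phi + np.pi / 2))
```

The accuracy of this two-step scheme depends on how completely the background term has been removed, which is why the mean envelope method [25] is applied first.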
In the 2D FTP case, the green channel of the composite color fringe pattern and the IR fringe pattern can each be rewritten as the sum of a complex fringe term and its complex conjugate.
After background intensity removal, each pattern is Fourier transformed with respect to the spatial coordinates. A filter function is applied to isolate the fundamental spectral component, and the wrapped phase is then extracted as the arctangent of the ratio of the imaginary and real parts of the inverse Fourier transform of the filtered spectrum. While utilizing FTP to calculate the wrapped phase, leakage effects are inevitable, especially with discontinuous surfaces. The phase errors in the neighborhood of discontinuities can lead to inaccurate reconstructions; however, they can be eliminated by setting an appropriate low-modulation threshold to identify valid data [14].
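The FTP pipeline described above can be sketched row by row: transform, keep a band around the positive carrier frequency, inverse transform, and take the angle of the result. This is a minimal sketch; the carrier frequency and window half-width in the demo are illustrative assumptions, and a practical implementation would use a smooth 2D band-pass filter.

```python
import numpy as np

def ftp_wrapped_phase(fringe, f0, half_width):
    """1D Fourier transform profilometry along rows.

    fringe     : (H, W) deformed pattern, background already removed
    f0         : carrier frequency in cycles per image width
    half_width : half-size of the band-pass window around +f0
    """
    F = np.fft.fft(fringe, axis=1)
    W = fringe.shape[1]
    mask = np.zeros(W)
    mask[f0 - half_width : f0 + half_width + 1] = 1.0  # keep +f0 lobe only
    analytic = np.fft.ifft(F * mask, axis=1)           # complex fringe term
    return np.angle(analytic)                          # wrapped phase
```

Keeping only the positive-frequency lobe suppresses the conjugate term, so the angle of the filtered signal is the wrapped carrier-plus-object phase.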
The measurement range of traditional FTP is limited by the requirement that the fundamental spectral component not overlap the zero-order component [26].
After removing the background intensity of the deformed fringe pattern, the zero-order component is subtracted from its Fourier spectrum. Compared to traditional FTP, the measurement range can thus be extended by nearly three times. Overall, three sets of wrapped phases are obtained. To gain the unwrapped phase, we use the optimum three-frequency selection method with fringe numbers of 36, 35, and 30 to resolve the fringe-order ambiguity, as the beat obtained between the 36- and 35-fringe patterns is a single fringe over the full field of view [23]. This allows the unwrapped phase of complex objects with large slopes and discontinuities to be calculated point by point independently.
3. DEVIATION CORRECTION OF COAXIAL PROJECTION SYSTEM
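A minimal sketch of the optimum three-frequency unwrapping for fringe numbers 36, 35, and 30, with wrapped phases taken in [0, 2π): the beat of 36 and 35 gives one fringe across the field, and the beat of 36 and 30 gives six fringes, used as an intermediate level for noise robustness. The two-level hierarchy follows ref. [23], but the implementation details here are our own.

```python
import numpy as np

TWO_PI = 2 * np.pi

def wrap(p):
    """Wrap a phase into [0, 2*pi)."""
    return np.mod(p, TWO_PI)

def three_freq_unwrap(phi36, phi35, phi30):
    """Unwrap the 36-fringe phase using beats between the three maps.
    36 - 35 = 1 fringe over the field; 36 - 30 = 6 fringes."""
    beat1 = wrap(phi36 - phi35)          # single fringe: no ambiguity
    beat6 = wrap(phi36 - phi30)          # six fringes
    # unwrap the 6-fringe beat against the scaled single-fringe beat
    k6 = np.round((6 * beat1 - beat6) / TWO_PI)
    Phi6 = beat6 + TWO_PI * k6
    # unwrap the 36-fringe phase against the scaled 6-fringe phase
    k36 = np.round((6 * Phi6 - phi36) / TWO_PI)
    return phi36 + TWO_PI * k36
```

Because each fringe order is computed by rounding, every pixel is unwrapped independently, which is what allows discontinuous surfaces to be handled.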
The accuracy of the coaxial projection system cannot be guaranteed by hardware adjustment alone; a robust correspondence must be established between the two projectors. A virtual fringe projection system based on a camera and projector model is proposed to quantify the impact of positional deviations on the phase calculation and to determine the acceptable range of position deviations of the coaxial projection system. After the coaxial projectors are calibrated synchronously, pre-deformed fringe patterns can be generated by the virtual fringe projection system to correct the positional deviations.
A. Virtual Fringe Projection System
The virtual projection system based on the camera and projector model is shown in Fig. 4.
1. Camera and Projector Model
Figure 4 defines the camera, projector, and world coordinate systems, together with the camera and projector image coordinate systems. An arbitrary point on the object's surface has a corresponding imaging point on the camera imaging plane and a corresponding projection point on the projector projection plane.
The relationship between a point on the object and its imaging point can be described by the pinhole camera model [27]:
s·m̃ = K[R | t]M̃,
where m̃ is the homogeneous coordinate of the imaging point in the camera image coordinate system, M̃ is the homogeneous coordinate of the point in the world coordinate system, s is a scale factor, and K is the intrinsic parameter matrix of the camera, containing the principal point and the equivalent focal lengths along the two orthogonal directions of the camera image plane. The extrinsic parameter matrix [R | t] represents the relative rotation and translation between the world and camera coordinate systems, and K[R | t] is the projection matrix of the camera. The pinhole model above is an ideal model that does not account for camera lens distortion; in an actual measurement system, the camera lens usually exhibits nonlinear optical distortion. In practice, the lens distortion is modeled as
x_d = x(1 + k₁r² + k₂r⁴) + 2p₁xy + p₂(r² + 2x²),
y_d = y(1 + k₁r² + k₂r⁴) + p₁(r² + 2y²) + 2p₂xy,   with r² = x² + y²,
where (x, y) is the normalized location of a distortion-free point, (x_d, y_d) is the corresponding distorted point in the imaging plane, k₁ and k₂ are the radial distortion coefficients, and p₁ and p₂ are the tangential distortion coefficients. Similarly, the projector can be viewed as an inverse camera with the same form of lens distortion [28], so the relationship between a point on the object and its projection point is established by an analogous projection matrix: the projector's intrinsic parameter matrix contains its principal point and the focal lengths along the two axes of the projection plane, and its extrinsic parameter matrix relates the world coordinate system to the projector coordinate system, with a 3×3 rotation matrix (associated with a rotation vector) and a 3×1 translation vector.
2. Simulation of the Imaged Object in the Camera Coordinate System
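The camera model above can be sketched directly: transform a world point into the camera frame, normalize, apply the radial and tangential distortion, and map through the intrinsics. The numeric values in the usage example are illustrative assumptions.

```python
import numpy as np

def project_point(Xw, K, R, t, dist):
    """Project a 3D world point to pixel coordinates with the pinhole
    model s*m = K [R|t] Xw and the Brown distortion model.
    dist = (k1, k2, p1, p2)."""
    k1, k2, p1, p2 = dist
    Xc = R @ Xw + t                        # world -> camera frame
    x, y = Xc[0] / Xc[2], Xc[1] / Xc[2]    # normalized, distortion-free
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    u = K[0, 0] * xd + K[0, 2]             # apply intrinsics
    v = K[1, 1] * yd + K[1, 2]
    return u, v
```

With the distortion coefficients set to zero, this reduces to the ideal pinhole projection; the projector is modeled with the same function, interpreted as an inverse camera.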
The virtual projection system is set up in the camera coordinate system, and the surface of the measured object is expressed analytically in this system.
For each point on the object, the corresponding pixel coordinates can be obtained through the camera model; with the known point in the world coordinate system and its imaging point in the camera image coordinate system, the projection matrix of the camera can be calculated by Eq. (15), which can be multiplied out and rewritten in component form.
Similarly, the projection matrix of the projector can be calculated by Eq. (18). Substituting the same object point into Eq. (18) establishes the correspondence between the camera and the projector, and substituting Eq. (20) into Eq. (22) gives this correspondence in its final form, Eq. (23).
Camera intrinsic parameters such as the principal point and focal lengths can be predefined, so for each object point the corresponding camera pixel coordinate can be obtained, and the corresponding projector pixel coordinate then follows from Eq. (23). Hence, we have established the correspondence between the camera image coordinates of the measured object and the projector image coordinates. Using the principle of fringe projection, the corresponding deformed fringe patterns can be generated.
3. Simulation System Verification
To verify the validity of the proposed simulation system, a simulated experiment is conducted, as shown in Fig. 5.
The intrinsic parameters of the virtual camera and the virtual projector in the simulation experiment are identical in resolution, pixel size, and focal length. The extrinsic parameters are set by the distance from the reference plane to the centers of the virtual projector and virtual camera, and the reference plane is defined in the virtual camera coordinate system.
The virtual projector projects a fringe pattern with a fringe number of 36 onto the reference plane, and the virtual camera captures the deformed fringe pattern, as shown in Fig. 6. Figure 6 shows that the captured fringe pattern is identical to the fringe pattern projected by the virtual projector, which is consistent with the actual arrangement.
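The verification above can be reproduced with a short NumPy sketch: each camera pixel is back-projected onto the reference plane, transformed into the projector frame, and used to sample the projected pattern. When the camera and projector share identical intrinsic and extrinsic parameters, the captured image equals the projected one, consistent with Fig. 6. The nearest-neighbour sampling and the small image size are simplifying assumptions of this sketch.

```python
import numpy as np

def capture_plane(Kc, Kp, Rcp, tcp, z0, pattern):
    """Virtual camera image of the plane z = z0 (camera frame) lit by a
    projector with intrinsics Kp; Rcp, tcp map the camera frame to the
    projector frame.  Camera and projector share the pattern's resolution."""
    Hp, Wp = pattern.shape
    u, v = np.meshgrid(np.arange(Wp), np.arange(Hp))
    x = (u - Kc[0, 2]) / Kc[0, 0] * z0        # back-project pixels
    y = (v - Kc[1, 2]) / Kc[1, 1] * z0        # onto the plane
    Q = np.stack([x, y, np.full_like(x, float(z0))], axis=-1) @ Rcp.T + tcp
    up = Kp[0, 0] * Q[..., 0] / Q[..., 2] + Kp[0, 2]
    vp = Kp[1, 1] * Q[..., 1] / Q[..., 2] + Kp[1, 2]
    ui = np.clip(np.round(up).astype(int), 0, Wp - 1)  # nearest-neighbour
    vi = np.clip(np.round(vp).astype(int), 0, Hp - 1)  # pattern sampling
    return pattern[vi, ui]
```

Nonzero Rcp and tcp give the deformed (pre-deformed) pattern used later for deviation correction.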
4. Analyzing the Impact of Positional Deviations on Phase Calculation
To analyze the effect of position deviations of the two projectors on the phase calculation, the system is simulated with realistic parameters. A CCD camera with a 12 mm lens is selected as the camera, and two digital light processing (DLP) projectors with 12 mm lenses are selected as the projectors. The distances from the centers of the virtual camera and the virtual projectors to the reference plane match the physical setup. It is assumed that the two projectors are not completely coaxial, with a residual rotation vector and translation vector between them.
The two virtual projectors project fringe patterns with the optimum fringe numbers onto a planar board. One set of fringe patterns is projected entirely by a single projector, while the other set is projected partly by one projector and partly by the other. The two sets of unwrapped phases are shown in Figs. 7(a) and 7(b); it is clear that the phase in Fig. 7(b) is inaccurate.
The deviations of the coaxial projectors consist of six parameters: a rotation vector and a translation vector. The rotation vector represents the angles of rotation around the x, y, and z axes, and the rotation vector and rotation matrix can be interconverted by the Rodrigues formula.
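The Rodrigues conversion mentioned above can be sketched directly in NumPy (OpenCV's cv2.Rodrigues implements the same formula):

```python
import numpy as np

def rodrigues(rvec):
    """Rotation vector -> rotation matrix via the Rodrigues formula
    R = I + sin(theta) K + (1 - cos(theta)) K^2, where K = [n]_x is the
    skew-symmetric matrix of the unit rotation axis n."""
    rvec = np.asarray(rvec, dtype=float)
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)           # zero rotation
    n = rvec / theta
    K = np.array([[0.0, -n[2], n[1]],
                  [n[2], 0.0, -n[0]],
                  [-n[1], n[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
```

The inverse conversion (matrix to vector) follows from the rotation angle arccos((tr R − 1)/2) and the axis read off the antisymmetric part of R.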
To obtain the allowable deviations of the coaxial projectors, we vary one parameter while fixing the other five at zero. For each setting, the two sets of fringe patterns are projected onto the planar board and the error between the two resulting phase maps is computed. Each parameter is increased from 0 in reasonable steps; the maximum allowable deviation is the largest value for which the phase error remains negligible.
Repeating this process for the remaining parameters yields the allowable range of each. The allowable deviations of the coaxial projectors are listed in Table 1.
B. Deviation Correction of Coaxial Projectors
Step 1: Calibrate the two projectors and obtain their calibration parameters.
To calibrate a projector like a camera, "captured images" need to be created for the projector. A calibration board with an array of black circular rings printed on a flat white board is positioned in several different orientations. Using the phase mapping method, the captured images can be obtained [29]. The projector can then be calibrated with the MATLAB calibration toolbox in essentially the same way as a camera.
For simplicity, we denote the VLP and the ILP as projectors 1 and 2, with the corresponding projection points expressed in homogeneous coordinates in their respective projection image coordinate systems.
Step 2: Calculate the position deviations of the two projectors. With the same world coordinates, the transformation matrix (the position deviations) from one projector to the other can be calculated from the two sets of extrinsic parameters.
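With extrinsics mapping the world frame to each projector frame, X_i = R_i X_w + t_i, the relative pose follows as R₂₁ = R₂R₁ᵀ and t₂₁ = t₂ − R₂₁t₁; a minimal sketch under that convention (the function name is ours):

```python
import numpy as np

def relative_pose(R1, t1, R2, t2):
    """Given extrinsics X_i = R_i @ X_w + t_i for projectors 1 and 2,
    return (R21, t21) mapping the frame of projector 1 to projector 2."""
    R21 = R2 @ R1.T
    t21 = t2 - R21 @ t1
    return R21, t21
```

For a perfectly coaxial pair, R₂₁ would be the identity and t₂₁ zero; the residuals are the position deviations to be corrected.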
Step 3: Obtain a pre-deformed fringe pattern using the simulation system established in Section 3.A. When the intrinsic and extrinsic parameters of the virtual camera are identical to those of the virtual projector, the fringe patterns captured by the virtual camera on the reference plane are identical to the patterns projected. Hence, the intrinsic and extrinsic parameters of the virtual camera are set identical to those of the ILP. Simulated sinusoidal fringes are projected by the virtual VLP onto the reference plane, and the deformed fringe patterns are captured by the virtual camera. This process is equivalent to the projection transformation from the VLP to the ILP: a sinusoidal fringe pattern projected by the VLP undergoes the rotation and translation transformations and becomes the pre-deformed fringe pattern to be projected by the ILP. The required pre-deformed fringe pattern for the ILP is therefore the deformed fringe pattern captured by the virtual camera.
C. Deviation Correction Experiment
To validate the proposed method, the coaxial projection measurement system is set up as shown in Fig. 8. In the current setup, the VLP is a TI LightCrafter 4500. The ILP is also a TI LightCrafter 4500, whose digital micromirror device and light source have been replaced by the corresponding IR components. The 2CCD camera is a JAI AD-080GE with a 12 mm prime lens.
As aforementioned, a total of nine different poses were used for projector calibration. The captured images created for projector calibration are shown in Fig. 9. Both the VLP and the ILP are calibrated; their intrinsic parameters and lens distortion coefficients are listed in Table 2. The reprojection errors of the two projectors are shown in Fig. 10; the average reprojection errors of the VLP and ILP are 0.2690 pixels and 0.2210 pixels, respectively. The transformation matrices of the two projectors are calculated by Eq. (27).
From the measured deviations of the coaxial projectors, the pre-deformed fringe patterns are then generated by the simulation system.
The VLP and ILP are calibrated again after deviation correction, and the residual deviations of the coaxial projectors are shown in Table 3. It can be observed that the position deviations are much smaller than the allowable deviations given in Table 1.
To further verify the effect of deviation correction in actual measurements, a set of phase-shifted fringe patterns is projected onto a planar board using three methods: (1) single projector; (2) coaxial projectors without deviation correction; and (3) coaxial projectors after deviation correction. The profile along row 130 for three unwrapped phase maps of the planar board is calculated as shown in Fig. 11.
It can be clearly seen that the unwrapped phase obtained by coaxial projectors after deviation correction (dotted line) is nearly identical to the single projector case (solid line); the unwrapped phase obtained by coaxial projectors without deviation correction (dashed line) and the single projector case (solid line) differ greatly in phase value. The partially enlarged view is shown in the black rectangle.
By using the unwrapped phase of the single projector as the standard data, we calculate the phase errors of the coaxial projectors without deviation correction and after deviation correction. The profiles along row 130 of the phase errors are shown in Figs. 12(a) and 12(b), respectively. The root mean square (RMS) errors in Figs. 12(a) and 12(b) are 0.277 rad and 0.031 rad, indicating that the deviation correction method is effective.
4. EXPERIMENTS
To demonstrate the validity of the proposed method, simulated and actual experiments are carried out in Sections 4.A and 4.B, respectively.
A. Simulations and Results
To illustrate the performance of the proposed phase-extraction methods, a simulated height distribution given by the MATLAB peaks function is used, as shown in Fig. 13(a):
z = 3(1 − x)²e^(−x² − (y+1)²) − 10(x/5 − x³ − y⁵)e^(−x² − y²) − (1/3)e^(−(x+1)² − y²).
In an actual measurement, the reconstructed phase is affected by random noise in the captured fringe patterns, so 3% random noise is added to the gray scale of the original fringe patterns; the deformed fringe pattern of the simulated object is shown in Fig. 13(b). Taking three sets of fringe patterns with the optimum fringe numbers of 36, 35, and 30, wrapped phase maps are obtained by traditional FTP, the improved FTP, the proposed method (one wrapped phase map by TPS and two by the improved FTP), and a four-step phase-shifting algorithm. The unwrapped phase maps are then obtained by the optimum three-frequency selection method, as shown in Fig. 14. In Fig. 14(a), phase errors appear at the positions of the large slopes, as traditional FTP is limited by the gradient of the object. Figure 14(b) shows that the improved FTP calculates the phase of the large slopes accurately. Using the four-step phase-shifting result [Fig. 14(d)] as the criterion, the profiles along row 760 of the phase errors of the other three maps are calculated, as shown in Fig. 15. The RMS errors are 0.1316 rad, 0.0408 rad, and 0.0099 rad, respectively. These data demonstrate that the proposed method has much higher accuracy than the other two.
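The simulated object and its noisy deformed pattern can be generated as follows; the fringe direction, the phase-to-height scale, and the noise model (uniform, 3% of full scale) are illustrative assumptions of this sketch.

```python
import numpy as np

def peaks(x, y):
    """MATLAB's peaks function."""
    return (3 * (1 - x) ** 2 * np.exp(-x ** 2 - (y + 1) ** 2)
            - 10 * (x / 5 - x ** 3 - y ** 5) * np.exp(-x ** 2 - y ** 2)
            - np.exp(-(x + 1) ** 2 - y ** 2) / 3)

def deformed_fringe(height, n_fringes, phase_per_unit_height,
                    noise=0.03, seed=0):
    """Deformed pattern with n_fringes across the width, the object phase
    proportional to height, and additive random noise."""
    H, W = height.shape
    x = np.arange(W) / W
    phase = 2 * np.pi * n_fringes * x + phase_per_unit_height * height
    rng = np.random.default_rng(seed)
    pattern = 0.5 + 0.5 * np.cos(phase)
    return pattern + noise * rng.uniform(-1, 1, size=(H, W))
```

Generating the 36-, 35-, and 30-fringe patterns from the same height map reproduces the input data of the simulation.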
B. Actual Experiments
To verify the performance of the proposed coaxial projection system after deviation correction, the 3D shape of a mask, a set of steps, and a human hand are measured. The crosstalk coefficient calibration method is used to calculate the crosstalk coefficient between the color channels and compensate for the energy leaked from other channels.
One composite color fringe pattern, with an optimum fringe number of 36 and an assigned phase shift between its red and blue components, and an optimum fringe number of 35 in its green component, is projected by the VLP, as shown in Fig. 16(a). The pre-deformed fringe pattern with an optimum fringe number of 30 is encoded into the IR channel of the ILP, as shown in Fig. 16(b). The deformed composite RGB and IR fringe patterns of the mask captured by the 2CCD camera are shown in Figs. 16(c) and 16(d), respectively. TPS is applied to the red and blue components of the composite RGB fringe pattern to obtain the wrapped phase shown in Fig. 17(a). FTP is applied to the green component and the IR component to obtain two more wrapped phase maps, shown in Figs. 17(b) and 17(c). Using the optimum three-frequency method, the unwrapped phase of the mask is obtained, as shown in Fig. 17(d). Finally, with the calibrated system, the 3D shape data of the mask are reconstructed as shown in Fig. 18.
Following a similar methodology, a set of steps and a human hand are also measured. The captured deformed composite RGB and IR fringe patterns of the steps are shown in Figs. 19(a) and 19(b), the phase maps of the steps are shown in Fig. 20, and the corresponding 3D shape data are shown in Fig. 21. 3D shape data of the moving human hand, taken from the reconstructed 3D movie, are shown in Fig. 22. These measurements show that the system developed in this paper can measure both discontinuous objects and dynamic objects.
5. CONCLUSION
In this paper, a novel 3D shape measurement system is proposed based on a coaxial fringe projection system. It can simultaneously project red, green, blue, and IR fringe patterns onto the measured objects, and a 2CCD camera can simultaneously acquire the visible color image along with the IR image. In combination with TPS, FTP, and the optimum three-frequency selection method, 3D shape measurement of complex surfaces such as discontinuous objects can be performed with a single measurement. Additionally, a virtual fringe projection measurement system has been established in order to correct deviations of the coaxial fringe projection system. It can be implemented in a straightforward manner for a wide range of applications where correspondence needs to be established between multiple projection systems. The simulation results and experiments on static and dynamic objects demonstrate that the developed method is suitable for fast 3D shape measurement of complex objects.
Funding
National Key R&D Program of China (2017YFF0106404); National Natural Science Foundation of China (NSFC) (51675160); Talents Project Training Funds in Hebei Province (A201500503); Innovative and Entrepreneurial Talent Project Supported by Jiangsu Province (2016A377); Engineering and Physical Sciences Research Council (EPSRC) (EP/P006930/1).
REFERENCES
1. J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recogn. 43, 2666–2680 (2010). [CrossRef]
2. S. S. Gorthi and P. Rastogi, “Fringe projection techniques: whither we are?” Opt. Lasers Eng. 48, 133–140 (2010). [CrossRef]
3. F. Chen, G. M. Brown, and M. M. Song, “Overview of 3-D shape measurement using optical methods,” Opt. Eng. 39, 10–22 (2000). [CrossRef]
4. M. Takeda, Q. Gu, M. Kinoshita, H. Takai, and Y. Takahashi, “Frequency-multiplex Fourier-transform profilometry: a single-shot three-dimensional shape measurement of objects with large height discontinuities and/or surface isolations,” Appl. Opt. 36, 5347–5354 (1997). [CrossRef]
5. L. C. Chen, H. W. Ho, and X. L. Nguyen, “Fourier transform profilometry (FTP) using an innovative band-pass filter for accurate 3-D surface reconstruction,” Opt. Lasers Eng. 48, 182–190 (2010). [CrossRef]
6. P. S. Huang, C. P. Zhang, and F. P. Chiang, “High-speed 3-D shape measurement based on digital fringe projection,” Opt. Eng. 42, 163–168 (2003). [CrossRef]
7. X. Y. Su, G. V. Bally, and D. Vukicevic, “Phase-stepping grating profilometry: utilization of intensity modulation analysis in complex objects evaluation,” Opt. Commun. 98, 141–150 (1993). [CrossRef]
8. D. S. Mehta, S. K. Dubey, M. M. Hossain, and C. Shakher, “Simple multifrequency and phase-shifting fringe-projection system based on two-wavelength lateral shearing interferometry for three-dimensional profilometry,” Appl. Opt. 44, 7515–7521 (2005). [CrossRef]
9. J. P. Zhu, P. Zhou, X. Y. Su, and Z. S. You, “Accurate and fast 3D surface measurement with temporal-spatial binary encoding structured illumination,” Opt. Express 24, 28549–28560 (2016). [CrossRef]
10. C. E. Towers, D. P. Towers, and J. D. C. Jones, “Absolute fringe order calculation using optimized multi-frequency selection in full-field profilometry,” Opt. Lasers Eng. 43, 788–800 (2005). [CrossRef]
11. K. Liu, Y. C. Wang, D. L. Lau, Q. Hao, and L. G. Hassebrook, “Dual-frequency pattern scheme for high-speed 3-D shape measurement,” Opt. Express 18, 5229–5244 (2010). [CrossRef]
12. S. Zhang and S. T. Yau, “High-speed three-dimensional shape measurement system using a modified two-plus-one phase-shifting algorithm,” Opt. Eng. 46, 113603 (2007). [CrossRef]
13. S. Zhang and P. S. Huang, “High-resolution, real-time three-dimensional shape measurement,” Opt. Eng. 45, 1269–1278 (2006). [CrossRef]
14. Z. H. Zhang, D. P. Towers, and C. E. Towers, “Snapshot color fringe projection for absolute three-dimensional metrology of video sequences,” Appl. Opt. 49, 5947–5953 (2010). [CrossRef]
15. W. H. Su, “Color-encoded fringe projection for 3D shape measurements,” Opt. Express 15, 13167–13181 (2007). [CrossRef]
16. W. Y. Liu, Z. Q. Wang, G. G. Mu, and Z. L. Fang, “Color-coded projection grating method for shape measurement with a single exposure,” Appl. Opt. 39, 3504–3508 (2000). [CrossRef]
17. S. Zhang, “Recent progresses on real-time 3D shape measurement using digital fringe projection techniques,” Opt. Lasers Eng. 48, 149–158 (2010). [CrossRef]
18. S. Zhang, D. V. D. Weide, and J. Oliver, “Superfast phase-shifting method for 3-D shape measurement,” Opt. Express 18, 9684–9689 (2010). [CrossRef]
19. J. S. Hyun, B. W. Li, and S. Zhang, “High-speed high-accuracy three-dimensional shape measurement using digital binary defocusing method versus sinusoidal method,” Opt. Eng. 56, 074102 (2017). [CrossRef]
20. P. Ou, B. W. Li, Y. J. Wang, and S. Zhang, “Flexible real-time natural 2D color and 3D shape measurement,” Opt. Express 21, 16736–16741 (2013). [CrossRef]
21. Y. J. Xu, C. Chen, S. J. Huang, and Z. H. Zhang, “Simultaneously measuring 3D shape and colour texture of moving objects using IR and colour fringe projection techniques,” Opt. Lasers Eng. 61, 1–7 (2014). [CrossRef]
22. A. H. Phan, M. L. Piao, J. H. Park, and N. Kim, “Error analysis in parallel two-step phase-shifting method,” Appl. Opt. 52, 2385–2393 (2013). [CrossRef]
23. Z. H. Zhang, C. E. Towers, and D. P. Towers, “Time efficient color fringe projection system for 3D shape and color using optimum 3-frequency selection,” Opt. Express 14, 6444–6455 (2006). [CrossRef]
24. Z. H. Zhang, S. J. Huang, S. S. Meng, F. Gao, and X. Q. Jiang, “A simple, flexible and automatic 3D calibration method for a phase calculation-based fringe projection imaging system,” Opt. Express 21, 12218–12227 (2013). [CrossRef]
25. X. X. Zhang, Y. M. Wang, S. J. Huang, N. Gao, and Z. H. Zhang, “A two-step phase-shifting algorithm for phase calculation,” Acta Photon. Sinica 46, 0311005 (2017). [CrossRef]
26. X. Y. Su and W. J. Chen, “Fourier transform profilometry: a review,” Opt. Lasers Eng. 35, 263–284 (2001). [CrossRef]
27. Z. Y. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Mach. Intell. 22, 1330–1334 (2000). [CrossRef]
28. B. Li, N. Karpinsky, and S. Zhang, “Novel calibration method for structured-light system with an out-of-focus projector,” Appl. Opt. 53, 3415–3426 (2014). [CrossRef]
29. S. J. Huang, L. L. Xie, Z. Y. Wang, Z. H. Zhang, F. Gao, and X. Q. Jiang, “Accurate projector calibration method by using an optical coaxial camera,” Appl. Opt. 54, 789–795 (2015). [CrossRef]