Optica Publishing Group

One-shot colored reflectance direction field imaging system for optical inspection

Open Access

Abstract

Detecting microscale defects on the surface of an object is often difficult with conventional cameras. Microscale defects are known to greatly affect the bidirectional reflectance distribution function (BRDF) of light rays reflected from the surface. Therefore, an imaging system for capturing the reflectance direction field by color mapping using a multicolor filter placed in front of an imaging lens is proposed, which can have a simple structure. From the color variations of light rays passing through several different color regions of the multicolor filter, this imaging system can detect the extent of broadening of the BRDF. The effectiveness of the imaging system for optical inspection is experimentally validated by testing it on a plastic surface that has a shallow scratch with a depth of a few micrometers.

© 2023 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. INTRODUCTION

In many manufacturing processes, optical inspection systems for detecting defects on object surfaces have been widely developed using conventional cameras [1,2]. However, if the height variations of a defect occur at the micrometer scale, the defect is difficult to detect in images captured by conventional cameras. This is because microscale defects are poorly contrasted against the surrounding regions in such images.

The bidirectional reflectance distribution function (BRDF) describes several features of a material surface [3–8]. Therefore, it would be possible to detect such microscale defects if the BRDF were acquired. However, conventional BRDF measurement methods require a goniometric device that moves a photonic receiver to each direction of reflected light at each point on the object surface, and such measurement with high directional resolution is time consuming. A color mapping imaging system for capturing reflectance directions, equipped with a multicolor filter, has therefore been developed [9–14]. The imaging system enables the identification of reflectance directions based on the corresponding colors at each pixel. Since the imaging system can thus capture the BRDF information by colors in an image captured in one shot, it is here called a one-shot BRDF imaging system, or one-shot BRDF for brevity.

The angular distribution of the BRDF of a typical microscale defect tends to broaden. Consequently, the color variations in the image of the defect captured by the one-shot BRDF imaging system are likely to be greater than those of a flat surface. Therefore, it is possible to detect the defect by using the one-shot BRDF. However, the one-shot BRDF requires that a multicolor filter be placed at the focal plane of an imaging lens. This can be difficult to achieve in practical applications, especially when the focal length is short, since the filter must then be placed close to the imaging lens and may come into contact with the lens holder.

An imaging system equipped with a multicolor filter placed in front of the imaging lens, rather than at the focal plane of the imaging lens, can also capture color variations corresponding to light rays passing through different regions of the multicolor filter. Therefore, a one-shot colored reflectance direction field (CRDF) imaging system is proposed here to detect the extent of broadening of the BRDF caused by a microscale structure with microscale height variations on the surface of an object, such as a microscale defect. The imaging system is here called a one-shot CRDF imaging system or one-shot CRDF for brevity. The one-shot CRDF, in terms of its definition, includes the one-shot BRDF, but in this paper, these two are distinguished as representing different arrangements of a multicolor filter. The remainder of this paper is organized as follows. First, the basic structure of the one-shot CRDF imaging system is described. Second, an equation describing the relationship between the color variations and the BRDF broadening is presented. In addition, a normalized color vector is introduced to analyze an image captured by the one-shot CRDF. Third, a method for detecting a microscale structure on a surface is presented. This method is validated by an experiment using a prototype of the one-shot CRDF with a white plastic plate that has a shallow scratch of a few micrometers in depth. Last, a discussion and conclusions are presented.

2. IMAGING SYSTEM FOR ONE-SHOT COLOR MAPPING OF REFLECTANCE DIRECTION FIELD

Figure 1 shows a schematic cross-sectional view of the one-shot CRDF imaging system. The one-shot CRDF consists mainly of an optical illumination system and an optical imaging system.


Fig. 1. Schematic cross-sectional view of the one-shot CRDF imaging system (one-shot CRDF). The one-shot CRDF consists mainly of an optical illumination system and an optical imaging system. The imaging system has an imaging lens and a multicolor filter. The multicolor filter is placed in front of the imaging lens, rather than at its focal plane. The optical axis of the imaging lens is set to the $z$ axis in a global Cartesian coordinate system. The multicolor filter is set parallel to the $xy$ plane and has a periodically colored stripe pattern. The working distance, $h$, of the imaging system is defined as the distance between the multicolor filter and an object. The coordinate origin $O$ is in the multicolor filter.


The illumination system has an LED, a pinhole, and a collimator lens that converts the diverging light rays emitted from the LED into parallel light rays. The parallel light rays are reflected by a beam splitter and travel toward an object surface. The imaging system has an imaging lens and a periodic multicolor filter that consists of multiple stripe regions with different color transmission spectra. The multicolor filter is placed in front of the imaging lens, rather than at its focal plane. The working distance of the imaging system, defined as the distance between the multicolor filter and the object, is denoted by $h$. The optical axis of the imaging lens is set to the $z$ axis in a global Cartesian coordinate system. The multicolor filter is set parallel to the $xy$ plane and has translational symmetry along the $y$ axis. The coordinate origin $O$ is in the multicolor filter. Light rays reflected from the object surface pass through the beam splitter and are imaged on an image sensor as they pass through the multicolor filter. The image sensor has RGB color channels that can independently obtain red, green, and blue intensities. In this way, a light ray reflected at an object point on the object surface is imaged onto an image point on the image sensor with its color selected depending on its direction.

When the surface of the object is flat, the BRDF will have a narrow distribution. In this case, color variations of reflected light rays at an image point tend to be minimal, namely, a single color such as red, green, or blue. However, when there is a microscale structure on the surface at an object point, the BRDF will be broadened, resulting in increased color variations at the corresponding image point.

As shown in Fig. 1, a position vector ${\boldsymbol r}$, when projected onto the multicolor filter, represents a point where a light ray with an angle $\theta$ to the optical axis passes through the multicolor filter. On the other hand, a position vector ${{\boldsymbol r}_{\rm o}}$ is defined to represent an object point. From here on, these position vectors are taken as two-dimensional vectors projected onto the multicolor filter parallel to the $xy$ plane; ${\boldsymbol r}$ can then be derived based on the geometrical optics using an azimuth angle $\phi$ to the $x$ axis, position vector ${{\boldsymbol r}_{\rm o}}$, and working distance $h$ as

$${\boldsymbol r} = \left({\begin{array}{*{20}{c}}x\\y\end{array}} \right) = h\tan \theta \!\left({\begin{array}{*{20}{c}}{\cos \phi}\\{\sin \phi}\end{array}} \right) + {{\boldsymbol r}_{\rm o}}.$$

A two-dimensional angle vector ${\boldsymbol \theta}$, having two components of ${\theta _x}$ and ${\theta _y}$, is here defined as

$${\boldsymbol \theta} = \left({\begin{array}{*{20}{c}}{{\theta _x}}\\{{\theta _y}}\end{array}} \right) = \tan \theta \!\left({\begin{array}{*{20}{c}}{\cos \phi}\\{\sin \phi}\end{array}} \right).$$

The angle vector ${\boldsymbol \theta}$ can thus be written using the two position vectors ${\boldsymbol r}$ and ${{\boldsymbol r}_{\rm o}}$, according to Eqs. (1) and (2), as

$${\boldsymbol \theta}\!\left({{\boldsymbol r},{{\boldsymbol r}_{\rm o}}} \right) = \frac{{{\boldsymbol r} - {{\boldsymbol r}_{\rm o}}}}{h}.$$
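As an illustrative numerical check of Eq. (3) (all values below are assumptions, not taken from the paper), the angle vector follows directly from the filter-plane geometry:

```python
import numpy as np

# Assumed geometry (meters), for illustration only.
h = 0.09                          # working distance, filter to object
r = np.array([1.2e-3, 0.4e-3])    # ray crossing point on the multicolor filter
r_o = np.array([0.2e-3, 0.1e-3])  # object point projected onto the filter plane

# Eq. (3): the two-dimensional angle vector of the reflected ray.
theta = (r - r_o) / h   # components tan(theta)*cos(phi), tan(theta)*sin(phi)
```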

3. EQUATION FOR RELATIONSHIP BETWEEN COLOR VARIATIONS AND BRDF BROADENING

A normalized direction vector of the reflected light ray, ${{\boldsymbol e}_{\rm r}}$, can be written with the position vectors ${\boldsymbol r} = (x,\;y)$ and ${{\boldsymbol r}_{\rm o}} = ({{x_{\rm o}}},\;{{y_{\rm o}}})$ using Eq. (3) under the paraxial approximation as

$${{\boldsymbol e}_{\rm r}} = \frac{1}{{\sqrt {1 + {\theta _x^2} + {\theta _y^2}}}}\!\left({\begin{array}{*{20}{c}}{{\theta _x}}\\[6pt]{{\theta _y}}\\[6pt]1\end{array}} \right) \simeq \!\left({\begin{array}{*{20}{c}}{{\theta _x}}\\[6pt]{{\theta _y}}\\[6pt]1\end{array}} \right) = \frac{1}{h}\!\left({\begin{array}{*{20}{c}}{x - {x_{\rm o}}}\\{y - {y_{\rm o}}}\\h\end{array}} \right)\!.$$

An opposite direction vector for the incident light ray, ${{\boldsymbol e}_{\rm{in}}}$, parallel to the $z$ axis, can be written as

$${{\boldsymbol e}_{{\rm in}}} = \left({\begin{array}{*{20}{c}}0\\0\\1\end{array}} \right)\!.$$

A surface normal vector, ${\boldsymbol n}$, on an object surface, which is assumed to be parallel to the $z$ axis, is written as

$${\boldsymbol n} = \left({\begin{array}{*{20}{c}}0\\0\\1\end{array}} \right)\!.$$

Using Eqs. (4)–(6), the radiance of the reflected light ray, ${L_{\rm r}}$, can be derived with ${\rho _{\rm{BRDF}}}$, which denotes the BRDF, and the irradiance of the parallel uniform illumination, ${E_{\rm{in}}}$, as [3]

$${L_{\rm r}}\!\left({{\boldsymbol r},{{\boldsymbol r}_{\rm o}}} \right) = {\rho _{{\rm BRDF}}}\!\left({{\boldsymbol r},{{\boldsymbol r}_{\rm o}}} \right){E_{{\rm in}}}\!\left(\lambda \right){{\boldsymbol e}_{{\rm in}}} \cdot {\boldsymbol n} = {\rho _{{\rm BRDF}}}\!\left({{\boldsymbol r},{{\boldsymbol r}_{\rm o}}} \right){E_{{\rm in}}}\!\left(\lambda \right)\!,$$
where ${E_{\rm{in}}}$ is a function of the wavelength $\lambda$ of the incident light ray, and ${\rho _{\rm{BRDF}}}$ is a function of ${\boldsymbol r}$ and ${{\boldsymbol r}_{\rm o}}$; ${\rho _{\rm{BRDF}}}$ can also be transformed into a function of ${{\boldsymbol r}_{\rm o}}$, $\theta$, and $\phi$. The angle-resolved reflected radiance ${L_{\rm r}}$ is thus linearly related to the BRDF. The spectral dependence of the BRDF affects the results; this dependence, however, is often not very significant and is neglected in this paper.

Fig. 2. (a) Image of a plastic surface with a shallow scratch captured by a conventional camera. (b) Image of the same shallow scratch captured by a microscope with ${10} \times$ magnification. (c) Perspective view of the three-dimensional shape of the surface measured by scanning white light interferometry (ZYGO) in the same field of view as (b), with the height represented by color.


A color vector, ${\boldsymbol I}$, which consists of three intensity components (${I_R}$, ${I_G}$, ${I_B}$) obtained from the color channels of the RGB image sensor, can be formulated using Eq. (7) under the assumption of the paraxial approximation as

$$\begin{split}{\boldsymbol I}\!\left( {{{\boldsymbol r}}_{{\rm o}}} \right) &= {{A}_{{\rm IS}}}\iint \frac{{\rm d}x{\rm d}y}{{{h}^{2}}}\\ &\quad\times\int {{\rm d}\lambda\!\left( {{E}_{{\rm in}}}\!\left( \lambda \right)\!\left( \begin{array}{*{20}{c}} {{S}_{\!{R}}}\!\left( \lambda \right) \\ {{S}_{\!{G}}}\!\left( \lambda \right) \\ {{S}_{\!{B}}}\!\left( \lambda \right) \end{array} \right)w\!\left( \lambda ,x \right){{\rho }_{\rm{BRDF}}}\!\left({\boldsymbol r},{{{\boldsymbol r}}_{{\rm o}}} \right) \right)},\end{split}$$
where ${A_{\rm{IS}}}$ is a constant value that depends on the optical properties of the imaging system. A weight factor, $w$, indicates a transmission spectrum with respect to the wavelength $\lambda$, which depends on the $x$ position in the multicolor filter; $w$ implies that the multicolor filter will change the wavelength spectrum of each reflected light ray, depending on its position of $x$. Sensitivities of the RGB image sensor, namely, ${S_R}$, ${S_G}$, and ${S_B}$ for red-, green-, and blue-light rays, indicate transmission spectra with respect to the wavelength $\lambda$.

In Eq. (8), the weight factor, $w$, has the following feature within each period of the multicolor filter:

$$w\!\left({\lambda ,{x_1} + (j - 1)\Delta x \le x \lt {x_1} + j\Delta x} \right) = {C_j}\!\left(\lambda \right),$$
where ${x_1}$ denotes the edge position in a certain period, $j$ is an integer ranging from one to $M$, ${C_j}$ denotes the transmission spectrum of each stripe in the period, and $\Delta x$ is the width of each stripe. $M$ denotes the number of stripes in the period. Note that ${C_j}$ depends on the wavelength but is independent of position $x$. When the broadening of the BRDF is narrow enough such that the dispersion range of position $x$ of light rays in the multicolor filter is less than $\Delta x$, the number of colors of the light rays becomes less than or equal to two, based on Eqs. (8) and (9). On the other hand, when the broadening of the BRDF is wide enough such that the dispersion range of position $x$ of light rays is more than ${2} \times \Delta x$, the number of colors of the light rays becomes more than or equal to three. Therefore, the extent of the broadening of the BRDF can be determined by the color variations of the light rays passing through the multicolor filter.
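The piecewise-constant weight factor of Eq. (9) can be sketched in code. The stripe width, period edge, and color labels below are illustrative assumptions, and the helper `stripe_color` is hypothetical, not part of the paper:

```python
import numpy as np

M = 4                          # stripes per period (red, green, blue, green)
dx = 0.75e-3                   # assumed stripe width, so the period is 3.0 mm
x1 = 0.0                       # assumed edge position of a period
colors = ["R", "G", "B", "G"]  # transmission spectra C_j within one period

def stripe_color(x):
    """Color seen by a ray crossing the filter at position x (meters),
    following the piecewise-constant weight factor of Eq. (9)."""
    j = int(np.floor((x - x1) / dx)) % M   # periodic 0-based stripe index
    return colors[j]

# A narrow BRDF keeps rays within one or two adjacent stripes; a broadened
# BRDF disperses them over three or more, increasing the color variation.
dispersed = [0.1e-3, 0.9e-3, 1.6e-3, 2.4e-3]
n_colors = len({stripe_color(x) for x in dispersed})   # distinct colors seen
```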

The amplitude of the color vector, ${\boldsymbol I}$, depends not on the colors of the light rays passing through the multicolor filter but on the illuminance intensity. On the other hand, the direction of the color vector depends only on the colors of the light rays. A normalized color vector, ${{\boldsymbol e}_{\rm c}}$, can then be used to represent color variation, which can be formulated from the color vector ${\boldsymbol I}$ at each pixel position ($i$, $j$) with integers $i$, $j$ as

$$\begin{split}{{\boldsymbol e}_{\rm c}}\!\left({i,j} \right)& = \frac{{\boldsymbol I}}{{\left| {\boldsymbol I} \right|}} = \frac{1}{{\sqrt {{{\left({{I_{\!R}}\!\left({i,j} \right)} \right)}^2} + {{\left({{I_{\!G}}\!\left({i,j} \right)} \right)}^2} + {{\left({{I_{\!B}}\!\left({i,j} \right)} \right)}^2}}}}\\&\quad\times\left({\begin{array}{*{20}{c}}{{I_{\!R}}\!\left({i,j} \right)}\\{{I_{\!G}}\!\left({i,j} \right)}\\{{I_{\!B}}\!\left({i,j} \right)}\end{array}} \right).\end{split}$$

This normalized color vector can serve as a quantitative measure of the extent to which the BRDF is broadened. When the BRDF is broadened, the end point of the normalized color vector tends to approach a certain specific region that corresponds to multiple colors, since the light rays pass through multiple different regions of the multicolor filter.

An angle between two normalized color vectors can be defined as

$$\alpha = \arccos \!\left({{{\boldsymbol e}_{\rm c}} \cdot {{\boldsymbol e}_{\rm c}^\prime}} \right)\!.$$

This angle, $\alpha$, serves as a measure of the separation between the two normalized color vectors.
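Eqs. (10) and (11) translate directly into a short numerical sketch (the RGB intensity triples below are assumed, for illustration only):

```python
import numpy as np

def normalized_color_vector(I_rgb):
    """Eq. (10): keep only the color direction of an RGB intensity triple,
    discarding the illuminance-dependent amplitude."""
    I = np.asarray(I_rgb, dtype=float)
    return I / np.linalg.norm(I)

def separation_angle(e_c, e_c_prime):
    """Eq. (11): angle (radians) between two normalized color vectors."""
    return np.arccos(np.clip(np.dot(e_c, e_c_prime), -1.0, 1.0))

# Assumed pixel values: a nearly pure-red pixel (a single filter stripe) and a
# mixed-color pixel (light spread over several stripes by a broadened BRDF).
e1 = normalized_color_vector([200, 10, 5])
e2 = normalized_color_vector([120, 110, 90])
alpha_deg = np.degrees(separation_angle(e1, e2))
```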

4. MICROSCALE-DEFECT DETECTION METHOD AND ITS EXPERIMENTAL VALIDATION

A shallow scratch on a white plastic plate is used as an example of a microscale defect to which the one-shot CRDF detection method is applied. Figure 2 shows images of a plastic surface with a shallow scratch. On the left-hand side, (a) shows an image of the plastic surface captured by a conventional camera. In the middle, (b) shows an image of the same shallow scratch captured by a microscope with ${10}\times$ magnification. On the right-hand side, (c) shows a perspective view of the three-dimensional shape of the surface measured by scanning white light interferometry (ZYGO) in the same field of view as (b), with the height represented by color.

Figure 3 shows a CAD design of a prototype of the one-shot CRDF imaging system. An imaging lens (Kenko Tokina, KCM-Z4D) with a working distance of 90 mm is used. A multicolor filter is designed to have periodic color variations along the $x$ axis, where each period consists of four stripes whose color transmission spectra are red, green, blue, and green. The period of the filter is set to 3.0 mm. The divergence angle of the illumination light is set to 0.10°. An image sensor (FLIR) has RGB color channels (i.e., three color channels) with ${2048} \times {2448}$ pixels.


Fig. 3. CAD design of a prototype of the one-shot CRDF imaging system. An imaging lens (Kenko Tokina, KCM-Z4D) with a working distance of 90 mm is used. A multicolor filter is designed to have periodic color variations along the $x$ axis, where each period consists of four stripes whose color transmission spectra are red, green, blue, and green. The period of the filter is set to 3.0 mm. The divergence angle of the illumination light is set to 0.10°. An image sensor has RGB color channels (i.e., three color channels).


Fig. 4. Result of imaging the plastic surface by using the one-shot CRDF prototype. (a) Image captured by a conventional camera. (b) Unprocessed image captured by the one-shot CRDF. Each image has a field of view of ${4.9}\;{\rm mm} \times {5.8}\;{\rm mm}$.


Figure 4 shows a result of imaging the plastic surface by using the one-shot CRDF prototype. On the left-hand side, (a) shows an image captured by a conventional camera. On the right-hand side, (b) shows an unprocessed image captured by the one-shot CRDF. Each image has a field of view of ${4.9}\;{\rm mm} \times {5.8}\;{\rm mm}$. The image captured by the one-shot CRDF reveals significant color variations on the shallow scratch, despite its depth being only a few micrometers. However, in addition to the color variations caused by the shallow scratch, a striped colored background is generated due to the light rays passing through the multicolor filter.

Figure 5 shows an image of a flat plastic surface without any shallow scratches, captured by the one-shot CRDF prototype. This image with a field of view of ${0.2}\;{\rm mm} \times {5.8}\;{\rm mm}$ is cropped from an original image and will be used as a background image for the following analysis.


Fig. 5. Image of a flat plastic surface without any shallow scratches, captured by the one-shot CRDF prototype. This image with a field of view of ${0.2}\;{\rm mm} \times {5.8}\;{\rm mm}$ is cropped from an original image and will be used as a background image for the following analysis.


Figure 6 shows three-dimensional scatter plots of normalized color vector distributions for (a) the background as shown in Fig. 5 and (b) the surface with the shallow scratch as shown on the right-hand side in Fig. 4. The normalized color vector is defined by Eq. (10). The components for the three RGB channels are taken in Cartesian coordinates. Two-dimensional scatter plots projected onto the RG plane are also plotted. The blue circle indicates the background, and the red dot indicates the surface with the shallow scratch. In the plot shown in (a), three regions are indicated, namely, R, G, and B regions, which correspond to red, green, and blue colored light rays, respectively. Two additional regions, RG and BG, indicate the colored light rays passing through the boundary between the red and green regions in the multicolor filter and the boundary between the blue and green regions, respectively. In the plot shown in (b), a multiple color region is indicated. This multiple color region is generated by the mixture of colored light rays passing through multiple different regions of the multicolor filter, as the BRDF is broadened by the shallow scratch.


Fig. 6. Three-dimensional scatter plots of normalized color vector distributions for the (a) background as shown in Fig. 5 and (b) surface with the shallow scratch as shown on the right-hand side in Fig. 4. The components for the three RGB channels are taken in Cartesian coordinates. Two-dimensional scatter plots projected onto the RG plane are also plotted. The normalized color vector is defined by Eq. (10). The blue circle indicates the background, and red dot indicates the surface with the shallow scratch.


On the left-hand side of Fig. 7, (a) shows three-dimensional scatter plots of normalized color vectors for both the surface with the shallow scratch and the flat surface (background). The components for the three RGB channels are taken in Cartesian coordinates. The red dot indicates the surface with the shallow scratch, and the blue circle indicates the background. The angle values of $\alpha$, defined by Eq. (11), indicate the separation measure between a color vector of the surface with the shallow scratch and its closest color vector in the background. From top to bottom, values of $\alpha$ are set to more than 0.5°, 1.0°, and 3.0°, respectively. On the right-hand side of Fig. 7, (b) shows images for the surface with the shallow scratch. In these images, a pixel value is set to zero at each pixel with a color vector that has an $\alpha $ value of less than or equal to 0.5°, 1.0°, and 3.0° from top to bottom, respectively.


Fig. 7. (a) Three-dimensional scatter plots of normalized color vectors for both the surface with the shallow scratch and the flat surface (background). The components for the three RGB channels are taken in Cartesian coordinates. The red dot indicates the surface with the shallow scratch, and blue circle indicates the background. The angle values of $\alpha$, defined by Eq. (11), indicate the separation measure between a color vector of the surface with the shallow scratch and its closest color vector in the background. From top to bottom, $\alpha$ is set to more than 0.5°, 1.0°, and 3.0°, respectively. (b) Images for the surface with the shallow scratch where a pixel value is set to zero at each pixel with a color vector that has an $\alpha$ value of less than or equal to 0.5°, 1.0°, and 3.0° from top to bottom, respectively.


It can be observed that the background can be eliminated by increasing the value of $\alpha$ while the shallow scratch remains. In other words, the shallow scratch can be extracted by increasing the $\alpha$ value. This implies that the one-shot CRDF can clearly reveal a microscale structure on the surface of an object using this extraction method with the separation angle $\alpha$.
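The extraction method above can be sketched as follows. This is a minimal illustrative implementation, not the author's code; the function name `extract_defect`, the synthetic pixel values, and the threshold are assumptions:

```python
import numpy as np

def extract_defect(image_rgb, background_rgb, alpha_threshold_deg=1.0):
    """Zero every pixel whose normalized color vector, Eq. (10), lies within
    alpha_threshold_deg of its closest background color vector, Eq. (11)."""
    img = image_rgb.reshape(-1, 3).astype(float)
    bg = background_rgb.reshape(-1, 3).astype(float)
    img_e = img / np.linalg.norm(img, axis=1, keepdims=True)   # Eq. (10)
    bg_e = bg / np.linalg.norm(bg, axis=1, keepdims=True)
    # Smallest separation angle to the background set, Eq. (11).
    cos_alpha = np.clip(img_e @ bg_e.T, -1.0, 1.0).max(axis=1)
    alpha = np.degrees(np.arccos(cos_alpha))
    out = image_rgb.copy()
    out.reshape(-1, 3)[alpha <= alpha_threshold_deg] = 0   # eliminate background
    return out

# Tiny synthetic example (values assumed): one near-red, background-like pixel
# and one mixed-color, scratch-like pixel.
image = np.array([[[254, 2, 1], [100, 100, 100]]], dtype=np.uint8)
background = np.array([[[255, 0, 0], [0, 255, 0]]], dtype=np.uint8)
result = extract_defect(image, background, alpha_threshold_deg=1.0)
```

Increasing `alpha_threshold_deg` removes more of the striped background while the strongly color-mixed scratch pixels survive, mirroring the progression in Fig. 7.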

5. DISCUSSION

The one-shot CRDF imaging system, equipped with the multicolor filter placed in front of the imaging lens rather than at its focal plane, requires a background-elimination process based on the separation measure of color vectors. On the other hand, the imaging system with the multicolor filter placed at the focal plane of the imaging lens does not require this elimination process. This implies that the one-shot CRDF may require more processing time. However, the one-shot CRDF can have a simple structure and broad applicability since it can be easily implemented by attaching a multicolor filter to any conventional camera. Furthermore, the elimination process handles each pixel of an image captured by the one-shot CRDF independently, which enables fast parallel processing.

The width of each stripe color region of the multicolor filter is chosen as follows. Light rays reflected from a standard surface without any defects should pass through just one stripe color region or, at most, two stripe color regions corresponding to the boundary between two adjacent stripe regions. On the other hand, light rays reflected from defects should pass through three or more stripe color regions.

Microscale defects on smooth surfaces, such as scratches with depths of a few micrometers on mirrors or unevenness with height variations of a few micrometers on polished metal plates, can be relatively easily detected by observing the change in color variations caused by the broadening of the BRDF resulting from the defects. However, detecting defects on rough surfaces, such as a shallow scratch on a plastic plate with a rough surface, becomes difficult due to the small difference in the broadening of the BRDF between the shallow scratch and the rough surface.

The one-shot CRDF may also be applicable for probing microscale properties inside a transmissive object, such as stress tensor distribution, temperature distribution, and refractive index distribution [15–22].

6. CONCLUSIONS

A one-shot CRDF imaging system for capturing a reflectance direction field by color mapping using a multicolor filter placed in front of an imaging lens is proposed. The one-shot CRDF can have a simple structure and wide-ranging applicability since it can be easily implemented by attaching a multicolor filter to any conventional camera.

From color variations of colored light rays passing through several different color regions of the multicolor filter, the one-shot CRDF can obtain the extent of broadening of the BRDF. The effectiveness of the imaging system for optical inspection is experimentally validated by testing it on a plastic surface having a shallow scratch with a depth of a few micrometers. The image captured by a prototype of the one-shot CRDF reveals significant color variations on the shallow scratch, despite its depth being only a few micrometers. However, in addition to the color variations caused by the shallow scratch, a striped colored background is generated owing to the light rays passing through the multicolor filter. The background is shown to be eliminated using the separation angle represented by Eq. (11). This implies that the one-shot CRDF can be used for detecting microscale structures on object surfaces in optical inspections.

Disclosures

The author declares no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the author upon reasonable request.

REFERENCES

1. J. Karangwa, L. Kong, D. Yi, and J. Zheng, “Automatic optical inspection platform for real-time surface defects detection on plane optical components based on semantic segmentation,” Appl. Opt. 60, 5496–5506 (2021).

2. R. J. Woodham, “Gradient and curvature from the photometric-stereo method, including local confidence estimation,” J. Opt. Soc. Am. A 11, 3050–3068 (1994).

3. B. K. P. Horn and R. W. Sjoberg, “Calculating the reflectance map,” Appl. Opt. 18, 1770–1779 (1979).

4. L. Simonot and G. Obein, “Geometrical considerations in analyzing isotropic or anisotropic surface reflections,” Appl. Opt. 46, 2615–2623 (2007).

5. S. D. Butler, S. E. Nauyoks, and M. A. Marciniak, “Comparison of microfacet BRDF model to modified Beckmann-Kirchhoff BRDF model for rough and smooth surfaces,” Opt. Express 23, 29100–29112 (2015).

6. I. G. E. Renhorn and G. D. Boreman, “Developing a generalized BRDF model from experimental data,” Opt. Express 26, 17099–17114 (2018).

7. I. G. E. Renhorn and G. D. Boreman, “Analytical fitting model for rough-surface BRDF,” Opt. Express 16, 12892–12898 (2008).

8. S. D. Butler, S. E. Nauyoks, and M. A. Marciniak, “Experimental analysis of bidirectional reflectance distribution function cross section conversion term in direction cosine space,” Opt. Lett. 40, 2445–2448 (2015).

9. H. Ohno, “One-shot three-dimensional measurement method with the color mapping of light direction,” OSA Contin. 4, 840–848 (2021).

10. H. Ohno, “One-shot color mapping imaging system of light direction extracted from a surface BRDF,” OSA Contin. 3, 3343–3350 (2020).

11. H. Ohno and H. Kano, “Depth reconstruction with coaxial multi-wavelength aperture telecentric optical system,” Opt. Express 26, 25880–25891 (2018).

12. H. Ohno and T. Kamikawa, “One-shot BRDF imaging system to obtain surface properties,” Opt. Rev. 28, 655–661 (2021).

13. H. Ohno, A. Ohno, and H. Okano, “Imaging technology to immediately obtain clear images of microdefects,” Toshiba Rev. 76, 38–41 (2021).

14. H. Ohno, “Method for instant measurement of 3D microdefect shapes using optical imaging system capable of color mapping of light direction,” Toshiba Rev. 77, 44–47 (2022).

15. H. Ohno and K. Toya, “Reconstruction method of axisymmetric refractive index fields with background-oriented schlieren,” Appl. Opt. 57, 9062–9069 (2018).

16. H. Ohno and K. Toya, “Scalar potential reconstruction method of axisymmetric 3D refractive index fields with background-oriented schlieren,” Opt. Express 27, 5990–6002 (2019).

17. H. Ohno and K. Toya, “Localized gradient-index field reconstruction using background-oriented schlieren,” Appl. Opt. 58, 7795–7803 (2019).

18. H. Ohno and T. Usui, “Gradient-index dark hole based on conformal mapping with etendue conservation,” Opt. Express 27, 18493–18507 (2019).

19. H. Ohno, Y. Shiomi, S. Tsuno, and M. Sasaki, “Design of a transmissive optical system of a laser metal deposition three-dimensional printer with metal powder,” Appl. Opt. 58, 4127–4138 (2019).

20. H. Ohno, “Symplectic ray tracing based on Hamiltonian optics in gradient-index media,” J. Opt. Soc. Am. A 37, 233–411 (2020).

21. H. Ohno and T. Usui, “Points-connecting neural network ray tracing,” Opt. Lett. 46, 4116–4119 (2021).

22. H. Ohno and T. Usui, “Neural network gradient-index mapping,” OSA Contin. 4, 2543–2551 (2021).
