Optica Publishing Group

Self-masking of an optical image in dense fog for an extended object

Open Access

Abstract

Object image quality degrades in fog due to optical scattering and attenuation. We present several unique features of optical imaging of an extended object in dense fog. Since the optical image arises mainly from ballistic photons in fog, the scattered photons generate a background that reduces the image contrast. A self-masking effect is observed for an extended emanating object: as the visibility is reduced, image recognizability deteriorates in a characteristic order, with enclosed structures disappearing first while outer boundaries persist longer. Monte Carlo simulations are carried out for a practical optical imaging system in a fog chamber with a point source, sparsely distributed sources, and an extended emanating object, and the imaging process exhibits several salient features. An experiment conducted in the fog chamber demonstrates good agreement between the numerical simulation and the experimental results.

© 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Object recognizability and optical image quality deteriorate in the atmosphere due to fog, aerosols, and turbulence. As light passes through a scattering medium, the output consists of ballistic, snake, and multiply scattered photons. Direct imaging with ballistic photons is attenuated by the scattering medium, while the image background is raised by the scattered photons [1]. Although the scattered photons carry imaging information, recovering the image from them requires extra computational effort or calibration with a point spread function (PSF). For direct optical imaging with ballistic photons, the scattered photons contribute a noise background [2,3]. It is therefore necessary to investigate the interplay between optical image quality and the illuminating structure of the object in a scattering medium, with the goal of better recognizing objects with computer vision.

A few analytical solutions exist for atmospheric light scattering [4,5], and various analytical approximations have been proposed for limited cases [6–11]. Eismann et al. developed an aerosol modulation transfer function to assess the impact of aerosol scattering on passive long-range imaging sensors [12]. Hanafy et al. used an electromagnetic model to compute the average PSF and proposed that a reasonable image of a distant source can be obtained as long as the direct radiation is greater than the scattered radiation [13,14]. The Monte Carlo method, a technique that simulates physical processes with a stochastic model, is commonly used to study the propagation of light in the atmosphere [15–19]. Dor et al. suggested a semi-analytical physical model to describe the PSF of a distant point source (1 km) in a high-visibility atmosphere (≥ 500 m); the model explained how the attenuation length, the focal length, and other characteristics of the imaging system affect the measured PSF [20]. The recovery of images under extreme scattering conditions using the PSF has also been intensively studied [21–23]. Ma et al. implemented a full analysis of the multiple-scattering effect of atmospheric aerosols on light-transmission measurements based on the Monte Carlo method [24], but the properties of the imaging system were not taken into account.

In this paper, we first present an imaging model that considers the contributions of signal and noise, both generated by the same object. The PSF of the optical imaging system in dense fog is then simulated and experimentally verified in a 20-meter-long fog chamber. Through the convolution of the target with the PSF, we discuss the influence of scattering on the recognizability of an emanating object. A prominent transition in object image recognizability is observed in the numerical simulation. To test the proposed theory, an object is imaged in the fog chamber, and the result is found to agree with our numerical simulation.

2. Imaging model and PSF simulation

The mechanism for simulating the PSF based on the Monte Carlo method is shown in Fig. 1, in which a point light source viewed through dense fog is simulated and detected. The scattering effect of the fog on a photon is modeled as a collision between the photon and uniform spherical particles. After each collision, the propagation direction of the photon is diverted. By tracking the motion state of each photon, the photons within the field of view (FOV) of the imaging system are captured and collected. These photons are mapped to the image plane by a single convex lens. By counting the radial distribution of photons hitting the image plane, the PSF is obtained.
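The two stochastic ingredients of this photon transport — the distance to the next collision and the new direction after a collision — can be sketched in a few lines. The following Python fragment is our illustrative reconstruction (not the authors' code), sampling free paths from the Beer-Lambert exponential law and scattering angles from the Henyey-Greenstein phase function:

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_free_path(k_e, n):
    """Distance to the next collision: exponential law with extinction k_e (1/m)."""
    return -np.log(rng.random(n)) / k_e

def sample_hg_cos_theta(g, n):
    """cos(theta) of the scattering angle, sampled by inverse transform from the
    Henyey-Greenstein phase function with asymmetry factor g (0 < g < 1)."""
    u = rng.random(n)
    s = (1.0 - g * g) / (1.0 - g + 2.0 * g * u)
    return (1.0 + g * g - s * s) / (2.0 * g)

# fog parameters used later in the paper: k_e = 0.75 1/m, g = 0.9
paths = sample_free_path(0.75, 200_000)
cos_t = sample_hg_cos_theta(0.9, 200_000)
```

The sample means recover the analytic expectations (mean free path $1/k_e$, mean cosine $g$), which is a quick sanity check before tracking full trajectories.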

Fig. 1. PSF simulation based on Monte Carlo method.

The distribution of the total detected PSF at the image plane consists of ballistic light and scattered light [20,25,26]:

$$PSF = PSF_{bal} + PSF_{sca}.$$

If we further consider the theory of the superposition linear system, the intensity distribution of the light field under incoherent illumination is described by a convolution [22]:

$$I(x_i, y_i) = \iint PSF(x_i, y_i; x_o, y_o) \times O(x_o, y_o)\, dx_o\, dy_o,$$
where $O(x_o, y_o)$ is the spatial distribution of the emanating object, which may be an illuminating light-emitting diode array or light reflected from an object under ambient illumination, and $I(x_i, y_i)$ is the output image transmitted through the dense fog.

Using Eq. (1), Eq. (2) can be written as

$$I(x_i, y_i) = \iint \left[ PSF_{bal}(x_i, y_i; x_o, y_o) + PSF_{sca}(x_i, y_i; x_o, y_o) \right] \times O(x_o, y_o)\, dx_o\, dy_o.$$

The PSF of the ballistic light is a Dirac delta function [27], thus

$$I(x_i, y_i) = D \times O\!\left( \frac{v}{L} x_o, \frac{v}{L} y_o \right) + \iint PSF_{sca}(x_i, y_i; x_o, y_o) \times O(x_o, y_o)\, dx_o\, dy_o = I_{sig}(x_i, y_i) + I_{noi}(x_i, y_i),$$
where D is a constant related to the system configuration, v is the image distance, and L is the object distance. The second term, $I_{noi}(x_i, y_i)$, is noise from the object itself, since the FOV is larger than the memory-effect range in dense fog [28,29]. As the noise level $I_{noi}(x_i, y_i)$ increases for a larger emanating object, the contrast of the image transmitted through the dense fog decreases, and the target can no longer be identified from the background when the intensity difference between $I_{sig}(x_i, y_i)$ and $I_{noi}(x_i, y_i)$ is smaller than the resolution of the analog-to-digital output of the camera pixels.
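The structure of Eq. (4) — a scaled ballistic copy of the object plus a convolution with the broad scattered wing — can be illustrated numerically. The sketch below (our toy construction, not the authors' simulation; the disk, Gaussian wing, and 1:1000 ballistic weight are illustrative assumptions) shows how an extended luminous area lifts the background far above the ballistic signal:

```python
import numpy as np

def image_through_fog(obj, d_bal, psf_sca):
    """Return (I_sig, I_noi): ballistic delta-PSF copy of the object plus
    an FFT (periodic) convolution with the centered, normalized scattered wing."""
    i_sig = d_bal * obj                                   # D x O term (Dirac PSF)
    otf = np.fft.fft2(np.fft.ifftshift(psf_sca))          # move PSF center to (0,0)
    i_noi = np.real(np.fft.ifft2(np.fft.fft2(obj) * otf))
    return i_sig, i_noi

# toy object: an extended luminous disk; toy wing: a wide Gaussian
n = 64
y, x = np.mgrid[:n, :n] - n // 2
disk = (x**2 + y**2 < 20**2).astype(float)
psf_sca = np.exp(-(x**2 + y**2) / (2 * 15.0**2))
psf_sca /= psf_sca.sum()
i_sig, i_noi = image_through_fog(disk, 1e-3, psf_sca)     # ballistic:scattered ~ 1:1000
```

Even though every scattered contribution is individually weak, the convolution sums them over the whole luminous area, so the background term dominates the ballistic term pointwise.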

According to this derivation, the signal from the target itself partly degrades into noise and masks the information-bearing signal reaching the detector. We refer to this effect as the optical self-masking effect, borrowing the nomenclature for the degradation of original information into noise in the field of radar and signal systems [30,31].

3. Optical self-masking effect

The following parameters are applied in the simulation. The distance from the light source to the detector is 20 m. For chromatic light, each wavelength has its corresponding visibility; in this work, the central wavelength of visible light, 550 nm, is selected as an example to study the imaging properties as a function of the ratio of imaging distance to visibility. The extinction coefficient at 4 m visibility is calculated from ${k_e} = -\ln(0.05)/V$ as 0.75 $\textrm{m}^{-1}$, where V is the visibility. The emission direction of photons from the point source follows a uniform distribution, and the number of emitted photons is $3 \times 10^{12}$. The scattering angle of each photon is obtained by sampling the Henyey-Greenstein scattering phase function; the asymmetry factor of the transmission medium is typically 0.9 for visible light [32]. The detector pixel size is 3.75 µm with 2048 × 2048 resolution. The convex lens has a constant aperture diameter of 90 mm and a focal length of 35 mm. The exponential attenuation law of ballistic light is used as the measure of simulation error; with $3 \times 10^{12}$ simulated photons, the simulation error is 0.03%.
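The quoted extinction coefficient follows from the Koschmieder-type relation with a 5% contrast threshold, and the Beer-Lambert law then gives the surviving ballistic fraction over the 20 m chamber. A quick numerical check (illustrative only):

```python
import math

def extinction_coefficient(visibility_m, threshold=0.05):
    """k_e = -ln(threshold) / V: visibility V is the range at which
    contrast falls to the threshold (5 percent here)."""
    return -math.log(threshold) / visibility_m

k_e = extinction_coefficient(4.0)           # ~0.75 1/m at 4 m visibility, as in the text
ballistic_fraction = math.exp(-k_e * 20.0)  # Beer-Lambert survival over the 20 m path
```

The ballistic fraction over 20 m at 4 m visibility is below one part per million, consistent with the roughly 1:1000 ballistic-to-scattered ratio reported at the image plane below.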

By counting the radial distribution of photons hitting the image plane, $\textrm{PSF}(r)$ is obtained, as shown in Fig. 2(a). Owing to the centro-symmetric property of the model [33], the 2-D PSF can then be calculated mathematically, as shown in Fig. 2(b).
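The radial-binning step can be sketched as follows (an illustrative Python fragment with hypothetical photon landing positions, not the authors' code): each photon's radius on the image plane is histogrammed, and each shell count is divided by its annulus area to give a radially averaged PSF.

```python
import numpy as np

rng = np.random.default_rng(7)

def radial_psf(x_img, y_img, n_bins=200, r_max=1.0):
    """Bin photon landing positions into radial shells and normalize each
    count by its annulus area, giving the radially averaged PSF(r)."""
    r = np.hypot(x_img, y_img)
    counts, edges = np.histogram(r, bins=n_bins, range=(0.0, r_max))
    area = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2)
    return edges[:-1], counts / area

# hypothetical landing positions: a tight central cluster for illustration
x, y = rng.normal(0.0, 0.2, size=(2, 500_000))
r_bins, psf_vals = radial_psf(x, y)
```

Dividing by annulus area is the essential step: raw counts grow with radius even for a flat PSF, so only the area-normalized profile reveals the central peak and wing structure.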

Fig. 2. Simulated normalized (a) 1-D and (b) 2-D PSF.

Figure 2 shows that the detected PSF is composed of two components: a narrow central peak representing the ballistic photons and a weak, widely spread wing representing the scattered photons. The PSF obtained in our work agrees well with previously reported results [20].

The object is composed of three LEDs arranged at equal intervals. The image of the target is recorded by a camera (Exmor R CMOS; pixel size: 3.75 µm × 3.75 µm; resolution: 6240 × 4160; 35 mm lens: SONY SEL35F1.4GM) 20 meters away. Considering the pixel size and the Nyquist sampling criterion, the LEDs are set at 8 mm intervals, as shown in Fig. 3(a). In the absence of fog, the images of the three equally spaced LEDs can just be separated. Figure 3(b) shows the simulated scattered imaging result for the LEDs, which is the convolution of the non-scattered image with the PSF obtained previously. To visualize the variation of the target intensity, the green line in Fig. 3(d) shows the cross-sectional intensity along the line marked in Fig. 3(b). The three equally spaced LEDs can still be recognized through strong scattering at 4 m visibility.

Fig. 3. Results of simulation and experiment of target region. (a) Raw photo captured by the camera without scattering. (b) Simulated imaging in dense fog with 4 m visibility. (c) Experimental imaging in dense fog with 4 m visibility. (d) The cross-sectional intensity of corresponding line of interest.

To verify the correctness of the simulation process, an experiment was carried out in a 20 m × 3 m × 3 m fog chamber that can consistently provide different visibilities [34]. The visibility in the fog chamber was measured by a visibility meter (HUAYUN SOUNDING DNQ1) with 10% detection error. The target and imaging system used in the experiment were consistent with those in the simulation. At 4 m visibility, the image taken in the fog chamber and the cross-sectional intensity along the marked line are shown in Fig. 3(c) and as the blue line in Fig. 3(d), respectively. The simulations show good agreement with the experiments.

Simulation results show that the ratio of ballistic to scattered light received at the image plane is about 1:1000. Even with such a small fraction of ballistic light, as shown in Fig. 3(d), the objects remain recognizable. Although the scattered photons are numerous, they are spatially dispersed and only raise the background from 0.02 to 0.46 and 0.60.

For simple targets captured through dense fog, the small amount of ballistic light still provides a much stronger signal than the scattered light, so the recognition capability of optical imaging is not strongly affected by the presence of fog.

In general, an extended object may be illuminated by ambient light, or the object may itself be a light source, such as an LED. It is therefore interesting to examine how image recognizability in dense fog evolves as the size of the object increases. In the real world, an object may consist of streetlamps or may be illuminated by reflection, presenting different shapes and sizes. Scattered light originating from the object itself accumulates in the background and spoils the object's recognizability.

Suppose a target “H” (31.1 cm × 14.6 cm) is placed at the center of a circular luminous area; the objects differ only in the fill rate of the target, as shown in Fig. 4(a)–4(d). The imaging system parameters in the simulation are the same as before. Figures 4(e)–4(h) present the region of interest (248 pixels × 248 pixels) from the simulation results under 4 m visibility. For a small fill rate, Fig. 4(e), the contour of the target “H” can be clearly distinguished from the circular area. Figures 4(f)–4(h) illustrate the gradual degradation of the target “H” as the illuminated area is extended. Since the bit depth of the detector is 8 bits, the resolution of the analog-to-digital output on a pixel is $1/2^8 \approx 0.0039$. Figure 5 presents the local contrast of the target for different illumination areas. Any intensity difference smaller than this resolution is lost during image readout from the analog-to-digital converter, so the target “H” is no longer discernible from the illuminated area and cannot be recovered by any image-processing algorithm.
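The readout-limit argument can be made concrete. The sketch below uses hypothetical normalized intensity values (our own, for illustration): a feature survives quantization only if its intensity difference from the background exceeds one least-significant bit of the analog-to-digital converter.

```python
def adc_resolvable(i_target, i_background, bit_depth=8):
    """True if the intensity difference survives quantization:
    one least-significant bit is 1/2**bit_depth of full scale."""
    lsb = 1.0 / 2 ** bit_depth          # = 1/256, about 0.0039 for an 8-bit detector
    return abs(i_target - i_background) > lsb

print(adc_resolvable(0.50, 0.46))    # difference 0.04, well above one LSB
print(adc_resolvable(0.462, 0.460))  # difference 0.002, below one LSB: feature lost
```

Once the difference falls under one LSB, the target and background digitize to the same code, so no post-processing of the recorded image can recover the target.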

Fig. 4. Simulation of object in dense fog under 4 m visibility. (a)-(d) Objects constructed by a certain size target “H” ($\textrm{I}$) with different size circular luminous areas ($\textrm{II}$). (e)-(h) Simulation result under 4 m visibility.

Fig. 5. Local contrast variation of target “H”.

As the luminous area expands, the scattered light becomes more intense because every point source contributes to the background. On the other hand, the image of “H” is formed by ballistic photons and retains the same signal strength, independent of the illumination area. As a result, the image contrast is reduced as the illumination area increases. The same imaging effect appears in photomicrography under the Köhler illumination scheme [35]. The contrast is eventually reduced below the detection limit of the imaging system, and the image of “H” is no longer visible, giving rise to the optical self-masking effect.
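The trend in Fig. 5 can be summarized by a toy contrast model. The assumptions here are ours, for illustration: the scattered background grows linearly with luminous area, the ballistic signal is fixed, and local contrast is Michelson-type, $C = I_{sig}/(I_{sig} + 2 I_{noi})$.

```python
def local_contrast(i_sig, area, noise_per_area):
    """Michelson contrast of a ballistic feature of fixed strength i_sig
    on a scattered background that grows linearly with luminous area."""
    i_noi = noise_per_area * area
    return i_sig / (i_sig + 2.0 * i_noi)

areas = [1, 4, 16, 64]
contrasts = [local_contrast(1.0, a, 0.5) for a in areas]
# contrast falls monotonically toward zero as the luminous area grows
```

In this picture the contrast decays toward zero as the area grows, eventually crossing the one-LSB detection limit: the self-masking transition.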

To verify the mechanism of the self-masking effect, simulation and experiment were carried out in the fog chamber. The target was a jet-fighter model, which was illuminated by a searchlight placed in the fog chamber during the experiment. The target was captured by a camera (FLIR BFS-PGE-51S5P-C; pixel size: 3.45 µm × 3.45 µm; resolution: 2448 × 2048; 100 mm lens: Zeiss Milvus 2/100 mm) 20 meters away. The angle between the illumination beam and the camera-to-target direction was large enough to reduce the interference of backscattered light. The image of the jet-fighter model captured by the imaging system under high visibility is shown in Fig. 6(a). Figures 6(b)–6(c) and 6(e)–6(f) show the simulated and experimental imaging results, respectively. Figures 6(b) and 6(e) show that only the bomb bay is unrecognizable at the higher visibility, while Figs. 6(c) and 6(f) show that neither the fuselage nor the bomb bay is recognizable at the lower visibility. As shown in Fig. 6(d), the local contrast of the fuselage is always greater than that of the bomb bay, although their initial contrasts are identical without fog. As the visibility decreases, the bomb bay recognizability decreases until the bomb bay disappears first, followed by the fuselage. Both experiment and simulation are consistent with the prediction of the self-masking effect on the image degradation process: the scattered light from the illuminated object accumulates a strong background so that the enclosed structures cannot be observed. As shown in Figs. 6(c) and 6(f), the target signal is completely covered by the background as the visibility decreases. In addition, part of the scattered interference from the searchlight in the experiment reduces the contrast of the target image, resulting in additional image masking and degradation.

Fig. 6. Simulation and experiment results of target imaging. (a) The jet fighter-model captured by the imaging system under high visibility. (b), (c) The simulation results under 8 m and 7 m visibility, respectively. (d) Local contrast of fuselage and bomb bay in simulation and experiment. (e), (f) The experiment results under 9 m and 8 m visibility, respectively.

4. Conclusion

The light from the target becomes a masking signal after passing through the low-visibility atmosphere, interfering with the identification of the target itself. This process, called the optical self-masking effect, is a newly demonstrated mechanism by which fog causes unrecognizability, and it dominates the unrecognizability process at extremely low visibility. The excess scattered-light component from the extended illumination obscures the ballistic-light component from the target itself, decreasing the target contrast. This observation explains why enclosed structures are more likely to be inundated in dense fog than the outer boundary. Self-masking may play a vital role in the observation of incomplete optical images in the atmosphere. We believe the proposed mechanism can serve as a new guide for restoring optical images through strongly scattering media, as well as a practical criterion for evaluating restored image quality in terms of structural similarity.

Funding

National Natural Science Foundation of China (12074444, 61991452); Guangdong Major Project of Basic and Applied Basic Research (2020B0301030009); Basic and Applied Basic Research Foundation of Guangdong Province (2020A1515011184); Guangzhou Basic and Applied Basic Research Foundation (202102020987).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this Letter are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. B. J. Redman, J. D. van der Laan, K. R. Westlake, J. W. Segal, C. F. LaCasse, A. L. Sanchez, and J. B. Wright, “Measuring resolution degradation of long-wavelength infrared imagery in fog,” Opt. Eng. 58(05), 1 (2019). [CrossRef]  

2. P. Sebbah, Waves and Imaging through Complex Media (Kluwer Academic, Netherlands, 2001).

3. E. J. McCartney, Optics of the Atmosphere: Scattering by Molecules and Particles (Wiley, 1976).

4. C. J. R. Sheppard, “Scattering and the Spatial Frequency Representation,” in Light Scattering and Nanoscale Surface Roughness, A. A. Maradudin, ed. (Springer US, Boston, MA, 2007), pp. 61–92.

5. A. Ishimaru, “Limitation on image resolution imposed by a random medium,” Appl. Opt. 17(3), 348–352 (1978). [CrossRef]  

6. D. Sadot and N. S. Kopeika, “Imaging through the atmosphere: practical instrumentation-based theory and verification of aerosol modulation transfer function,” J. Opt. Soc. Am. A 10(1), 172–179 (1993). [CrossRef]

7. Y. Kuga, A. Ishimaru, H. W. Chang, and L. Tsang, “Comparisons between the small-angle approximation and the numerical solution for radiative transfer theory,” Appl. Opt. 25(21), 3803–3805 (1986). [CrossRef]  

8. R. F. Lutomirski, “Atmospheric degradation of electrooptical system performance,” Appl. Opt. 17(24), 3915–3921 (1978). [CrossRef]  

9. Y. J. Kaufman, “Atmospheric effect on spatial resolution of surface imagery: errata,” Appl. Opt. 23(22), 4164–4172 (1984). [CrossRef]  

10. Y. J. Kaufman, “Solution of the equation of radiative transfer for remote sensing over nonuniform surface reflectivity,” J. Geophys. Res. 87(C6), 4137–4147 (1982). [CrossRef]  

11. D. J. Diner and J. V. Martonchik, “Influence of Aerosol Scattering on Atmospheric Blurring of Surface Features,” IEEE Trans. Geosci. Remote Sensing 23(5), 618–624 (1985). [CrossRef]  

12. M. T. Eismann and D. A. LeMaster, “Aerosol modulation transfer function model for passive long-range imaging over a nonuniform atmospheric path,” Opt. Eng. 52(4), 046201 (2013). [CrossRef]  

13. M. E. Hanafy, M. C. Roggemann, and D. O. Guney, “Detailed effects of scattering and absorption by haze and aerosols in the atmosphere on the average point spread function of an imaging system,” J. Opt. Soc. Am. A 31(6), 1312–1319 (2014). [CrossRef]  

14. M. E. Hanafy, M. C. Roggemann, and D. O. Guney, “Reconstruction of images degraded by aerosol scattering and measurement noise,” Opt. Eng. 54(3), 033101 (2015). [CrossRef]  

15. P. Bruscaglioni, P. Donelli, A. Ismaelli, and G. Zaccanti, “A numerical procedure for calculating the effect of a turbid medium on the MTF of an optical system,” J. Mod. Opt. 38(1), 129–142 (1991). [CrossRef]

16. P. Bruscaglioni, P. Donelli, A. Ismaelli, and G. Zaccanti, “Monte Carlo calculations of the modulation transfer function of an optical system operating in a turbid medium,” Appl. Opt. 32(15), 2813–2824 (1993). [CrossRef]  

17. P. Chervet, C. Lavigne, A. Roblin, and P. Bruscaglioni, “Effects of aerosol scattering phase function formulation on point-spread-function calculations,” Appl. Opt. 41(30), 6489–6498 (2002). [CrossRef]  

18. G. I. Marchuk, G. A. Mikhailov, M. A. Nazaraliev, R. A. Darbinjan, B. A. Kargin, and B. S. Elepov, Monte Carlo Methods in Atmospheric Optics (Springer-Verlag, 1980).

19. S. A. Prahl, M. Keijzer, S. L. Jacques, and A. J. Welch, “A Monte Carlo model of light propagation in tissue,” Proc. SPIE 10305, 1030509 (2017). [CrossRef]

20. B. Ben Dor, A. D. Devir, G. Shaviv, P. Bruscaglioni, P. Donelli, and A. Ismaelli, “Atmospheric scattering effect on spatial resolution of imaging systems,” J. Opt. Soc. Am. A 14(6), 1329–1337 (1997). [CrossRef]  

21. H. He, Y. Guan, and J. Zhou, “Image restoration through thin turbid layers by correlation with a known object,” Opt. Express 21(10), 12539–12545 (2013). [CrossRef]  

22. X. S. Xie, H. C. Zhuang, H. X. He, X. Q. Xu, H. W. Liang, Y. K. Liu, and J. Y. Zhou, “Extended depth-resolved imaging through a thin scattering medium with PSF manipulation,” Sci Rep 8(1), 8 (2018). [CrossRef]  

23. H. C. Zhuang, H. X. He, X. S. Xie, and J. Y. Zhou, “High speed color imaging through scattering media with a large field of view,” Sci Rep 6(1), 7 (2016). [CrossRef]  

24. Y. Ma, W. Liu, Y. Cui, and X. Xiong, “Multiple-scattering effects of atmosphere aerosols on light-transmission measurements,” Opt. Rev. 24(4), 590–599 (2017). [CrossRef]  

25. A. Ishimaru, Wave Propagation and Scattering in Random Media (IEEE, 1978).

26. L. R. Bissonnette, “Imaging through fog and rain,” Opt. Eng. 31(5), 1045–1052 (1992). [CrossRef]  

27. J. W. Goodman, Speckle Phenomena in Optics: Theory and Applications (SPIE, 2020).

28. I. Freund, M. Rosenbluh, and S. Feng, “Memory effects in propagation of optical waves through disordered media,” Phys. Rev. Lett. 61(20), 2328–2331 (1988). [CrossRef]  

29. S. Feng, C. Kane, P. A. Lee, and A. Douglas Stone, “Correlations and fluctuations of coherent wave transmission through disordered media,” Phys. Rev. Lett. 61(7), 834–837 (1988). [CrossRef]  

30. K. Kulpa and Z. Czekala, “Masking effect and its removal in PCL radar,” IEE Proc., Radar Sonar Navig. 152(3), 174–178 (2005). [CrossRef]

31. M. Naldi, “False alarm control and self-masking avoidance by a biparametric clutter map in a mixed interference environment,” IEE Proc., Radar Sonar Navig. 146(4), 195–200 (1999). [CrossRef]  

32. S. G. Narasimhan and S. K. Nayar, “Shedding light on the weather,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2003), pp. 665–672.

33. Z. S. Liu, Y. F. Yu, K. L. Zhang, and H. L. Huang, “Underwater image transmission and blurred image restoration,” Opt. Eng. 40(6), 1125–1131 (2001). [CrossRef]  

34. Y. R. Huang, Y. K. Liu, H. S. Liu, Y. Y. Shui, G. W. Zhao, J. H. Chu, G. H. Situ, Z. B. Li, J. Y. Zhou, and H. W. Liang, “Multi-View Optical Image Fusion and Reconstruction for Defogging without a Prior In-Plane,” Photonics 8(1), 10 (2021). [CrossRef]  

35. W. D. Michael and A. Mortimer, “Köhler Illumination,” Olympus LS, https://www.olympus-lifescience.com.cn/en/microscope-resource/primer/anatomy/kohler/.

