Optica Publishing Group

Phase retrieval using hologram transformation with U-Net in digital holography

Open Access

Abstract

Digital holography is a method of recording the light waves emitted from an object as holograms and then reconstructing those holograms with light wave propagation calculations to observe the object in three dimensions. A problem with digital holography, however, is that unwanted images, such as conjugate images, are superimposed on the observed image when the hologram is reconstructed. In particular, the superimposition of conjugate light on the observed image arises because the imaging device records only the intensity distribution of the light, not its phase distribution. In digital holography, it has been shown that unwanted light can be eliminated by the phase-shift method, but this method is difficult to apply to digital holographic microscopy (DHM), which captures the light intensity in a single shot. Meanwhile, deep learning has been actively studied in recent years for image-related problems such as image transformation, and a method combining digital holography and deep learning has been proposed that removes conjugate images by applying a learned image transformation to the reconstructed image of a hologram. In this study, we generated pairs of intensity-only holograms and complex-amplitude holograms by simulating light wave propagation, trained a U-Net to perform an image transformation that adds phase information to the intensity-only hologram, and thereby propose a method for phase retrieval and conjugate image removal of holograms using the learned U-Net. To verify the effectiveness of the proposed method, we evaluated the image quality of the reconstructed images of holograms before and after processing by the U-Net. Results showed that the peak signal-to-noise ratio (PSNR) increased by 8.37 dB in amplitude and 9.06 dB in phase, and the structural similarity index (SSIM) increased by 0.0566 in amplitude and 0.0143 in phase. Furthermore, applying the proposed method to holograms captured by an actual digital holography optical system confirmed its effectiveness in eliminating conjugate images from the reconstructed images. These results show that the proposed method can perform phase retrieval of holograms in a single shot without a complex optical system, which is expected to contribute to portable DHMs and other applications that require compact and simple optical systems.

© 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Digital holography [1] is a method that records light waves emitted from an object as holograms and reconstructs the holograms [2] using light wave propagation calculations to observe the object in three dimensions. Since digital holography can simultaneously obtain the amplitude and phase information of a sample in a single shot and can observe the three-dimensional (3D) shape of transparent objects (phase objects) and moving objects in a non-contact, noninvasive manner, it is expected to find applications in fields such as bio-imaging.

However, when the hologram is reconstructed to obtain the observed image, undesired images such as direct light, out-of-focus images, and conjugate images are superimposed on the desired image formed by the object light. This is a well-known problem in digital holography. In particular, the superimposition of conjugate light on the observed image is caused by the image sensor recording only the intensity distribution of the light and not its phase distribution. Many methods have therefore been proposed to solve the conjugate image problem. For example, an off-axis optical system [3] can be used to separate the object light from the other unwanted light. Although this method can reconstruct the wavefront of the object light from a single hologram, it narrows the range of object light that can be recorded. Another approach is to record multiple reference lights with different phase-shift amounts [4–6] or multiple holograms in different planes [7–9] and obtain the phase distribution of the light from them, but this approach does not apply to moving objects because it records holograms multiple times. Optical scanning holography (OSH) [10,11], which uses a single-pixel sensor and a scanning mechanism to obtain phase-shifted holograms, likewise involves multiple measurements and is not applicable to moving objects. The parallel phase-shift method [12] and related methods have been proposed to solve this problem, but they require special imaging equipment.

Meanwhile, deep learning has been actively studied in recent years for image-related problems; object detection [13], colorization of monochrome images [14], super-resolution [15], image generation [16,17], and image transformation [18] are a few examples. Several studies using deep learning have also been reported in the field of holography.

Ren et al. [19,20] and Shimobaba et al. [21] proposed using convolutional neural networks (CNNs) to automatically calculate the propagation distance during hologram reconstruction. Wu et al. [22] extended the depth of field of the reconstructed image, while Ren et al. [23], Wang et al. [24], and Wang et al. [25] proposed using deep learning for hologram reconstruction itself. Furthermore, Shimobaba et al. [26] successfully reconstructed the distribution of particles recorded in a hologram, and Zhang et al. [27] and Rivenson et al. [28] proposed using deep learning for phase retrieval of holograms. In Refs. [27,28], an image transformation learned by deep learning is applied to the reconstructed image to remove conjugate images and perform phase retrieval. Phase retrieval is needed in the first place, however, only because the image sensor records just the intensity distribution of the light. It should therefore also be possible to remove the conjugate image and perform phase retrieval by using deep learning to assign a phase distribution directly to the intensity distribution recorded as the hologram.

In this study, we aimed to obtain conjugate-free reconstructed images in digital holography with a simple optical system that requires no special imaging device. Pairs of intensity-only holograms and complex-amplitude holograms were generated by simulating light wave propagation, and the image transformation between them was learned by a U-Net [29]. We then investigated whether phase retrieval and conjugate image removal of holograms are possible using the learned U-Net.

2. Principle

2.1 Digital holography

Digital holography is a method of recording light waves emitted from an object as interference fringes (holograms) with a reference light. The recorded hologram is reconstructed by calculating light wave propagation. Figure 1 shows an optical system of the in-line transmission type (Gabor-type), a common recording optical system for digital holography.

Fig. 1. In-line transmission (Gabor-type) optical system

In the Gabor-type optical system, the light emitted from the laser source and diffracted by the observed object becomes the object light, while the light that passes the object undiffracted becomes the reference light. If the reference light is R and the object light is O, the intensity distribution I of the light on the hologram plane of the image sensor is given by Eq. (1).

$$I = |O + R|^2 = |O|^2 + |R|^2 + OR^{\ast} + O^{\ast}R \tag{1}$$

The light intensity distribution I obtained here is the hologram recorded by the optical system. The first and second terms in Eq. (1) represent the direct light components, the third term the object light component, and the fourth term the conjugate light component. The image produced by the conjugate light component is called the conjugate image; it is superimposed on the image formed by the object light and degrades the quality of the reconstructed image of the hologram.

Now consider the case where not only the intensity distribution I but also the complex amplitude distribution C of the light on the hologram plane is obtained. The complex amplitude distribution C is given by Eq. (2).

$$C = O + R \tag{2}$$

Since the complex amplitude distribution C of the light on the hologram plane is simply the sum of the object light O and the reference light R, it contains no conjugate light component, and no conjugate image is superimposed on the reconstructed image of the hologram.
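As a concrete illustration, the sketch below reconstructs a hologram with the angular spectrum method and shows why the conjugate term in Eq. (1) produces a twin image while Eq. (2) does not. This is a minimal NumPy sketch, not the authors' code; the helper name `angular_spectrum` and all parameter values are our own placeholders.

```python
import numpy as np

def angular_spectrum(field, wavelength, pitch, z):
    """Propagate a complex field over distance z by the angular spectrum method."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)  # spatial frequencies [1/m]
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))  # evanescent components clipped
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

# With O the object light and R a unit plane-wave reference on the hologram plane:
#   I = |O + R|^2  contains the conjugate term O*R (Eq. (1)),
#   C = O + R      does not (Eq. (2)).
# Back-propagating each to the object plane (z -> -z) shows the difference:
#   recon_I = angular_spectrum(I.astype(complex), wavelength, pitch, -z)  # twin image
#   recon_C = angular_spectrum(C, wavelength, pitch, -z)                  # no twin image
```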

2.2 Neural network architecture

In this study, U-Net was used as the neural network architecture, as shown in Fig. 2. U-Net is an architecture originally proposed for image segmentation and is now widely used as a general image processing model. Skip connections link the corresponding layers of the encoder and decoder, so the features obtained at each encoder layer are passed directly to the decoder. This enables image transformation while preserving the features of the input image.

Fig. 2. U-Net convolutional neural network model

In the training phase, the CNN is given a dataset of image pairs before and after the target processing. It iteratively updates the weights and biases of the network with the adaptive moment estimation (Adam) optimizer [30] so as to minimize the L1 distance between the post-processed target image and the image the network outputs from the pre-processed input. In this way, the CNN learns the image transformation.
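The update described above can be written compactly; the following is a minimal training-step sketch assuming PyTorch, where `unet` and `train_loader` are hypothetical placeholders for the network of Fig. 2 and a loader yielding (intensity-only, complex-amplitude) image pairs, and the learning rate is an assumed value not given in the paper.

```python
import torch
import torch.nn as nn

criterion = nn.L1Loss()                                   # L1 norm between images
optimizer = torch.optim.Adam(unet.parameters(), lr=1e-4)  # Adam optimizer [30]

for intensity_holo, complex_holo in train_loader:  # pre-/post-processing pairs
    optimizer.zero_grad()
    pred = unet(intensity_holo)           # network output for the pre-processed image
    loss = criterion(pred, complex_holo)  # distance to the post-processed target
    loss.backward()                       # backpropagation
    optimizer.step()                      # update weights and biases
```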

3. Dataset generation

In this section, we describe how to generate the datasets for training the U-Net. This study aims to achieve phase retrieval through an image transformation from the intensity distribution I of light on the hologram plane, described in Section 2.1, to the complex amplitude distribution C of the light.

3.1 Treatment of complex amplitude on U-Net

Since the input and output of the U-Net are red, green, blue (RGB) images, complex amplitudes cannot be handled directly. Therefore, to avoid the effect of phase wrapping, the amplitude is stored in one channel and the phase in two channels, as cos(phase) and sin(phase). Combined, the complex amplitude is treated as a three-channel RGB image, as shown in Fig. 3.

Fig. 3. Data format for complex amplitude
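The mapping of Fig. 3 can be sketched as follows; this is a minimal NumPy illustration under our own naming, not the authors' code. Encoding the phase as the cos/sin pair keeps the representation continuous across the ±π boundary, and `arctan2` recovers the phase unambiguously on decoding.

```python
import numpy as np

def complex_to_channels(c):
    """Complex field -> (H, W, 3): [amplitude, cos(phase), sin(phase)]."""
    phase = np.angle(c)
    return np.stack([np.abs(c), np.cos(phase), np.sin(phase)], axis=-1)

def channels_to_complex(img):
    """Inverse mapping; arctan2 avoids phase-wrapping ambiguity."""
    amp, cos_p, sin_p = img[..., 0], img[..., 1], img[..., 2]
    return amp * np.exp(1j * np.arctan2(sin_p, cos_p))
```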

3.2 Generating holograms for the dataset

This section describes the method for generating the holograms that serve as the dataset. In this study, the amplitude and phase components of the observed object's complex amplitude on the object plane were generated by taking the logical product of a randomly generated mask image and an image that serves as the texture (Fig. 4(a)). From these complex amplitudes on the object plane, the propagation of light was calculated to generate the complex amplitude of the light on the hologram plane. The holograms were then divided into pieces of a size that can be input to the U-Net, and intensity-only holograms were paired with complex-amplitude holograms to form the dataset (Fig. 4(b)). A hologram records the light from the observed object blurred and spread out by diffraction, so parameters such as the light propagation distance have a greater impact on the hologram pattern than the pattern of the observed object itself. Therefore, phase retrieval is believed to be possible even for a dataset generated solely by simulation, without using information about the observed object obtained from optical experiments.

Fig. 4. (a) Procedure for generating the complex amplitude of the observed object on the object plane. (b) Procedure for generating the hologram for the dataset from the complex amplitude of the observed object on the object plane
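A minimal sketch of this generation procedure follows, reusing the `angular_spectrum` and `complex_to_channels` helpers sketched above. The masking step and the unit plane-wave reference are our reading of Fig. 4; all names and values are illustrative assumptions, not the authors' code.

```python
import numpy as np

def make_object_field(texture_amp, texture_phase, mask):
    """Mask texture images to form the object-plane complex amplitude (Fig. 4(a))."""
    return (texture_amp * mask) * np.exp(1j * texture_phase * mask)

def make_training_pair(obj_field, wavelength, pitch, z):
    """Propagate to the hologram plane and return an (I, C) pair (Fig. 4(b))."""
    O = angular_spectrum(obj_field, wavelength, pitch, z)  # object light
    C = O + 1.0                       # add unit plane-wave reference, Eq. (2)
    I = np.abs(C) ** 2                # intensity-only hologram, Eq. (1)
    return I, complex_to_channels(C)  # network input/target before tiling
```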

3.3 Sub-hologram

The input of the U-Net used in this study is a 512 × 512 pixel RGB image. To perform phase retrieval of holograms larger than this, the hologram is divided into 512 × 512 pixel sub-holograms that are input to the U-Net, as shown in Fig. 5. After the image transformation, a central 256 × 256 pixel region is extracted from each U-Net output. Because the borders of a sub-hologram contain incomplete interference fringes, predictions near the borders are unreliable; extracting only the central regions allows the sub-holograms to be stitched together smoothly.

Fig. 5. (a) Sub-hologram. (b) Phase retrieval of large holograms by laying out sub-holograms
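The tiling scheme of Fig. 5 might look as follows; a minimal sketch assuming the hologram height and width are multiples of 256 pixels, with `predict` standing in for the trained U-Net (512 × 512 input, 512 × 512 × 3 output).

```python
import numpy as np

def tiled_phase_retrieval(holo, predict, tile=512, core=256):
    """Apply `predict` tile-by-tile, keeping only each tile's central region."""
    pad = (tile - core) // 2                    # 128-pixel margin on each side
    padded = np.pad(holo, pad, mode="reflect")  # so border tiles are full-sized
    h, w = holo.shape
    out = np.zeros((h, w, 3), dtype=np.float32)
    for y in range(0, h, core):
        for x in range(0, w, core):
            sub = padded[y:y + tile, x:x + tile]  # 512 x 512 sub-hologram
            pred = predict(sub)                   # 512 x 512 x 3 prediction
            out[y:y + core, x:x + core] = pred[pad:pad + core, pad:pad + core]
    return out                                  # stitched 3-channel result
```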

4. Experiment

Table 1 summarizes the experimental conditions.

Table 1. Experimental conditions

The “Batch size” in Table 1 is the number of samples input to the U-Net at one time. The samples in the dataset were generated by the simulation method described in Section 3.2 and divided into 512 × 512 pixel sub-holograms. For the test dataset, each of the 11 large holograms was divided into 108 sub-holograms for phase retrieval.

4.1 Phase retrieval of test dataset

The amplitude and phase distributions of the reconstructed image of the amplitude-only hologram are shown in Figs. 6(a) and 6(b), respectively. The amplitude and phase distributions of the reconstructed image of a hologram with phase retrieval by the learned U-Net are shown in Figs. 6(c) and 6(d), and those of the reconstructed image of a complex-amplitude hologram in Figs. 6(e) and 6(f). In this section, we verify the effectiveness of the proposed method by comparing the image quality with phase retrieval by the proposed method (Figs. 6(c) and 6(d)) and without phase retrieval (Figs. 6(a) and 6(b)), taking Figs. 6(e) and 6(f) as the ground truth.

Fig. 6. (a) Amplitude distribution of the reconstructed image of an amplitude-only hologram. (b) Phase distribution of the reconstructed image of an amplitude-only hologram. (c) Amplitude distribution of the reconstructed image of a hologram with phase retrieval on a learned U-Net (proposed method). (d) Phase distribution of the reconstructed image of a hologram with phase retrieval on a learned U-Net (proposed method). (e) Amplitude distribution of the reconstructed image of a complex amplitude hologram (ground truth). (f) Phase distribution of the reconstructed image of a complex amplitude hologram (ground truth)

The results of the image quality evaluation using PSNR and SSIM [31] as metrics are shown in Table 2.

Table 2. Image quality evaluation of the reconstructed images

Figure 6(a) shows a blurred image surrounding the object; this blur is the conjugate image. Since it is not visible in Fig. 6(c), we can confirm that the proposed method performs phase retrieval of the hologram. Comparing the cases without and with phase retrieval by the proposed method, the PSNR increased by 7.68 dB in amplitude and 10.77 dB in phase, and the SSIM increased by 0.0367 in amplitude and 0.0066 in phase. The image quality of the reconstructed images was further evaluated on ten additional sets of similar test data. On average, the PSNR increased by 8.37 dB in amplitude and 9.06 dB in phase, and the SSIM increased by 0.0566 in amplitude and 0.0143 in phase. These results confirm the effectiveness of the proposed method for phase retrieval of holograms.
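For reference, metrics like those in Table 2 can be computed with standard tools; a minimal sketch assuming scikit-image, where `recon` and `truth` are corresponding amplitude (or phase) images normalized to [0, 1].

```python
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

psnr = peak_signal_noise_ratio(truth, recon, data_range=1.0)  # in dB
ssim = structural_similarity(truth, recon, data_range=1.0)    # dimensionless
print(f"PSNR: {psnr:.2f} dB, SSIM: {ssim:.4f}")
```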

4.2 Phase retrieval of holograms taken with the actual optical system

In this section, phase retrieval is performed on holograms taken with the actual optical system. The actual optical system used to take the hologram is shown in Fig. 7. The experimental conditions of the optical system (hologram resolution, hologram sampling pitch, laser wavelength, and propagation distance $Z$) are the same as those in Table 1.

Fig. 7. The actual optical system used to take holograms

The results are shown in Fig. 8. The amplitude and phase distributions of the reconstructed image of the amplitude-only hologram are shown in Figs. 8(a) and 8(b). Figures 8(c) and 8(d) show the amplitude and phase distributions of the reconstructed image of a hologram with phase retrieval by the proposed method. For comparison, Figs. 8(e) and 8(f) show the amplitude and phase distributions of the reconstructed image of a hologram with phase retrieval by an iterative optimization algorithm using multiple holograms [7–9].

Fig. 8. (a) Amplitude distribution of the reconstructed image of the amplitude-only hologram. (b) Phase distribution of the reconstructed image of the amplitude-only hologram. (c) Amplitude distribution of the reconstructed image of a hologram with phase retrieval on a learned U-Net (proposed method). (d) Phase distribution of the reconstructed image of a hologram with phase retrieval on a learned U-Net (proposed method). (e) Amplitude distribution of the reconstructed image of the holograms with phase retrieval by iterative optimization algorithm with multiple holograms. (f) Phase distribution of the reconstructed image of the holograms with phase retrieval by iterative optimization algorithm with multiple holograms

As in Fig. 6(a), Fig. 8(a) shows a blurred image surrounding the object. This blur is removed in Figs. 8(c) and 8(e), where phase retrieval was performed, confirming that phase retrieval removes the conjugate image for holograms captured by the optical system. In Fig. 8(b), conjugate images are likewise observed surrounding the object in the phase distribution, whereas they are removed in Figs. 8(d) and 8(f). These results indicate that phase retrieval is effective in removing the conjugate image from the reconstructed image. The iterative optimization algorithm needs multiple holograms to perform phase retrieval, whereas the proposed method needs only one, which is an advantage of the proposed method.

5. Conclusion

We have proposed a method of phase retrieval and conjugate image removal for digital holography that adds phase information to intensity-only holograms using a U-Net. To verify the effectiveness of the proposed method, we evaluated the image quality of the reconstructed holograms before and after processing by the U-Net on a test dataset. Results showed that the PSNR increased by 8.37 dB in amplitude and 9.06 dB in phase, and the SSIM increased by 0.0566 in amplitude and 0.0143 in phase.

Furthermore, applying the proposed method to holograms taken with an actual digital holography optical system confirmed its effectiveness in removing conjugate images from the reconstructed images. For comparison, phase retrieval using an iterative optimization algorithm with multiple holograms was also performed and showed almost the same conjugate image removal effect as the proposed method. However, the iterative optimization algorithm requires multiple holograms, whereas the proposed method performs phase retrieval with only one hologram, which is its advantage.

These results show that the proposed method can perform phase retrieval of holograms in a single shot without the need for a complicated optical system. This method is expected to contribute to the field of portable digital holographic microscopy, where a compact and simple optical system is required.

Funding

Japan Society for the Promotion of Science (21K17760).

Acknowledgments

The authors would like to thank Maruzen-Yushodo Co., Ltd. (https://kw.maruzen.co.jp/kousei-honyaku/) for the English language editing.

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time, but may be obtained from the authors upon reasonable request.

References

1. D. Gabor, “A new microscopic principle,” Nature 161(4098), 777–778 (1948). [CrossRef]  

2. M. A. Kronrod, N. S. Merzlyakov, and L. P. Yaroslavski, “Reconstruction of a hologram with a computer,” Sov. Phys. Tech. Phys. 17, 333–334 (1972).

3. U. Schnars and W. Jüptner, “Direct recording of holograms by a CCD target and numerical reconstruction,” Appl. Opt. 33(2), 179–181 (1994). [CrossRef]  

4. I. Yamaguchi and T. Zhang, “Phase-shifting digital holography,” Opt. Lett. 22(16), 1268–1270 (1997). [CrossRef]  

5. X. F. Meng, L. Z. Cai, X. F. Xu, X. L. Yang, X. X. Shen, G. Y. Dong, and Y. R. Wang, “Two-step phase-shifting interferometry and its application in image encryption,” Opt. Lett. 31(10), 1414–1416 (2006). [CrossRef]  

6. J. P. Liu and T. C. Poon, “Two-step-only quadrature phase-shifting digital holography,” Opt. Lett. 34(3), 250–252 (2009). [CrossRef]  

7. Y. Zhang and X. Zhang, “Reconstruction of a complex object from two in-line holograms,” Opt. Express 11(6), 572–578 (2003). [CrossRef]  

8. Y. Zhang, G. Pedrini, W. Osten, and H. J. Tiziani, “Reconstruction of in-line digital holograms from two intensity measurements,” Opt. Lett. 29(15), 1787–1789 (2004). [CrossRef]  

9. G. Situ, J. P. Ryle, U. Gopinathan, and J. T. Sheridan, “Generalized in-line digital holographic technique based on intensity measurements at two different planes,” Appl. Opt. 47(5), 711–717 (2008). [CrossRef]  

10. N. Yoneda, Y. Saita, and T. Nomura, “Motionless optical scanning holography,” Opt. Lett. 45(12), 3184–3187 (2020). [CrossRef]  

11. S. Jiao, P. W. Tsang, T. C. Poon, J. P. Liu, W. Zou, and X. Li, “Enhanced autofocusing in optical scanning holography based on hologram decomposition,” IEEE Trans. Ind. Inf. 13(5), 2455–2463 (2017). [CrossRef]  

12. Y. Awatsuji, M. Sasada, and T. Kubota, “Parallel quasi-phase-shifting digital holography,” Appl. Phys. Lett. 85(6), 1069–1071 (2004). [CrossRef]  

13. W. Liu, D. Anguelov, D. Erhan, C. Szegedy, S. Reed, C. Y. Fu, and A. C. Berg, “SSD: single shot multibox detector,” arXiv:1512.02325v5 (2016).

14. S. Iizuka, E. Simo-Serra, and H. Ishikawa, “Let there be color!: joint end-to-end learning of global and local image priors for automatic image colorization with simultaneous classification,” ACM Trans. Graph. 35(4), 110:1–110:11 (2016).

15. C. Dong, C. C. Loy, K. He, and X. Tang, “Image super-resolution using deep convolutional networks,” IEEE Trans. Pattern Anal. Mach. Intell. 38(2), 295–307 (2016).

16. T. Karras, S. Laine, and T. Aila, “A style-based generator architecture for generative adversarial networks,” IEEE Trans. Pattern Anal. Mach. Intell. 43(12), 4217–4228 (2021). [CrossRef]  

17. I. J. Goodfellow, J. Pouget-Abadie, M. Mirza, B. Xu, D. Warde-Farley, S. Ozair, A. Courville, and Y. Bengio, “Generative adversarial networks,” arXiv:1406.2661v1 (2014).

18. P. Isola, J. Zhu, T. Zhou, and A. A. Efros, “Image-to-image translation with conditional adversarial networks,” arXiv:1611.07004v3 (2016).

19. Z. Ren, Z. Xu, and E. Y. Lam, “Learning-based nonparametric autofocusing for digital holography,” Optica 5(4), 337–344 (2018). [CrossRef]  

20. Z. Ren, Z. Xu, and E. Y. Lam, “Autofocusing in digital holography using deep learning,” in Three-Dimensional and Multidimensional Microscopy: Image Acquisition and Processing XXV (International Society for Optics and Photonics, 2018), Vol. 10499, p. 104991V.

21. T. Shimobaba, T. Kakue, and T. Ito, “Convolutional neural network-based regression for depth prediction in digital holography,” in 2018 IEEE 27th International Symposium on Industrial Electronics (ISIE) (IEEE, 2018), pp. 1323–1326.

22. Y. Wu, Y. Rivenson, Y. Zhang, Z. Wei, H. Günaydin, X. Lin, and A. Ozcan, “Extended depth-of-field in holographic imaging using deep-learning-based autofocusing and phase recovery,” Optica 5(6), 704–710 (2018). [CrossRef]  

23. Z. Ren, Z. Xu, and E. Y. M. Lam, “End-to-end deep learning framework for digital holographic reconstruction,” Adv. Photonics 1(01), 1 (2019). [CrossRef]  

24. H. Wang, M. Lyu, and G. Situ, “eHoloNet: a learning-based end-to-end approach for in-line digital holographic reconstruction,” Opt. Express 26(18), 22603–22614 (2018). [CrossRef]  

25. K. Wang, J. Dou, Q. Kemao, J. Di, and J. Zhao, “Y-Net: a one-to-two deep learning framework for digital holographic reconstruction,” Opt. Lett. 44(19), 4765–4768 (2019). [CrossRef]  

26. T. Shimobaba, T. Takahashi, Y. Yamamoto, Y. Endo, A. Shiraki, T. Nishitsuji, N. Hoshikawa, T. Kakue, and T. Ito, “Digital holographic particle volume reconstruction using a deep neural network,” Appl. Opt. 58(8), 1900–1906 (2019). [CrossRef]  

27. G. Zhang, T. Guan, Z. Shen, X. Wang, T. Hu, D. Wang, Y. He, and N. Xie, “Fast phase retrieval in off-axis digital holographic microscopy through deep learning,” Opt. Express 26(15), 19388–19405 (2018). [CrossRef]

28. Y. Rivenson, Y. Zhang, H. Günaydın, D. Teng, and A. Ozcan, “Phase recovery and holographic image reconstruction using deep learning in neural networks,” Light: Sci. Appl. 7(2), 17141 (2018). [CrossRef]  

29. O. Ronneberger, P. Fischer, and T. Brox, “U-Net: convolutional networks for biomedical image segmentation,” arXiv:1505.04597v1 (2015).

30. D. Kingma and J. Ba, “Adam: a method for stochastic optimization,” in International Conference on Learning Representations (ICLR) (2015), pp. 1–15.

31. Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, “Image quality assessment: from error visibility to structural similarity,” IEEE Trans. on Image Process. 13(4), 600–612 (2004). [CrossRef]  

