
Inline digital holographic movie based on a double-sideband filter


Abstract

This Letter proposes a new optical architecture for inline digital holography based on a double-sideband filter applied at the Fourier plane. The proposed architecture not only allows removal of the conjugate images in the reconstruction process but also reduces the distortions that usually appear when a single-sideband filter is used. We first introduce the mathematical model that explains the method and then describe the optical setup used for the implementation. The optical system includes a parallel-aligned liquid crystal display placed at the Fourier plane that, when properly combined with linear polarizers, simultaneously filters the positive and negative frequencies. This feature makes the device useful for registering dynamic processes. Finally, we tested the setup by registering a holographic movie of microscopic moving objects placed at different planes.

© 2015 Optical Society of America

Holograms are intensity or phase distributions that arise from the interference between the light transmitted (or reflected) by an object and a given reference beam. In the reconstruction process, a reconstruction beam coherently illuminates the hologram, and the optical field diffracted by the hologram is formed by three different contributions: a direct wave (proportional to the reference beam), an object conjugate wave (real image of the object), and an optical field proportional to the object wave (virtual image of the object). Depending on the optical arrangement used for the registration and reconstruction of holograms, the visualization of the object virtual image may be degraded by defocused images related to the object conjugate wave.

Current advances achieved in optoelectronic devices, such as spatial light modulators and detectors (CCD, CMOS), make digital holography a technique widely used in a large number of applications [1,2]. In digital holography, the interference between the object and the reference wave fronts is recorded onto a pixelated optoelectronic sensor. Afterward, this recorded image is used for the reconstruction of the object wave by using numerical methods [3,4]. Two main optical arrangements are used for digital holography: off-axis (OA) holography [5] and inline (IL) holography [6]. In OA holography, proposed by Leith and Upatnieks, the reference and the object beams have a relative angle at the registration plane. In this way, the object, the conjugate, and the reconstruction waves propagate along different directions and can be observed separately in the reconstruction. Thus, the real image can be selected by spatially filtering the reconstructed planes [7,8].

However, OA configurations require a recording medium with high spatial resolution, and they are more sensitive to vibrations and air flows than IL configurations. In IL configurations, the object virtual image (related to the object wave) and the object real image (related to the object conjugate wave) overlap in the reconstruction process, so they usually cannot be observed separately.

Several techniques have been proposed to remove the unwanted conjugate image in hologram reconstruction. For instance, a spatial filtering of reconstruction planes of digital holograms was proposed in [9]. To achieve this goal, the desired digitally reconstructed image is isolated from its surrounding pixels, but the selected area still contains noise from the undesired conjugate image. Another method to remove the influence of conjugate images was proposed in [10], where the hologram was recorded in the far field of the object. In such a situation, when the hologram is reconstructed, the object image is focused but the conjugate image appears strongly defocused, which makes it appear as background noise.

Reference [11] proposes another interesting approach, based on the phase-shifting technique. In this approach, a CCD camera captures a set of interferograms obtained by illuminating the object with different reference waves related to different phase shifts. These phase shifts are in general performed through small displacements of a mirror mounted on a piezoelectric device. Although this technique is well suited to IL applications, the time-sequential acquisition required to capture the different interferograms makes it difficult to apply to dynamic processes. In addition, the configuration used in this technique is sensitive to vibrations and air flows.

An alternative technique, known as single-sideband (SSB) holography, was originally proposed by Bryngdahl and Lohmann [12]. Due to its simplicity, it can also be applied to dynamic objects. The method blocks half of the spatial frequency spectrum at the Fourier plane during the recording of the hologram; the other sideband of the spectrum is then blocked during the reconstruction step. The original technique was improved and adapted to digital holography [13,14] and has already been experimentally tested for fluid velocimetry [15]. To this end, a collimated beam illuminates the particles and their distribution is imaged onto a CCD using a convergent lens. At the focal plane of the lens, half of the spectrum is blocked by a knife edge. Consequently, a slight deformation of the image is observed in the reconstruction process because not all of the information of the wavefront was registered.

In this work, we propose a digital IL holographic method that provides reconstructed images free of distortions and uncorrupted by the conjugate ones. The technique exploits the idea of compensating the deformations originated by a single-sideband filter by simultaneously using double-sideband (DSB) filtering. After providing the proof of concept of the DSB filtering, a particular experimental setup, based on a liquid crystal on silicon (LCoS) display, was implemented and tested to track particles placed at different planes.

First we will describe the technique by analyzing light propagation through the IL configuration shown in Fig. 1.

Fig. 1. Inline configuration used to illustrate the technique based on sideband filters.

Figure 1 shows two different particles placed at two different planes, labeled P1 and P2, respectively. The system is illuminated by a plane wave that is diffracted by the particles, and the Fourier spectrum is obtained at the focal plane of the convergent lens L1 [solid lines in Fig. 1]. In addition, L1 also images an intermediate plane Pint onto the CCD camera [dashed lines in Fig. 1]. Let us denote by U_o(x,y) the complex electric field at the CCD camera that would be obtained if there were no filter at the Fourier plane.

By assuming that we are dealing with an almost transparent object, we can write

$$U_o(x,y) = 1 + \Delta U_o(x,y), \tag{1}$$

where $\Delta U_o$ represents the contribution of the light diffracted by the particles. In this situation, the amplitude distribution at the Fourier plane can be calculated by Fourier transforming Eq. (1), which leads to

$$\tilde{U}_o(\mu,\nu) = \delta(\mu,\nu) + \Delta\tilde{U}_o(\mu,\nu), \tag{2}$$

where $\delta$ denotes the Dirac delta function.

In turn, if a single-sideband filter is introduced at the Fourier plane to block the frequencies $\mu < 0$, the amplitude distribution at the CCD plane becomes

$$U_{\mathrm{CCD}}^{+} = \frac{1}{2} + \int_{0}^{\infty} d\mu \int_{-\infty}^{\infty} \Delta\tilde{U}_o(\mu,\nu)\, \exp[i2\pi(\mu x + \nu y)]\, d\nu, \tag{3}$$

and the corresponding intensity distribution registered by the CCD is

$$I_{\mathrm{CCD}}^{+} = \left|U_{\mathrm{CCD}}^{+}\right|^{2} \approx \frac{1}{4} + \frac{1}{2}\underbrace{\int_{0}^{\infty} d\mu \int_{-\infty}^{\infty} \Delta\tilde{U}_o(\mu,\nu)\, e^{i2\pi(\mu x + \nu y)}\, d\nu}_{A^{+}} + \frac{1}{2}\underbrace{\int_{0}^{\infty} d\mu \int_{-\infty}^{\infty} \Delta\tilde{U}_o^{*}(\mu,\nu)\, e^{-i2\pi(\mu x + \nu y)}\, d\nu}_{B^{+}}, \tag{4}$$

where the asterisk (*) denotes complex conjugation. Note that the fourth term of the squared modulus can be neglected because of the assumption of a quasi-transparent object.

Whereas the second term ($A^{+}$) in Eq. (4) only contains frequencies $\mu \geq 0$, the third term ($B^{+}$) only contains frequencies $\mu \leq 0$. Afterward, the Fourier transform of the intensity registered at the CCD [Eq. (4)] can be digitally calculated and the frequencies $\mu < 0$ removed, which allows us to eliminate the $B^{+}$ term (i.e., the unwanted conjugate term). In this situation Eq. (4) becomes

$$\left|U_{\mathrm{CCD}}^{+}\right|^{2}_{-} = \frac{1}{4} + \frac{1}{2}\int_{0}^{\infty} d\mu \int_{-\infty}^{\infty} \Delta\tilde{U}_o(\mu,\nu)\, e^{i2\pi(\mu x + \nu y)}\, d\nu, \tag{5}$$

where the subscript (−) indicates that the frequencies $\mu < 0$ have been digitally removed. At this point, a second single-sideband filter is placed at the Fourier plane, but this time it is intended to block the frequencies $\mu > 0$.

By performing this action, we obtain an expression similar to Eq. (5):

$$\left|U_{\mathrm{CCD}}^{-}\right|^{2}_{+} = \frac{1}{4} + \frac{1}{2}\int_{-\infty}^{0} d\mu \int_{-\infty}^{\infty} \Delta\tilde{U}_o(\mu,\nu)\, e^{i2\pi(\mu x + \nu y)}\, d\nu, \tag{6}$$

where the subscript (+) indicates that the frequencies $\mu > 0$ have been digitally removed.

Finally, by adding Eqs. (5) and (6) we obtain

$$\left|U_{\mathrm{CCD}}^{+}\right|^{2}_{-} + \left|U_{\mathrm{CCD}}^{-}\right|^{2}_{+} = \frac{1}{2} + \frac{1}{2}\int_{0}^{\infty} d\mu \int_{-\infty}^{\infty} \Delta\tilde{U}_o(\mu,\nu)\, e^{i2\pi(\mu x + \nu y)}\, d\nu + \frac{1}{2}\int_{-\infty}^{0} d\mu \int_{-\infty}^{\infty} \Delta\tilde{U}_o(\mu,\nu)\, e^{i2\pi(\mu x + \nu y)}\, d\nu = \frac{1}{2} + \frac{1}{2}\Delta U_o(x,y) = \frac{1}{2} U_o(x,y). \tag{7}$$

In this way the full complex amplitude, without the contribution of the unwanted conjugate waves, is obtained. From this information, the wavefront can be reconstructed at any arbitrary position by using the Rayleigh–Sommerfeld diffraction integral [16].
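For readers who want to reproduce the digital part of the procedure, the following minimal NumPy sketch implements the processing summarized by Eqs. (4)–(7), assuming that the two recorded single-sideband holograms are already available as real-valued arrays. The function names, the FFT-based discretization, and the handling of the shared μ = 0 column are our own choices and are not taken from the Letter.

```python
import numpy as np

def remove_half_spectrum(intensity, keep="positive"):
    """Fourier transform a recorded intensity and zero one half-plane of
    horizontal frequencies, then transform back. This is the digital step
    that turns Eq. (4) into Eq. (5) (keep="positive") or Eq. (6) ("negative")."""
    spectrum = np.fft.fftshift(np.fft.fft2(intensity))
    ny, nx = spectrum.shape
    mu = np.arange(nx) - nx // 2          # horizontal frequency index, zero at the center
    mask = (mu >= 0) if keep == "positive" else (mu <= 0)
    # The mu = 0 column is kept in both halves, following Eqs. (5) and (6).
    spectrum *= mask[np.newaxis, :]
    return np.fft.ifft2(np.fft.ifftshift(spectrum))

def dsb_reconstruction(i_upper, i_lower):
    """Combine the two complementary holograms into the compensated complex
    amplitude, proportional to U_o(x, y), as expressed by Eq. (7)."""
    u_plus = remove_half_spectrum(i_upper, keep="positive")    # Eq. (5)
    u_minus = remove_half_spectrum(i_lower, keep="negative")   # Eq. (6)
    return 2.0 * (u_plus + u_minus)        # the factor 2 removes the 1/2 of Eq. (7)
```

With these definitions, dsb_reconstruction(i_upper, i_lower) returns an array proportional to U_o(x, y), which can then be propagated numerically to any plane of interest.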

To implement the DSB filtering described above, we have proposed the optical architecture shown in Fig. 2(a). This setup allows us to apply the two sideband filters simultaneously, which is a very interesting feature for dynamic applications. In this architecture, a laser beam, spatially filtered and then collimated by the convergent lens L1, is used as the light source. A linear polarizer LP1 is used to properly set the polarization angle of the incident beam. The object under study, represented by the planes P1 and P2, is placed after LP1. The Fourier plane is located at the back focal plane of a second convergent lens L2 [solid lines in Fig. 2(a)], which simultaneously images an intermediate plane Pint, situated between the P1 and P2 planes, onto two different CCD cameras [dashed lines in Fig. 2(a)]. A beam splitter (BS) is used to properly image Pint onto the CCD1 and CCD2 planes.

Fig. 2. (a) Optical architecture proposed to implement the technique based on double-sideband filters and (b) sketch of the optical arrangement used to implement the double-sideband filter.

Since one of the goals of the proposed method is to record holograms of dynamic processes, the DSB filtering must be performed simultaneously. Some suitable devices to perform this operation include phase plates, ferroelectric crystals (FECs), and liquid crystal displays (LCDs). In this case, we used the optical device shown in Fig. 2(b), which is formed by a parallel aligned liquid crystal display (PA-LCD) in combination with linear polarizers.

Next we describe the device performance. The linear polarizer LP1 was set to illuminate the PA-LCD with a linearly polarized beam at 45° with respect to the laboratory vertical. The PA-LCD can be modeled as a linear variable phase plate oriented at 0° whose retardance depends on the addressed voltage [17]. The voltage amplitude and its spatial distribution are selected in such a way that one half of the display has a retardance δ1 = 0° and the other half δ2 = 180° [see Fig. 2(b)]. Thus, whereas the incident linear polarization is not modified when it passes through the first half of the PA-LCD (δ1 = 0°), the light passing through the second half (δ2 = 180°) is rotated by 90°, so its linear polarization becomes oriented at 135° [Fig. 2(b)]. Afterward, the two copies of the wavefront produced by the BS are projected onto the linear analyzers LP2 and LP3, respectively. These two analyzers are orthogonally oriented to each other, namely, at 45° and 135°, respectively. In this configuration, the linear analyzers LP2 and LP3 act as upper and lower sideband filters, and thus the intensities registered at the CCD1 and CCD2 cameras correspond to those described in Eqs. (5) and (6), respectively. Therefore, these complementary intensity images are acquired simultaneously. By computing Eq. (7), the full complex amplitude without the contribution of the unwanted conjugate waves is obtained in real time.
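The polarization argument above can be checked with a short Jones-calculus sketch. The matrices below model an ideal linear polarizer and the PA-LCD as an ideal variable phase plate in transmission (the actual device operates in reflection); the function names and this simplified model are our own assumptions, introduced only for illustration.

```python
import numpy as np

def linear_polarizer(theta):
    """Jones matrix of an ideal linear polarizer with its transmission axis at angle theta (rad)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s],
                     [c * s, s * s]])

def pa_lcd(delta):
    """Simplified Jones matrix of the parallel-aligned cell: a variable phase
    plate oriented at 0 deg with retardance delta (rad)."""
    return np.array([[np.exp(-1j * delta / 2), 0.0],
                     [0.0, np.exp(1j * delta / 2)]])

e_in = np.array([1.0, 1.0]) / np.sqrt(2)          # beam after LP1: linear polarization at 45 deg

for label, delta in [("delta = 0 half", 0.0), ("delta = 180 deg half", np.pi)]:
    e_out = pa_lcd(delta) @ e_in
    t_lp2 = np.linalg.norm(linear_polarizer(np.deg2rad(45.0)) @ e_out) ** 2    # analyzer at 45 deg
    t_lp3 = np.linalg.norm(linear_polarizer(np.deg2rad(135.0)) @ e_out) ** 2   # analyzer at 135 deg
    print(f"{label}: transmission through LP2 = {t_lp2:.2f}, through LP3 = {t_lp3:.2f}")
```

The loop prints unit transmission through LP2 for the δ1 = 0° half and through LP3 for the δ2 = 180° half, so each analyzer passes light coming from only one half of the Fourier plane, which is how CCD1 and CCD2 record the upper and lower sideband holograms simultaneously.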

The optical architecture proposed in Fig. 2(a) was experimentally implemented as shown in Fig. 3. A linearly polarized laser (Research Electro-Optics, Model R-30995, wavelength 633 nm, output power 17 mW) was used as the light source. The laser beam was expanded and filtered by a spatial filter. Lenses L1 (f = 250 mm, D = 50 mm) and L2 (f = 300 mm, D = 50 mm) were used to collimate the beam and to image the plane Pint onto the CCDs, respectively. At the focal plane of L2 we placed, as spatial light modulator, a parallel-aligned electrically controlled birefringence LCoS display distributed by HOLOEYE, so the setup is adapted to operate in reflection. The display, named PLUTO, is an active-matrix reflective device with 1920 × 1080 pixels and a 0.7″ diagonal. The pixel pitch is 8.0 μm and the display has a fill factor of 87%. We used CCD cameras (Basler, Model piA1000-60gm) with a Truesense Imaging KAI-1020 CCD sensor, which delivers 60 frames per second at 1 Mpixel resolution.

Fig. 3. Experimental setup.

The proposed technique and setup were tested with an object composed of a fixed micrometric reticle, placed at P1, and a thin glass plate with 100 μm microspheres randomly distributed on one of its faces. The thin glass is mounted on a rotation stage placed at the P2 plane, and in this way it constitutes the dynamic object. The distance between both planes is 100 mm.

The plane Pint, set between the diffracting microspheres and the reticle, is located 50 mm from P2 and 690 mm (object distance s) from L2. CCD1 and CCD2 image the plane Pint, simultaneously recording complementary holograms (Fig. 4). Note that the spots corresponding to defocused particles are deformed because of the single-sideband filter applied to obtain each hologram [upper and lower sidebands, Figs. 4(a) and 4(b), respectively].

Fig. 4. (a) Upper and (b) lower sideband holograms acquired by the CCD cameras.

The digital Fourier transform of each hologram is computed and the corresponding half of the frequency plane is blocked, as detailed in the mathematical description above. After that, the inverse Fourier transform is applied to obtain the complex amplitudes. By adding both distributions, the compensated complex amplitude U_o(x,y) of the object is obtained.

Fig. 5. Reconstruction of the focused image of (a) the micrometric reticle (z = 30 mm) and (b) the microspheres (z = 31 mm).

Next, the complex amplitude U_o(x,y) was digitally propagated along the z axis using the Rayleigh–Sommerfeld diffraction integral. In this case, z = 0 mm corresponds to the reconstruction of Pint at the image plane. The reconstruction of the micrometric reticle image in Fig. 5(a) shows that its conjugate image is removed and that the microspheres are defocused. Analogously, when the particles are focused, the conjugate image is not visible and the reticle is now defocused [Fig. 5(b)]. Unlike the holograms shown in Fig. 4, the images displayed in Fig. 5 are free of distortions because of the double-sideband filter used.
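For completeness, the sketch below shows one common numerical way to carry out this refocusing step, propagating the recovered complex amplitude with the angular-spectrum transfer function, a standard FFT-based form of the Rayleigh–Sommerfeld solution [16]. The function name and the sampling values in the commented usage lines are illustrative assumptions, not the experimental parameters.

```python
import numpy as np

def angular_spectrum_propagate(u0, z, wavelength, pitch):
    """Propagate the sampled complex field u0 by a distance z (same units as
    wavelength and the pixel pitch) using the angular-spectrum transfer function."""
    ny, nx = u0.shape
    fx = np.fft.fftfreq(nx, d=pitch)
    fy = np.fft.fftfreq(ny, d=pitch)
    fxx, fyy = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * fxx) ** 2 - (wavelength * fyy) ** 2
    kz = (2.0 * np.pi / wavelength) * np.sqrt(np.maximum(arg, 0.0))
    h = np.exp(1j * kz * z) * (arg > 0)      # evanescent components are discarded
    return np.fft.ifft2(np.fft.fft2(u0) * h)

# Illustrative refocusing scan (placeholder sampling values, in meters):
# u_o = dsb_reconstruction(i_upper, i_lower)
# for z in np.linspace(-0.04, 0.04, 81):
#     frame = np.abs(angular_spectrum_propagate(u_o, z, 633e-9, 7.4e-6)) ** 2
```

A negative z simply back-propagates the field, so the same routine scans both sides of the Pint image plane.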

Fig. 6. Reconstruction of a holographic movie of a dynamic object (see Visualization 1).

The movie in Fig. 6 shows the capability of the method and the setup to record holograms of dynamic objects. The first part illustrates a refocusing process only (i.e., for a fixed time, a scan along the z axis from Pint to the microsphere plane is performed). Next, for a fixed z distance, the focused microspheres are rotated in the xy plane. Finally, the refocusing process is applied to scan the z axis in the reverse direction while the microspheres are in motion. The refocusing process ends when the micrometric reticle is focused.

For a proper experimental implementation of the technique, an accurate alignment of the optical elements in the setup, with special attention to the two CCD cameras, must be ensured. This is because simultaneous intensity images of the same object region, with the same magnification, must be recorded. This is achieved by using a single convergent lens [L2 in Fig. 2(a)] properly placed before the beam-splitter element. To ensure that both cameras are properly set in the setup, we followed an experimental calibration strategy (e.g., using a test object with specific marks to guide the pixel-to-pixel alignment), which allowed us to place the CCDs at the best focal plane and to image the same object region. Misalignments between the CCD cameras will, to a certain extent, degrade the quality of the final reconstructed hologram. In addition, slight differences between the sensors (which may occur in industrial fabrication processes) may also influence the results. In spite of this, as shown by the experimental tests conducted, an accurate alignment of the system makes the proposed setup robust enough to provide satisfactory results.
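As a complement to the calibration with a marked test object, the residual pixel shift between the two cameras could also be estimated numerically. The phase-correlation sketch below is our illustrative addition, under the assumption of a purely translational misalignment, and is not the calibration procedure reported in this Letter.

```python
import numpy as np

def estimate_shift(img_a, img_b):
    """Estimate the integer-pixel translation between two images of the same
    calibration target by phase correlation (normalized cross-correlation in Fourier space)."""
    fa = np.fft.fft2(img_a)
    fb = np.fft.fft2(img_b)
    cross_power = fa * np.conj(fb)
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.abs(np.fft.ifft2(cross_power))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap the peak coordinates to signed shifts.
    shifts = tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))
    return shifts          # (dy, dx)
```

The returned (dy, dx) pair can be applied to one of the two holograms, e.g., with np.roll, before the sideband combination.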

To summarize, this Letter presents what we believe is a new IL holographic technique based on double-sideband filtering. It allows the distortion-free removal of spurious conjugate images and is suitable for capturing dynamic processes. After providing the mathematical foundations of the double-sideband-based technique, we have presented an optical architecture able to perform the filtering of negative and positive frequencies simultaneously. In addition, an experimental implementation was conducted, in which the double-sideband filtering is achieved using a PA-LCD combined with properly oriented linear polarizers. Finally, the proposed method and the optical setup were tested with a dynamic object. A movie shows the ability of the technique to scan, along the optical axis, objects placed at different planes, an operation that can be performed for each frame of the movie. This feature makes the technique interesting for tracking microscopic objects at frame rates limited only by the speed of the detection device.

Funding

ANPCYT PICT (2014-2432); Catalan Government (SGR 2014-1639); CONACyT (207633, 250850); Fondos FEDER; Spanish MINECO (FIS2012-39158-C02-01); UBACyT (20020130100727BA).

REFERENCES

1. T. Kreis, Handbook of Holographic Interferometry (Wiley-VCH Verlag GmbH, 2005), pp. 81–92.

2. B. M. Hennelly, D. P. Kelly, N. Pandey, and D. Monaghan, in Proceedings of the China–Ireland Information and Communications Technologies Conference (National University of Ireland Maynooth, 2009), pp. 241–245.

3. U. Schnars and W. P. O. Jüptner, Meas. Sci. Technol. 13, R85 (2002).

4. S. Grilli, P. Ferraro, S. De Nicola, A. Finizio, G. Pierattini, and R. Meucci, Opt. Express 9, 294 (2001).

5. E. N. Leith and J. Upatnieks, J. Opt. Soc. Am. 55, 569 (1965).

6. D. Gabor, Nature 161, 777 (1948).

7. E. N. Leith and J. Upatnieks, J. Opt. Soc. Am. 52, 1123 (1962).

8. E. Cuche, P. Marquet, and C. Depeursinge, Appl. Opt. 39, 4070 (2000).

9. G. Pedrini, P. Fröning, H. Fessler, and H. J. Tiziani, Appl. Opt. 37, 6262 (1998).

10. J. B. DeVelis, G. B. Parrent, Jr., and B. J. Thompson, J. Opt. Soc. Am. 56, 423 (1966).

11. I. Yamaguchi and T. Zhang, Opt. Lett. 22, 1268 (1997).

12. O. Bryngdahl and A. Lohmann, J. Opt. Soc. Am. 58, 620 (1968).

13. T. Mishina, F. Okano, and I. Yuyama, Appl. Opt. 38, 3703 (1999).

14. Y. Takaki and Y. Tanemoto, Appl. Opt. 48, H64 (2009).

15. V. Palero, J. Lobera, N. Andres, and M. P. Arroyo, Opt. Lett. 39, 3356 (2014).

16. G. Pedrini, W. Osten, and Y. Zhang, Opt. Lett. 30, 833 (2005).

17. A. Lizana, A. Marquez, L. Lobato, Y. Rodange, I. Moreno, C. Iemmi, and J. Campos, Opt. Express 18, 10581 (2010).

Supplementary Material (1)

Visualization 1: AVI (14602 KB). Reconstruction of a holographic movie of a dynamic object.
