Near-field ptychographic microscope for quantitative phase imaging

Abstract

Quantitative phase imaging (QPI) is the name given to a set of microscopy techniques that map out variations in optical path lengths across a sample. These maps are a useful source of contrast for transparent samples such as biological cells, and because they are quantitative they can be used to measure refractive index and thickness variations. Here we detail the setup and operation of a new form of QPI microscope based on near-field ptychography. We test our system using a range of phase objects, and analyse the phase images it produces. Our results show that accurate, high quality images can be obtained from a ptychographical dataset containing as few as four near-field diffraction patterns. We also assess how our system copes with optically thick samples and samples with a wide range of spatial frequencies – two areas where conventional and Fourier ptychography struggle.

Published by The Optical Society under the terms of the Creative Commons Attribution 4.0 License. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.

1. Introduction

Many samples observed under a microscope have low intrinsic contrast: they do not absorb or scatter light sufficiently for direct observation. Biological cells are a key example, since they are almost invisible unless stained or otherwise enhanced using fluorescent dyes. One way to generate contrast from these near-transparent samples without such labels is quantitative phase imaging (QPI), which accurately maps out the relatively large variations in optical path length a sample imparts to a beam of illumination. These quantitative phase maps are especially useful because optical path length relates straightforwardly to several important sample properties, for example thickness and refractive index variations, or, for cells, the cell dry mass [1].

There are several ways to produce quantitative phase images [2]. Examples include digital holography [3], ‘SLIM’ [4], lateral shearing interferometry [5], through-focal series [6] and ptychography, in its original [7] or its Fourier [8] form. The ptychographic techniques have shown great promise in obtaining high-accuracy, low-noise phase images with easily implemented experimental setups, but data collection is time-consuming, especially when covering a large field of view at high resolution, and applicability is limited to optically thin specimens [9, 10]. Although it can be done more quickly than the conventional approach [11], Fourier ptychography is also prone to low spatial frequency artefacts [12]. Here, we adapt near-field ptychography from the X-ray regime [13, 14] for implementation on an optical microscope and assess the accuracy and noise levels of the resulting phase images. We show that a large field-of-view can be reconstructed from as few as four near-field diffraction patterns, recorded at different lateral offsets of the sample, and that our approach is suited to optically thin or thick samples. This makes our method quick in terms of both data collection and image reconstruction time. Our results also serve as a proof of principle for an analogous implementation on an electron microscope, where most objects are optically thick.

2. Experimental setup

The concept of our near-field ptychographic microscope is simple, and has the advantage that it can be added on to a standard optical microscope. With reference to Fig. 1, it works as follows: a collimated laser illuminates more than the required field-of-view of a sample; a standard microscope forms a magnified image of the sample at the plane of a weak diffuser; the exit wave from the diffuser propagates a short distance onto a detector, which records the resulting near-field diffraction pattern; further patterns are recorded at a series of small lateral offsets of the sample; finally, iterative algorithms process the diffraction data to produce the quantitative phase map.

Fig. 1 The experimental setup of our near-field ptychographic microscope. An expanded 675nm laser beam illuminates the sample, which is mounted on a mechanical x–y stage and moves independently of the rest of the components. A weak diffuser of transparent adhesive tape is placed in the image plane of a standard optical microscope, and a CCD placed 5cm downstream captures the resulting near-field diffraction patterns. A secondary CCD aids sample focussing.

Setups similar to that shown in Fig. 1 are currently in use, for example for live cell monitoring [15], imaging of anisotropic materials [16], and electron phase imaging [17]. However, these examples operate in the far-field, or in the far near-field (i.e. at relatively low Fresnel numbers), and use an aperture where we use a diffuser. Using an aperture curtails the field-of-view covered by each diffraction pattern, and results in dark- and bright-field areas in the diffraction patterns that are difficult to record simultaneously in a single exposure. By using a diffuser and operating in the near-field, we can cover an equivalent field-of-view using far fewer diffraction pattern recordings, with each diffraction pattern having a narrower, more easily captured dynamic range.

In our microscope, the illumination is formed by a collimated 675nm fibre-coupled laser and the sample is mounted on a Newport XPS-Q4 x-y-z motorised stage. An interchangeable objective lens and a fixed 18cm tube lens form the magnified image of the sample at the plane of the diffuser, which in our case is a piece of transparent adhesive tape. A Thorlabs 4070M-USB CCD (2048 × 2048 pixels on a 7.4μm pitch) is positioned 5cm downstream of the diffuser to capture the near-field diffraction patterns (giving a Fresnel number of 6.85 × 10³). We bin the CCD by a factor of two and capture each diffraction pattern with a single 200μs exposure. An example diffraction pattern from our system is shown in Fig. 2(a).
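For reference, the quoted Fresnel number is consistent with taking the full detector width, D = 2048 × 7.4μm ≈ 15.2mm, as the characteristic aperture dimension:

N_F = D²/(λz) = (15.2mm)² / (675nm × 5cm) ≈ 6.8 × 10³.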

Fig. 2 (a) An exemplar diffraction pattern observed using our system when a sample of red blood cells was mounted in the microscope. (b) A typical pseudo-random spiral pattern of x/y stage positions used in the collection of ptychographic data. Data is collected starting in the centre and moving out following the dotted line; the four red positions represent the minimum data collection needed to realise reconstructed images of reasonable quality.

A data set is captured from our setup by first focussing the sample image at the correct plane, which we accomplish using a secondary focussing CCD camera on a separate path created by a beam splitter. We then record diffraction data over a series of sample x/y positions, which, to avoid possible reconstruction artefacts, should not form a regular grid. Several such positioning strategies have been tested for ptychography (see for example [18]), but we have found that a spiral set of positions works well [19]. For the results below, we used positions such as those shown in Fig. 2(b), where small random offsets were added to a regular spiral to give a grid with an average separation of 15.8μm and nearest-neighbour separations ranging from a minimum of 7.94μm to a maximum of 23.2μm. The exact specification of the scan pattern is not in fact critical, and we have found our system to work well over a large range of x/y step sizes, grid patterns, and diffuser-camera distances (camera lengths).
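As an illustration of the scan geometry, the short sketch below generates jittered spiral positions of the kind shown in Fig. 2(b). The particular spiral form (a Fermat spiral sampled at the golden angle), the jitter amplitude, and the function name are illustrative choices rather than the exact parameters used in our experiments.

```python
import numpy as np

def jittered_spiral(n_points, step_um=15.8, jitter_um=3.0, seed=0):
    """Generate x/y stage positions (in micrometres) on an outward spiral
    with small random offsets, similar in spirit to Fig. 2(b). Illustrative
    sketch only; the exact spiral and jitter used in the paper differ."""
    rng = np.random.default_rng(seed)
    k = np.arange(n_points)
    theta = k * 2.399963229728653          # golden angle (rad) spaces points evenly
    r = step_um * np.sqrt(k)               # Fermat spiral: roughly uniform density
    x = r * np.cos(theta) + rng.uniform(-jitter_um, jitter_um, n_points)
    y = r * np.sin(theta) + rng.uniform(-jitter_um, jitter_um, n_points)
    return np.stack([x, y], axis=1)

positions = jittered_spiral(100)           # e.g. a 100-position dataset
```

Any comparable quasi-random pattern with a similar average separation should serve equally well, given the tolerance to scan parameters noted above.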

3. Image reconstruction

Once diffraction patterns are collected, we use the ePIE algorithm [20] with position correction [21] and background removal [22] to reconstruct the complex-valued image of the sample. As with other types of ptychography, the structure of the diffuser – which equates to the ‘probe’ in conventional ptychography – is recovered simultaneously with the sample. Importantly, in this configuration the recovered probe includes any inhomogeneities in the illumination, as well as the structure of the diffuser; as a result, our final images appear as though perfectly illuminated by a plane wave.

We initialise ePIE with rough estimates (typically matrices of ones) for the magnified sample image incident on the diffuser, s(x, y), and the diffuser transmissivity p(x, y), where (x, y) denotes discrete coordinates in the microscope image plane on the 14.8μm pitch of the binned CCD. An estimate of the scattered wavefront leaving the diffuser, ψ_i(x, y), during recording of the ith diffraction pattern is generated from these sample and diffuser estimates according to Eq. (1):

\psi_i(x, y) = p(x, y)\, s(x - M x_i,\; y - M y_i),
where (x_i, y_i) is the ith lateral offset of the specimen and M is the magnification at the plane of the diffraction-capture CCD. This is not the same as the magnification of the objective lens, since the CCD is offset from the microscope image plane. Rather, it is a measured quantity that we ascertain for our setup by removing the diffuser, replacing the sample with a USAF 1951 test chart, and measuring the feature sizes on the resulting magnified image recorded by the diffraction-capture CCD.
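In practice the stage offsets enter the reconstruction as pixel shifts of the (larger) sample array relative to the diffuser array. The sketch below illustrates Eq. (1) under the simplifying assumption that the magnified offsets are rounded to whole pixels of the binned CCD; sub-pixel interpolation is neglected, the sign of the shift depends on the stage convention, and the function name and interface are ours for illustration.

```python
import numpy as np

def exit_wave(sample, diffuser, offset_um, M, pixel_um=14.8):
    """Eq. (1): psi_i = p(x, y) * s(x - M*x_i, y - M*y_i).
    `sample` is a complex array larger than `diffuser`; the stage offset
    (x_i, y_i), in micrometres, is magnified by M and rounded to whole
    pixels on the binned CCD pitch. Illustrative sketch only."""
    dx, dy = np.round(M * np.asarray(offset_um) / pixel_um).astype(int)
    ny, nx = diffuser.shape
    y0 = (sample.shape[0] - ny) // 2 - dy
    x0 = (sample.shape[1] - nx) // 2 - dx
    return diffuser * sample[y0:y0 + ny, x0:x0 + nx]
```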

The exit wave ψ_i(x, y) is propagated the short distance to the detector plane using the angular spectrum propagator given by Eq. (2):

A_z\{\psi(x, y)\} = \mathcal{F}^{-1}\left\{ \mathcal{F}\{\psi(x, y)\} \times \exp\!\left[ 2 i \pi z \left( \lambda^{-2} - u^2 - v^2 \right)^{1/2} \right] \right\},
where (u, v) are discrete reciprocal space coordinates, ℱ is the discrete Fourier Transform operator, and λ the laser illumination wavelength. Because we scale the sample positions by the magnification, M, at the plane of the diffraction-capture CCD, the effective camera length, z, in Eq. (2) should also be scaled by the ratio of M to the magnification of the microscope objective used. This avoids a strong phase curvature appearing on the reconstruction of the diffuser [13]. For our 20× objective and with a measured distance between the diffuser and the CCD of 5cm we calculated a modified camera length of 5.7cm.
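Our reconstruction code is written in MATLAB; the following NumPy sketch is an illustrative equivalent of the propagator in Eq. (2), with evanescent components simply suppressed. The function name is ours.

```python
import numpy as np

def angular_spectrum(psi, z, wavelength, pixel_pitch):
    """Propagate a complex field by a distance z using the angular spectrum
    method of Eq. (2). All lengths in metres. Components with
    1/lambda^2 - u^2 - v^2 < 0 (evanescent) are set to zero."""
    ny, nx = psi.shape
    u = np.fft.fftfreq(nx, d=pixel_pitch)   # reciprocal-space coordinates
    v = np.fft.fftfreq(ny, d=pixel_pitch)
    uu, vv = np.meshgrid(u, v)
    kz = np.sqrt(np.maximum(1.0 / wavelength**2 - uu**2 - vv**2, 0.0))
    return np.fft.ifft2(np.fft.fft2(psi) * np.exp(2j * np.pi * z * kz))

# e.g. propagation over the modified camera length of 5.7cm on the 14.8um pitch:
# detector_wave = angular_spectrum(psi_i, 0.057, 675e-9, 14.8e-6)
```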

Once ψ_i(x, y) has been propagated to the detector, its modulus is replaced with the square root of the ith measured diffraction pattern. This corrected wavefront is propagated back to the image plane, with the reverse angular spectrum propagator, to obtain a revised exit wave, ψ′_i(x, y). Updated estimates of the sample image and the diffuser, denoted s′(x − Mx_i, y − My_i) and p′(x, y) respectively, are calculated according to the ePIE update functions [20] given in Eqs. (3):

s'(x - M x_i,\, y - M y_i) = s(x - M x_i,\, y - M y_i) + \alpha \frac{p^{*}(x, y)}{|p(x, y)|^{2}_{\max}} \left( \psi'_i(x, y) - \psi_i(x, y) \right),
p'(x, y) = p(x, y) + \beta \frac{s^{*}(x - M x_i,\, y - M y_i)}{|s(x, y)|^{2}_{\max}} \left( \psi'_i(x, y) - \psi_i(x, y) \right),
where * indicates complex conjugation. The next iteration takes these updated estimates and repeats the process using the (i + 1)th diffraction pattern measurement (usually the order in which diffraction patterns are considered is randomised). The algorithm continues by running through all the diffraction data in this manner a predetermined number of times.
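Combining Eqs. (1)–(3), a single ePIE sweep through the data can be sketched as follows. This is again an illustrative NumPy equivalent of our MATLAB code; the helper names are ours, the angular_spectrum function is the one sketched above, and the position correction [21] and background removal [22] steps are omitted.

```python
import numpy as np

def epie_sweep(sample, diffuser, patterns, offsets_px, z, wavelength,
               pixel_pitch, alpha=1.0, beta=1.0):
    """One sweep of ePIE over all measured diffraction patterns, following
    Eqs. (1)-(3). `offsets_px` holds the magnified sample shifts already
    rounded to whole pixels; `patterns` holds the measured intensities.
    The pattern order would normally be randomised each sweep."""
    ny, nx = diffuser.shape
    for intensity, (dx, dy) in zip(patterns, offsets_px):
        y0 = (sample.shape[0] - ny) // 2 - dy
        x0 = (sample.shape[1] - nx) // 2 - dx
        s_old = sample[y0:y0 + ny, x0:x0 + nx].copy()
        p_old = diffuser
        psi = p_old * s_old                                        # Eq. (1)
        Psi = angular_spectrum(psi, z, wavelength, pixel_pitch)    # Eq. (2)
        Psi = np.sqrt(intensity) * np.exp(1j * np.angle(Psi))      # modulus constraint
        psi_new = angular_spectrum(Psi, -z, wavelength, pixel_pitch)
        diff = psi_new - psi
        # Eq. (3): object and diffuser ('probe') updates.
        sample[y0:y0 + ny, x0:x0 + nx] = (
            s_old + alpha * np.conj(p_old) / np.abs(p_old).max() ** 2 * diff)
        diffuser = p_old + beta * np.conj(s_old) / np.abs(s_old).max() ** 2 * diff
    return sample, diffuser
```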

4. Results

To demonstrate the efficacy of our near-field ptychographic microscope, we used a range of samples: a mounted slide containing red blood cells, glass microspheres suspended in index-matching fluid, and a singlet optical lens. These samples respectively demonstrate the ability of our microscope to image biological specimens using a small number of data points, to image thick specimens, and to accurately image a wide range of spatial frequencies over a large field of view.

4.1. Red blood cells

To assess the performance of our system as the number of diffraction patterns was reduced, a prepared sample of frog’s red blood cells was used. For this test, we fitted a 20× objective lens with NA=0.40, whose magnification, M, at the plane of the diffraction-capture CCD was measured as 26.85×. 100 diffraction patterns were captured as described above with the blood sample in place.

The ptychographical dataset was processed using the ePIE algorithm, with update parameters α = β = 1, to give the phase image shown in Fig. 3(a). The algorithm also produced the complex-valued reconstruction of the diffuser shown in Fig. 3(b), which was used as an initial estimate in all our subsequent tests. This calibration step reduces the iterations required for the algorithm to converge and, as the number of diffraction patterns is reduced, aids the phase retrieval process in avoiding stagnation.

Fig. 3 Phase reconstructions of frog’s red blood cells. For each subfigure (a), (c–g), the image on the left is the full field-of-view (Scale bar = 50μm), and the image on the right is the zoomed-in portion indicated by the boxes (Scale bar = 20μm). (h) shows cross-sections through a single red blood cell, taken along the lines indicated in each of subfigures (a), (c–g).

We next collected a second dataset of only 25 diffraction patterns and repeated the reconstruction, this time using the diffuser profile reconstructed in the first experiment as an initial estimate of p(x, y). Since we expect the probe to change very little between experiments, we reduced the step size in the probe update function of Eqs. (3) to β = 10⁻⁴. ePIE this time generated the phase image shown in Fig. 3(c). We then discarded a further five diffraction patterns from the data set and repeated the reconstruction, giving the phase image in Fig. 3(d). Discarding more and more of the diffraction patterns resulted in Figs. 3(e)–3(g). Visually, the image quality is good down to six patterns; substructures within the cells can be observed and the background is smooth. Below six diffraction patterns, the structures of the cells themselves can still be observed clearly, but the background starts to become noisy. Below four diffraction patterns, the reconstruction becomes poorly conditioned and swamped by noise. Despite the increasing background noise, the phase accuracy of the reconstructions remains consistent, even for the four diffraction pattern case. To demonstrate this, Fig. 3(h) shows cross-sections through a single cell, taken along the line shown in the zoomed-in reconstructions in Figs. 3(a) and 3(c)–3(g).

In terms of data collection time, using unoptimised code running through the MATLAB image acquisition toolbox it takes approximately 0.1s to move our specimen and capture a diffraction pattern at 200μs exposure. The total time simply multiplies this figure by the number of diffraction patterns, giving a current usable maximum data collection rate of ∼ 2Hz. Data reconstruction is extremely fast when the good initial diffuser estimate from a calibration run is used to seed ePIE. For the case of four diffraction patterns, using the MATLAB parallel processing toolbox and an Nvidia GeForce 1080 GPU, ePIE converges within ten to twenty iterations, which take between 0.7 and 1.2s (including position correction and background compensation). Our current work aims to obtain sub-second combined acquisition/reconstruction times through further software optimisation, hardware triggering, and parallel data collection and reconstruction.

One of the advantages of using near-field ptychography with a standard microscope platform is that it is very simple to change the magnification. To demonstrate this, the same red blood cell sample was used, and the 20× objective lens was replaced by a 4× lens of NA=0.10 (whose magnification at the diffraction-capture CCD was calibrated as 5.55×). In this case, we collected ten diffraction patterns (taking ∼ 1s) using the same scan positions as in the previous experiments. An identical reconstruction process, using the diffuser reconstruction from the previous experiments to seed the algorithm, generated the image in Fig. 4. (Note that not all the cells are in focus in this large field-of-view.)

Fig. 4 The same frog’s red blood cells from Fig. 3, using an objective lens with 4× magnification. 10 diffraction patterns, captured in 1s, were used to obtain this image. (Scale bar = 200μm).

4.2. Glass microspheres

The sample thickness that can be handled by both ptychography and Fourier ptychography is limited to a region where the interaction of the sample with an illumination function is accurately modelled by a multiplication – the multiplicative approximation. Beyond this limit, conventional reconstruction algorithms fail and computationally expensive alternatives such as multi-slice ptychography must be employed [23, 24]. In our configuration, the equivalent interaction is between the image of the sample produced by the microscope and the optically thin diffuser, which can always be modelled by a multiplication as in Eq. (1). Consequently, thicker samples have no effect on the accuracy or success of the reconstruction process (although they may not be imaged entirely in focus). In this way our ptychographically-produced images are more like those of digital holography, and similarly can be propagated to different focal planes, processed to remove aberrations, and so forth.

To show the capability of our method for dealing with optically thick samples, we collected a dataset of 25 diffraction patterns, in the manner detailed above, using the 20×, NA=0.4 objective lens and, as a sample, 210μm-diameter glass microspheres (RI: 1.51–1.52) suspended in index-matching oil (RI: 1.5124). This sample is much thicker than the limit imposed on conventional ptychography, which in this case is approximately 30μm (accounting for the refractive index of the oil) [25]. This thickness limit is broadly in line with that of Fourier ptychography discussed in [10], although there the limit was found to depend on both the NA of the objective lens and the maximum incidence angle of the set of illumination plane waves, so it is difficult to equate directly with our setup.

To reconstruct the data collected from the microspheres, we found it advantageous to use ‘mPIE’ – the momentum-accelerated version of ePIE [26]. For samples such as this that include multiple phase wraps, the reconstructions from ePIE are prone to phase vortices (as has also been noted in the case of the Difference Map algorithm for X-ray near-field ptychography [27]), and mPIE’s ability to escape local minima reduces these artefacts substantially. We used the diffuser reconstruction from our red blood cell experiments to seed the algorithm, and substantially reduced the step sizes in the ePIE update functions, using values of α = 0.2 and β = 10⁻⁴. With reference to the mPIE paper, for the momentum acceleration we used a batch size equal to the number of diffraction patterns in our experiment (25) and η_obj = 0.99, η_prb = 0.
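For illustration, the snippet below shows the basic idea of a momentum step applied to the object estimate after a batch of updates; the exact batching, friction schedule and probe handling used by mPIE [26] differ in detail, so this should be read as a simplified sketch rather than the algorithm itself, and the names and the value of η are ours.

```python
import numpy as np

def momentum_step(sample, sample_prev, velocity, eta=0.9):
    """Gradient-with-momentum correction applied to the object estimate
    after a batch of update sweeps. Simplified illustration only; see the
    mPIE paper [26] for the full scheme."""
    velocity = eta * velocity + (sample - sample_prev)   # accumulate recent change
    return sample + eta * velocity, velocity
```

Here sample_prev is the object estimate at the start of the batch and velocity is carried over between batches, initialised to zero.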

Figure 5 shows the resulting unwrapped phase reconstruction of the microspheres. They are imaged with both an excellent spherical profile and detailed, high-resolution features on the spheres’ surfaces. The phase shift at the centre of the central sphere is 7.5rad, equating to a refractive index difference between the oil and the glass of 0.0038, well within the specified range. Dust and grease on the surfaces of the microscope slide are also reconstructed, although they are somewhat out of focus. Such features would cause artefacts in both conventional and Fourier ptychography.
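This figure follows directly from the phase accumulated along a path through the centre of a sphere of diameter d, assuming the full 210μm diameter contributes to the optical path:

Δn = φλ / (2πd) = (7.5 × 675nm) / (2π × 210μm) ≈ 0.0038.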

Fig. 5 210μm diameter glass microspheres suspended in index matching oil. This image was reconstructed from 100 diffraction patterns, captured with a 20× magnification, 0.4NA objective lens. (Scale bar = 50μm).

4.3. Singlet lens

As a final demonstration, we used as a sample a Thorlabs LA1131-A plano-convex lens, 1 inch in diameter with a 50mm focal length. This singlet lens was chosen because its exit-wave phase shift relates directly to the known lens curvature, allowing us to assess the accuracy of the reconstructed phase. The phase also increases in spatial frequency with distance from the lens centre, so this sample gives a good test of spatial frequency response.

The lens was secured to the x–y stage such that the centre of the lens was approximately centred on the optical path of the microscope. 100 diffraction patterns were captured using the experimental procedure described above, but the step size in the scan spiral was expanded by a factor of four to extend the field of view. Again, mPIE was used to reconstruct this data to avoid phase vortices associated with the large number of phase wraps, and the original diffuser reconstruction from the red blood cell experiment was used to seed the reconstruction.

A central cut-out from the resulting reconstruction, 3000 pixels in diameter, is shown in Fig. 6(a). The apparent centre of the lens is offset from the centre of the reconstructed field of view, due to a combination of a slight tilt and shift of the lens mount. Fig. 6(b) compares the expected spherical profile of the lens, calculated from its data sheet, with an unwrapped radial average of the reconstructed phase profile. The blue line shows the radial average of the ptychographical reconstruction of the lens and the blue shaded region shows its standard deviation. Given that the radial average includes any dust or grease on the surface of the lens, this shows excellent agreement between the measurements taken using near-field optical ptychography and the expected lens profile across the whole radial range.
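For completeness, the expected profile in Fig. 6(b) can be generated from the data-sheet focal length alone, as sketched below. The refractive index value for N-BK7 near 675nm and the plano-convex relation R = (n − 1)f are assumptions of this sketch rather than values quoted above.

```python
import numpy as np

# Expected phase profile of the plano-convex singlet, relative to its centre.
# Assumed values (not quoted in the text): n for N-BK7 near 675 nm, and the
# thin-lens relation R = (n - 1) * f for a plano-convex element.
wavelength = 675e-9                      # illumination wavelength, m
f = 50e-3                                # data-sheet focal length, m
n = 1.514                                # assumed refractive index of N-BK7
R = (n - 1) * f                          # radius of curvature of the curved face

r = np.linspace(0, 1.5e-3, 500)          # radial coordinate across the field, m
sag = R - np.sqrt(R**2 - r**2)           # sag of the curved surface at radius r
phase = -(2 * np.pi / wavelength) * (n - 1) * sag   # phase relative to r = 0
```

The reconstructed phase can be converted to an equivalent height change, as plotted in Fig. 6(b), by dividing by 2π(n − 1)/λ, again assuming the value of n above.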

Fig. 6 (a) A quantitative phase reconstruction of a singlet lens, focal length 50mm. (b) A comparison of the expected profile of the lens with the unwrapped reconstruction using near-field ptychography. The blue line is the radial average of the height change across the lens and the blue shaded region is the radial standard deviation. The orange line is the model for the lens, based on its focal length.

5. Conclusion

In this paper, we have demonstrated a versatile new quantitative phase microscope based on near-field ptychography. Through the use of three test objects (red blood cells, glass microspheres and a singlet lens), we have shown that our system generates high quality, accurate phase images. We have demonstrated that large fields-of-view can be obtained with significantly fewer diffraction patterns than conventional ptychography requires, and therefore with significantly higher frame rates and smaller reconstruction overheads. We have also shown that optically thick samples can be imaged using our method. This is because it relies upon the interference generated between a thin diffuser and an image of the sample, which always satisfies the multiplicative approximation built into ptychographic reconstruction algorithms. In contrast, other variants of ptychography rely on interference between the sample itself and a varying or structured illumination, which, as the sample thickens, quickly reaches the limits of validity of the multiplicative approximation.

Our technique is simple, and can be created as an ‘add-on’ for standard microscopes. We have found it robust to a range of parameter values (such as scan patterns and camera lengths) and tolerant to component misalignments. Our future work aims to maximise the speed of the system and apply a similar method to the electron microscope, where most samples are optically thick.

Funding

Engineering and Physical Sciences Research Council (EPSRC).

References

1. K. G. Phillips, S. L. Jacques, and O. J. T. McCarty, “Measurement of single cell refractive index, dry mass, volume, and density using a transillumination microscope,” Phys. Rev. Lett. 109, 118105 (2012).

2. G. Popescu, Quantitative Phase Imaging of Cells and Tissues (McGraw-Hill, 2011).

3. Y.-L. Lee, Y.-C. Lin, H.-Y. Tu, and C.-J. Cheng, “Phase measurement accuracy in digital holographic microscopy using a wavelength-stabilized laser diode,” J. Opt. 15, 025403 (2013).

4. Z. Wang, L. Millet, M. Mir, H. Ding, S. Unarunotai, J. Rogers, M. U. Gillette, and G. Popescu, “Spatial light interference microscopy (SLIM),” Opt. Express 19, 1016–1026 (2011).

5. P. Bon, G. Maucort, B. Wattellier, and S. Monneret, “Quadriwave lateral shearing interferometry for quantitative phase microscopy of living cells,” Opt. Express 17, 13080–13094 (2009).

6. L. Waller, L. Tian, and G. Barbastathis, “Transport of intensity phase-amplitude imaging with higher order intensity derivatives,” Opt. Express 18, 12552–12561 (2010).

7. J. M. Rodenburg and H. M. L. Faulkner, “A phase retrieval algorithm for shifting illumination,” Appl. Phys. Lett. 85, 4795–4797 (2004).

8. G. Zheng, R. Horstmeyer, and C. Yang, “Wide-field, high-resolution Fourier ptychographic microscopy,” Nat. Photonics 7, 739–745 (2013).

9. P. Thibault, M. Dierolf, A. Menzel, O. Bunk, C. David, and F. Pfeiffer, “High-resolution scanning X-ray diffraction microscopy,” Science 321, 379–382 (2008).

10. L. Tian, Z. Liu, L.-H. Yeh, M. Chen, J. Zhong, and L. Waller, “Computational illumination for high-speed in vitro Fourier ptychographic microscopy,” Optica 2, 904–911 (2015).

11. L. Tian, X. Li, K. Ramchandran, and L. Waller, “Multiplexed coded illumination for Fourier ptychography with an LED array microscope,” Biomed. Opt. Express 5, 2376–2389 (2014).

12. J. Sun, C. Zuo, J. Zhang, Y. Fan, and Q. Chen, “High-speed Fourier ptychographic microscopy based on programmable annular illuminations,” Sci. Rep. 8, 7669 (2018).

13. M. Stockmar, P. Cloetens, I. Zanette, B. Enders, M. Dierolf, F. Pfeiffer, and P. Thibault, “Near-field ptychography: phase retrieval for inline holography using a structured illumination,” Sci. Rep. 3, 1927 (2013).

14. A.-L. Robisch, K. Kröger, A. Rack, and T. Salditt, “Near-field ptychography using lateral and longitudinal shifts,” New J. Phys. 17, 073033 (2015).

15. J. Marrison, L. Räty, P. Marriott, and P. O’Toole, “Ptychography – a label free, high-contrast imaging technique for live cells using quantitative phase information,” Sci. Rep. 3, 2369 (2013).

16. P. Ferrand, A. Baroni, M. Allain, and V. Chamard, “Quantitative imaging of anisotropic material properties with vectorial ptychography,” Opt. Lett. 43, 763–766 (2018).

17. A. M. Maiden, M. C. Sarahan, M. D. Stagg, S. M. Schramm, and M. J. Humphry, “Quantitative electron phase imaging with high sensitivity and an unlimited field of view,” Sci. Rep. 5, 14690 (2015).

18. R. M. Clare, M. Stockmar, M. Dierolf, I. Zanette, and F. Pfeiffer, “Characterization of near-field ptychography,” Opt. Express 23, 19728–19742 (2015).

19. X. Huang, H. Yan, R. Harder, Y. Hwu, I. K. Robinson, and Y. S. Chu, “Optimization of overlap uniformness for ptychography,” Opt. Express 22, 12634–12644 (2014).

20. A. M. Maiden and J. M. Rodenburg, “An improved ptychographical phase retrieval algorithm for diffractive imaging,” Ultramicroscopy 109, 1256–1262 (2009).

21. A. Maiden, M. Humphry, M. Sarahan, B. Kraus, and J. Rodenburg, “An annealing algorithm to correct positioning errors in ptychography,” Ultramicroscopy 120, 64–72 (2012).

22. S. McDermott, P. Li, G. Williams, and A. Maiden, “Characterizing a spatial light modulator using ptychography,” Opt. Lett. 42, 371–374 (2017).

23. P. Li and A. Maiden, “Optical ptychography with extended depth of field,” J. Phys.: Conf. Ser. 902, 012015 (2017).

24. L. Tian and L. Waller, “3D intensity and phase imaging from light field measurements in an LED array microscope,” Optica 2, 104–111 (2015).

25. E. H. R. Tsai, I. Usov, A. Diaz, A. Menzel, and M. Guizar-Sicairos, “X-ray ptychography with extended depth of field,” Opt. Express 24, 29089–29108 (2016).

26. A. Maiden, D. Johnson, and P. Li, “Further improvements to the ptychographical iterative engine,” Optica 4, 736–745 (2017).

27. M. Stockmar, I. Zanette, M. Dierolf, B. Enders, R. Clare, F. Pfeiffer, P. Cloetens, A. Bonnin, and P. Thibault, “X-ray near-field ptychography for optically thick specimens,” Phys. Rev. Appl. 3, 014005 (2015).
