Abstract
The widely used lenslet-bound definition of the Shack-Hartmann wavefront sensor (SHWS) dynamic range is based on the permanent association between groups of pixels and individual lenslets. Here, we formalize an alternative definition that we term optical dynamic range, based on avoiding the overlap of lenslet images. The comparison of both definitions for Zernike polynomials up to the third order plus spherical aberration shows that the optical dynamic range is larger by a factor proportional to the number of lenslets across the SHWS pupil. Finally, a pre-centroiding algorithm to facilitate lenslet image location in the presence of defocus and astigmatism is proposed. This approach, based on the SHWS image periodicity, is demonstrated using optometric lenses that translate lenslet images outside the projected lenslet boundaries.
© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
1. Introduction
The Shack-Hartmann wavefront sensor (SHWS) [1,2] is widely used in science [3–10], industry [11–13] and medicine [14–17]. This device samples wavefronts by estimating the centroids of images formed by an array of lenslets onto a pixelated detector. When the pixelated sensor is at the geometrical back focal plane [18] of a uniformly-illuminated lenslet [19], the displacement of a lenslet image centroid ${\boldsymbol \rho }$ from a reference position is proportional to the gradient of the wavefront $W({\boldsymbol r} )$ averaged over the lenslet [20],
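In display form (a reconstruction of Eq. (1) consistent with the definitions below, with the ratio of double integrals expressing the average over the lenslet area):

$${\boldsymbol \rho } = {f_l}\,\frac{\iint \nabla W({\boldsymbol r} )\,{\textrm{d}^2}{\boldsymbol r}}{\iint {\textrm{d}^2}{\boldsymbol r}} \qquad (1)$$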
Here, ${f_l}$ is the lenslet focal length and the double integrals are evaluated over the lenslet area.
A critical design specification for SHWSs is the dynamic range for aberrations of interest. Despite its importance, there are different, and often misunderstood SHWS dynamic range definitions. Here we first examine the commonly used lenslet-bound definition, which is based on the permanent association between non-overlapping groups of pixels and the corresponding lenslets. The resulting modest dynamic range has motivated the development of numerous SHWS variations [21–29], often based on the sequential sampling of pupil regions [24,30–37], and with some being substantially different and/or more complex instruments [38–45]. By removing the permanent association between pixels and lenslets, which can be thought of as a software [46–48] or hardware limitation, we formalize a previously proposed definition based on avoiding the overlap of adjacent lenslet images [49–57]. Because this definition does not include the pixelated sensor, we refer to it as the SHWS optical dynamic range. This definition is followed by a comparison of both the lenslet-bound and optical dynamic ranges, first for an arbitrary wavefront, and then, for each Zernike polynomial up to the third order plus spherical aberration. Then, we derive the relation between the lenslet image lattice vectors for periodic SHWS lenslet arrays and the Zernike polynomial coefficients for defocus and astigmatism. Finally, we use this relation to demonstrate a pre-centroiding algorithm that facilitates the assignment of lenslet images to groups of pixels on which to centroid in order to obtain a precise wavefront estimation.
2. SHWS dynamic range based on wavefront first derivative
The pixelated sensors of the first SHWSs consisted of arrays of pixel groups, each wired to readout and processing electronics dedicated to a single lenslet [58]. This permanent association between pixels and lenslets led to a lenslet-bound definition of dynamic range in terms of the maximum wavefront slope that would shift a lenslet image to the edge of its pixel group, as depicted in Fig. 1 [13,23,25,38,45,59–63]. In this way, for a group of pixels with diameter ${D_l}$, the maximum wavefront slope averaged over a lenslet $|{{\theta_{\textrm{max}}}} |= \left|{\smallint\!\!\!\smallint \nabla W({\boldsymbol r} ){\textrm{d}^2}{\boldsymbol r}/\smallint\!\!\!\smallint {\textrm{d}^2}{\boldsymbol r}} \right|$ before the lenslet image of width ${D_i}$ reaches the edge of the pixel group is,
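In display form (a reconstruction of Eq. (2) from the quantities just defined, noting that the image displacement is ${f_l}$ times the slope and the image center can travel $({{D_l} - {D_i}})/2$ before its edge reaches the pixel group boundary):

$$|{{\theta_{\textrm{max}}}} |= \frac{{D_l} - {D_i}}{2{f_l}} \qquad (2)$$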
where it is assumed that the image position for a flat wavefront is at the center of the pixel group. These groups are often chosen to consist of all pixels within the projection of the lenslet boundary onto the pixelated sensor, in which case ${D_l}$ is the lenslet pitch, which coincides with its diameter for lenslets with 100% fill factor. The condition in Eq. (2) can be used to calculate the maximum measurable amplitude ${a_j}$ of a wavefront ${Z_j}({x,y} )$ over the SHWS pupil $\mathrm{\Omega }$, by solving
For hexagonal lenslets, this condition must be modified to consider the lenslet image reaching any of the six lenslet sides along the lines defined by $\alpha ={-} \pi /3$, 0 and $\pi /3$. Since the wavefront slope variation is often negligible across individual lenslets, we can generalize this definition by maximizing the directional wavefront derivative over a circle inscribed in the lenslet, that is,
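This maximization amounts to requiring that the largest magnitude of the gradient of ${a_j}{Z_j}$ over the pupil not exceed the Eq. (2) bound $({{D_l} - {D_i}})/({2{f_l}})$. A minimal numerical sketch of this calculation, with placeholder dimensions (not taken from the paper) and the factor $2/{D_p}$ converting normalized-coordinate gradients to physical slopes:

```python
import numpy as np

def lenslet_bound_amplitude(zernike_grad, D_p, D_l, D_i, f_l, n=513):
    """Largest Zernike amplitude a_j measurable under the lenslet-bound
    definition: the maximum wavefront slope over the pupil must not
    exceed the Eq. (2) bound (D_l - D_i) / (2 f_l).
    zernike_grad: callable (x, y) -> (dZ/dx, dZ/dy), normalized coordinates."""
    theta_max = (D_l - D_i) / (2.0 * f_l)
    x = np.linspace(-1.0, 1.0, n)           # odd n so the pupil edge (1, 0) is sampled
    xx, yy = np.meshgrid(x, x)
    inside = xx**2 + yy**2 <= 1.0           # normalized (unit-radius) pupil
    gx, gy = zernike_grad(xx, yy)
    slope = np.hypot(gx, gy)[inside].max()  # max |grad Z_j| over the pupil
    # physical slope = a_j * (2 / D_p) * |grad Z_j|; solve for a_j
    return theta_max * D_p / (2.0 * slope)

# OSA defocus, Z_4 = sqrt(3) * (2x^2 + 2y^2 - 1), with its analytic gradient
grad_z4 = lambda x, y: (4.0 * np.sqrt(3.0) * x, 4.0 * np.sqrt(3.0) * y)

# placeholder dimensions: 6 mm pupil, 300 um pitch, 100 um image, 7.6 mm focal length
a4_max = lenslet_bound_amplitude(grad_z4, D_p=6e-3, D_l=300e-6, D_i=100e-6, f_l=7.6e-3)
```

For polygonal lenslets the same routine applies with the maximization restricted to the relevant directions $\alpha$, per the hexagonal case noted above.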
3. SHWS dynamic range based on wavefront second derivative
Most SHWSs capture all the lenslet images with a single two-dimensional array of pixels, and thus, are not limited by the permanent association between pixels and lenslets. This allows for a dynamic range definition based on avoiding the partial overlap of lenslet images, irrespective of their position on the pixelated sensor. Here, we formalize this idea, independently suggested by multiple authors [49–57], to which we refer as the optical dynamic range.
Let us start by assuming two identical adjacent lenslets, indexed $k$ and $k + 1$, which without loss of generality have their centers $({{x_k},{y_k}})$ along a line parallel to the x-axis (i.e., ${y_{k + 1}} = {y_k}$) and ${x_{k + 1}} > {x_k}$. As depicted in Fig. 2, for such images not to overlap, the following condition between the angular displacements ${\theta _{x,k}}$ and ${\theta _{x,k + 1}}$ must be met
with ${d_l}$ being the distance between the lenslets, which coincides with the lenslet diameter ${D_l}$ if the lenslet fill factor is 100%. If the lenslets are small relative to the period of the maximum spatial frequency of a wavefront $W({x,y} )$ with x and y normalized by the SHWS pupil radius (${D_p}/2$), we can approximate the angular difference in terms of the wavefront second derivative as follows (see Appendix A).

Therefore, the maximum positive measurable amplitude $b_j^ + $ of a wavefront described by function ${Z_j}({x,y} )$ that avoids lenslet image overlap along the $x$-direction can be calculated by solving
When the lenslet image is smaller than the lenslet separation (i.e., ${d_l} > {D_i}$) and because $b_j^ + > 0$, this condition can only be met if the function's second derivative is negative somewhere within the pupil. Otherwise, the positive limit of the optical dynamic range $b_j^ +$ is $+\infty$, as is the case for defocus. In practice, of course, no SHWS can measure an infinitely large amount of positive defocus because the lenslet images will eventually reach the edge of the pixelated sensor. The same reasoning that led to Eq. (9), applied to a negative amplitude $b_j^ -$, yields
Again, when ${d_l} > {D_i}$, and because $b_j^ - < 0$, this condition has a solution if the wavefront second derivative is positive somewhere within the pupil. From these two conditions, it is important to note that it is possible to have different minimum and maximum measurable amplitudes for the same wavefront aberration, that is, the optical dynamic range can be asymmetric.
The conditions in Eqs. (9) and (10), however, only consider the $x$-axis direction. As Fig. 3 depicts for a square lenslet array, a lenslet image (green circle) could be shifted by an aberrated wavefront towards the images of adjacent or even distant lenslets (red circles) along the directions shown by the (orange) line segments. Using the second directional derivative $\nabla _\alpha ^2W$, defined as ${\nabla _\alpha }({{\nabla_\alpha }W} )$, we can generalize the condition for avoiding lenslet image overlap along the direction defined by the angle $\alpha $ as
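Since $\nabla _\alpha ^2W$ at a point is a Rayleigh quotient of the wavefront Hessian, its minimum over $\alpha$ is the smaller Hessian eigenvalue, which allows the positive limit $b_j^ +$ to be evaluated numerically. A sketch under that observation, restricted for simplicity to adjacent-lenslet overlap and using placeholder dimensions (not from the paper):

```python
import numpy as np

def optical_range_positive(zernike, D_p, d_l, D_i, f_l, n=257):
    """Positive optical dynamic range limit b_j^+: largest amplitude before
    adjacent lenslet images overlap, using the smaller eigenvalue of the
    wavefront Hessian as the minimum over alpha of the second directional
    derivative.  zernike: Z(x, y) on normalized pupil coordinates."""
    x = np.linspace(-1.0, 1.0, n)
    h = x[1] - x[0]
    xx, yy = np.meshgrid(x, x)
    z = zernike(xx, yy)
    zy, zx = np.gradient(z, h)        # axis 0 varies with y, axis 1 with x
    _, zxx = np.gradient(zx, h)
    zyy, zxy = np.gradient(zy, h)
    # evaluate on a slightly shrunken pupil to avoid one-sided-difference
    # artifacts at the grid border (conservative near the pupil edge)
    inside = xx**2 + yy**2 <= 0.95
    tr, det = zxx + zyy, zxx * zyy - zxy**2
    lam_min = 0.5 * (tr - np.sqrt(np.maximum(tr**2 - 4.0 * det, 0.0)))
    worst = lam_min[inside].min()     # most negative directional curvature
    if worst >= 0.0:
        return np.inf                 # e.g. positive defocus: images only separate
    return (d_l - D_i) * D_p**2 / (4.0 * f_l * d_l * abs(worst))

z4 = lambda x, y: np.sqrt(3.0) * (2.0 * x**2 + 2.0 * y**2 - 1.0)  # OSA defocus
# placeholder dimensions (not from the paper)
b4_pos = optical_range_positive(z4, D_p=6e-3, d_l=300e-6, D_i=100e-6, f_l=7.6e-3)
```

Swapping in vertical astigmatism ${Z_5} = \sqrt 6 ({{x^2} - {y^2}} )$ yields a finite $b_j^ +$, while positive defocus returns $+\infty$, consistent with the discussion above.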
4. Dynamic range definition comparison for low order Zernike polynomials
Let us now compare the two SHWS dynamic range definitions discussed above for the Zernike polynomials up to the third order and spherical aberration [64] through their ratios, noting that
For positive wavefront amplitudes, we have
Analytical evaluations of these ratios over a circular pupil are shown in Table 1 below, with the fourth column values calculated using Eq. (5), the fifth column using Eqs. (13) and (14), and the sixth column using Eqs. (16) and (17). As expected for tip and tilt, the second definition yields an infinite optical dynamic range because these aberrations shift all the lenslet images equally without changing their separation. Also, as mentioned earlier, defocus has an asymmetric optical dynamic range: infinite towards positive amplitudes, because these wavefronts separate the lenslet images, and finite towards negative amplitudes, because the lenslet images are brought closer together. In practice, the infinite ends of the SHWS optical dynamic range are truncated by either the finite size of the pixelated sensor or the increased size of the lenslet images. Third order aberrations have symmetric finite optical dynamic ranges, while spherical aberration has an asymmetric finite optical dynamic range.
5. SHWS lattice vectors in the presence of tip, tilt, defocus, and astigmatism
In order to take full advantage of the optical dynamic range, the SHWS image processing should include a pre-centroiding step, in which the coarse location of individual lenslet images is determined and assigned to the corresponding lenslets. This can be achieved by exploiting the fact that when the wavefronts are within the SHWS optical dynamic range, the lenslet images are monotonically sorted along the x- and y-axes, in most cases forming a two-dimensional square or hexagonal Bravais lattice. When this is the case, if a single lenslet image is found, then the other lenslet images can be coarsely located by moving across the pixelated sensor in integer combinations of the Bravais lattice vectors. Tip and tilt shift all lenslet images equally, thus preserving the lattice vectors. Defocus, vertical astigmatism, and oblique astigmatism, on the other hand, change the lattice vectors, as calculated next.
Let us start by defining the lenslet lattice vectors ${{\boldsymbol v}_1} = ({{v_{1,x}},{v_{1,y}}} )$ and ${{\boldsymbol v}_2} = ({{v_{2,x}},{v_{2,y}}} )$ that describe the SHWS lenslet image lattice resulting from a flat wavefront. The lattice vectors ${\boldsymbol v}_1^{\prime}$ and ${\boldsymbol v}_2^{\prime}$ for an aberrated wavefront in normalized pupil coordinates $W({2x/{D_p},2y/{D_p}} )$ can be approximated using the wavefront derivatives at the center of the ith lenslet $({{x_i},{y_i}} )$, instead of the value averaged over each lenslet, as
Let us now consider a wavefront, described by a polynomial of the form $A{x^2} + Bxy + C{y^2} + Dx + Ey + F$, as a linear combination of defocus, astigmatism, tip, tilt, and piston. The SHWS image lattice vectors that would result from such wavefront can be calculated using Eqs. (18) and (19) as
These new vectors do not depend on the lenslet coordinates $({{x_i},{y_i}} )$, and thus, the 2D Bravais lenslet image lattice is preserved. Now, substituting the OSA definition of the Zernike polynomials [64] into Eqs. (20) and (21) for oblique astigmatism with amplitude ${b_3}$, defocus with amplitude ${b_4}$, and vertical astigmatism with amplitude ${b_5}$, for an initial square lattice along the x- and y-axes, the lattice vectors of the SHWS image lattice are
If the lattice vectors are experimentally estimated (see Fig. 4), then the x- and y-components of these vector equations form a system of four linear equations with three unknown aberration amplitudes. These amplitudes can be calculated either analytically or numerically using linear algebra.
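The solve for the three amplitudes can be sketched as follows. The lattice map ${\boldsymbol v}^{\prime} = ({\textrm{I} + c\,\textrm{H}}){\boldsymbol v}$, with $c = 4{f_l}/D_p^2$ and Hessian $\textrm{H}$ of the quadratic wavefront in normalized coordinates, is our reconstruction consistent with Eqs. (18)–(21); the scale factor and sign conventions are assumptions, not taken verbatim from the paper's equations:

```python
import numpy as np

def aberrations_from_lattice(v1p, v2p, pitch, D_p, f_l):
    """Recover OSA amplitudes (b3, b4, b5): oblique astigmatism, defocus,
    vertical astigmatism, from measured image-lattice vectors v1', v2' of
    an initially square lattice v1 = (pitch, 0), v2 = (0, pitch).
    Assumed model: v' = (I + c H) v, c = 4 f_l / D_p^2, and
    H = [[2A, B], [B, 2C]] the wavefront Hessian in normalized coordinates."""
    c = 4.0 * f_l / D_p**2
    d1 = np.asarray(v1p, float) - np.array([pitch, 0.0])
    d2 = np.asarray(v2p, float) - np.array([0.0, pitch])
    A = d1[0] / (2.0 * c * pitch)
    B = 0.5 * (d1[1] + d2[0]) / (c * pitch)   # two estimates of B, averaged
    C = d2[1] / (2.0 * c * pitch)
    # invert A = 2*sqrt(3)*b4 + sqrt(6)*b5, B = 2*sqrt(6)*b3, C = 2*sqrt(3)*b4 - sqrt(6)*b5
    b3 = B / (2.0 * np.sqrt(6.0))
    b4 = (A + C) / (4.0 * np.sqrt(3.0))
    b5 = (A - C) / (2.0 * np.sqrt(6.0))
    return b3, b4, b5
```

The four scalar equations overdetermine the three unknowns; here the redundant estimate of the cross term $B$ is simply averaged, which is equivalent to an unweighted least-squares solution for this particular system.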
The linearity of the lattice vector change with defocus and astigmatism, together with the array theorem and the discrete Fourier transform (DFT) [65], can be used to estimate the amplitudes of defocus and astigmatism from a raw SHWS image as follows. First, we calculate the absolute value of the DFT of the SHWS image, and then we locate the two maxima nearest to the zero-frequency (DC) term that are not related by symmetry through the origin. The vector positions of these maxima relative to the zero spatial frequency (${\boldsymbol u}_1^{\prime}$ and ${\boldsymbol u}_2^{\prime}$) are the reciprocal lattice vectors, which can be used to calculate the actual lattice vectors [66], as
The direct and reciprocal lattice vectors of SHWS images captured with square lenslet arrays with and without a convex cylinder optometric lens are shown in Fig. 4.
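The DFT procedure above can be sketched as follows. The direct vectors are recovered from ${{\boldsymbol v}_i} \cdot {\boldsymbol u}_j^{\prime} = {\delta _{ij}}$ with frequencies in cycles/pixel; the peak-selection strategy (two strongest non-mirrored peaks, with small exclusion windows) and the window sizes are our simplification of the nearest-to-DC search described above:

```python
import numpy as np

def lattice_vectors_from_dft(img):
    """Estimate the two image-lattice vectors (in pixels) from the DFT of a
    raw SHWS image, using v_i . u_j = delta_ij (frequencies in cycles/pixel).
    Coarse by construction: each peak is quantized to one DFT bin."""
    n_y, n_x = img.shape
    F = np.abs(np.fft.fftshift(np.fft.fft2(img - img.mean())))
    cy, cx = n_y // 2, n_x // 2
    F[cy - 2:cy + 3, cx - 2:cx + 3] = 0.0      # suppress the region around DC
    peaks = []
    for _ in range(2):
        iy, ix = np.unravel_index(np.argmax(F), F.shape)
        peaks.append(np.array([(ix - cx) / n_x, (iy - cy) / n_y]))  # (fx, fy)
        F[iy - 2:iy + 3, ix - 2:ix + 3] = 0.0  # suppress this peak ...
        my, mx = 2 * cy - iy, 2 * cx - ix      # ... and its mirror about DC
        F[my - 2:my + 3, mx - 2:mx + 3] = 0.0
    U = np.vstack(peaks)          # rows: reciprocal vectors u1', u2'
    V = np.linalg.inv(U)          # columns: direct lattice vectors v1', v2'
    return V[:, 0], V[:, 1]
```

This sketch assumes the two fundamental lattice peaks dominate the spectrum; sub-bin interpolation around each peak would reduce the quantization error noted in the text.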
The value of this algorithm lies not in the estimation of the defocus and astigmatism coefficients, which is coarse due to the finite SHWS image sampling, but rather in facilitating the location of the SHWS lenslet images so that a group of pixels can be assigned to each of them. This can be achieved simply by finding a single lenslet image (e.g., the brightest), and then moving across the image by integer combinations of the lattice vectors. The resulting locations are then used as the centers of the groups of pixels over which the lenslet image centroids are estimated, allowing precise estimation of the wavefront.
Here it is important to note that if the wavefront has third or higher order polynomial components, the periodicity of the lenslet image lattice will degrade. When such distortion is small, that is, if the lenslet images remain within the cells associated with the previously estimated lattice vectors, then regions of interest on the pixelated sensor centered on the lattice cell centers will suffice to initiate the lenslet image centroid calculations.
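The two-step pre-centroiding described above, a seed spot plus integer lattice combinations, followed by per-spot centroiding, can be sketched as below; the search bound `n_max` and the ROI half-width are illustrative choices, and in practice the seed would be the brightest lenslet image:

```python
import numpy as np

def predicted_spot_centers(seed, v1, v2, shape, n_max=40):
    """Coarse spot locations: seed + integer combinations of the lattice
    vectors, kept if inside the sensor.  seed, v1, v2 in (x, y) pixels."""
    centers = []
    for m in range(-n_max, n_max + 1):
        for n in range(-n_max, n_max + 1):
            p = np.asarray(seed, float) + m * np.asarray(v1) + n * np.asarray(v2)
            if 0 <= p[0] < shape[1] and 0 <= p[1] < shape[0]:
                centers.append(p)
    return np.array(centers)

def refine_centroid(img, center, half_width):
    """Intensity-weighted centroid within a square ROI about a predicted
    center (x, y); returns the refined spot position in pixels."""
    x0, y0 = int(round(center[0])), int(round(center[1]))
    y1, y2 = max(y0 - half_width, 0), min(y0 + half_width + 1, img.shape[0])
    x1, x2 = max(x0 - half_width, 0), min(x0 + half_width + 1, img.shape[1])
    roi = img[y1:y2, x1:x2]
    yy, xx = np.mgrid[y1:y2, x1:x2]
    w = roi.sum()
    return np.array([(xx * roi).sum() / w, (yy * roi).sum() / w])
```

Each predicted center defines the pixel group on which the precise centroid, and hence the wavefront gradient sample, is computed.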
6. Experiments
Wavefronts with defocus and astigmatism were measured with a custom SHWS, depicted at the top of Fig. 5, consisting of an EXi Aqua camera (Teledyne Qimaging, Surrey, BC, Canada) and a lenslet array with 7.6 mm geometrical focal length and 300 µm pitch (Adaptive Optics Associates, now part of AOA Xinetics, Devens, MA, USA), focused to account for the low lenslet Fresnel number [18]. Light from a 637 nm S1FC637 laser diode (Thorlabs, Newton, NJ, USA) delivered by a single-mode optical fiber was collimated with an achromatic doublet to illuminate the first of three pupil planes with an approximately plane wavefront. An iris diaphragm, the optometric lenses, and the SHWS lenslet array were placed in the three pupil planes relayed by afocal telescopes formed by achromatic doublets, as shown in Fig. 5.
SHWS images were captured of wavefronts generated with spherical trial lenses of optical power ${P_D}$ over the ±12 D range in 1 D steps; due to the $0.43$ magnification $M$ between the lens plane and the lenslet array plane, the power at the lenslet array scaled to ±65.3 D (${P_D}^{\prime} = {P_D}/{M^2}$). The lattice vectors were calculated for each SHWS image and their components compared with the predictions of Eqs. (22) and (23), with the results plotted in Fig. 5. The spacing between the vertical light gray lines in this plot corresponds to the lenslet-bound dynamic range, calculated using Eq. (5) and modeling the lenslet image width as twice the full-width at half-maximum of a Gaussian beam (see Eq. (19) in Ref. [67]). The positive end of the optical dynamic range, denoted by the vertical orange line on the right, is approximately 13 times larger than the separation of the vertical gray lines (the lenslet-bound dynamic range). The data show that the dominant aberration, defocus (${b_4}$), is estimated with a maximum error of 11% over a range 8 times larger than that of the lenslet-bound definition.
We did not test larger positive defocus amplitudes because the metal housing in which the lenslet array was mounted vignetted the beam (outer lenslets). A small artifactual astigmatism (mean 2.3%) was estimated, likely due to trial lens centration inaccuracies and coarse DFT interpolation. Although the lowest end of the SHWS optical defocus dynamic range is $- \infty $, in practice this lower end is determined by when a lenslet image reaches the edge of the pixelated sensor's region of interest of width ${D_{\textrm{ROI}}}$, that is, when the spacing ${s_l}$ between adjacent lenslet images in the presence of defocus meets the condition,
This limit is shown in the plot in Fig. 5 as a vertical cyan line.
The results of a similar experiment using convex cylinder optometric lenses oriented at 45° to generate oblique astigmatism and defocus are shown in Fig. 6. Accounting for the pupil magnification, the optical powers at the SHWS lenslet array plane were between 5.4 and 32.7 D, spanning almost 6 times the lenslet-bound dynamic range.
7. Summary
A definition of the SHWS optical dynamic range in terms of avoiding lenslet image overlap was formalized and compared with the widely used definition based on restricting each lenslet image to a fixed group of pixels. The optical dynamic range is larger by a factor proportional to the number of lenslets across the pupil, and its formulation provides a method for calculating the minimum number of lenslets across the SHWS pupil required to measure a desired wavefront aberration amplitude. The proposed formulation of the optical dynamic range in terms of the extreme values of the directional wavefront curvature within circles inscribed in the lenslets is applicable to lenslets of any shape and array geometry, even if non-periodic.
A pre-centroiding algorithm based on the estimation of the SHWS image lattice vectors was proposed to facilitate lenslet image location in the presence of large defocus and astigmatism amplitudes. This algorithm was demonstrated using optometric trial lenses that displaced the SHWS lenslet images well beyond the projection of the lenslet boundary onto the SHWS pixelated sensor.
Appendix A
Let $W({x^{\prime},y^{\prime}} )$ be a wavefront with $x^{\prime}$ and $y^{\prime}$ unnormalized SHWS pupil coordinates, and x and y be the coordinates normalized by the SHWS pupil radius, that is, $x = 2x^{\prime}/{D_p}$ and $y = 2y^{\prime}/{D_p}$. Now, the gradient of the wavefront along x-axis is given by,
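That is (a reconstruction consistent with the normalization just defined, by the chain rule):

$$\frac{\partial W}{\partial x^{\prime}} = \frac{\partial W}{\partial x}\frac{\partial x}{\partial x^{\prime}} = \frac{2}{D_p}\frac{\partial W}{\partial x}$$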
Let us assume two identical adjacent lenslets with their centers along the x-axis and the second lenslet having the larger abscissa. Then, the difference between the lenslet image angular coordinate can be written as follows,
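Approximating the finite difference of the wavefront slopes over the lenslet separation ${d_l}$ by the second derivative (a reconstruction consistent with the approximation used in Section 3, where the normalized abscissa difference between the lenslet centers is $2{d_l}/{D_p}$):

$${\theta _{x,k + 1}} - {\theta _{x,k}} = \frac{2}{D_p}\left[ {\frac{\partial W}{\partial x}\bigg|_{k + 1} - \frac{\partial W}{\partial x}\bigg|_{k}} \right] \approx {\left( {\frac{2}{D_p}} \right)^2}{d_l}\,\frac{\partial ^2 W}{\partial x^2}$$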
Funding
National Eye Institute (P30EY026877, R01EY025231, R01EY031360, R01EY027301); Research to Prevent Blindness (Challenge Grant).
Acknowledgments
The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Disclosures
The authors declare no conflicts of interest.
References
1. J. Hartmann, “Bemerkungen über den bau und die justierung von spektrographen,” Z. Instrumentenkd 20, 47–58 (1900).
2. R. V. Shack and B. C. Platt, “Production and use of a lenticular Hartmann screen,” J. Opt. Soc. Am. 61(5), 656 (1971).
3. J. Liang, D. R. Williams, and D. Miller, “Supernormal vision and high-resolution retinal imaging through adaptive optics,” J. Opt. Soc. Am. A 14(11), 2884–2892 (1997). [CrossRef]
4. G. Y. Yoon and D. R. Williams, “Visual performance after correcting the monochromatic and chromatic aberrations of the eye,” J. Opt. Soc. Am. A 19(2), 266–275 (2002). [CrossRef]
5. A. Roorda, F. Romero-Borja, W. Donnelly III, H. Queener, T. Hebert, and M. Campbell, “Adaptive optics scanning laser ophthalmoscopy,” Opt. Express 10(9), 405–412 (2002). [CrossRef]
6. P. Artal, L. Chen, E. J. Fernández, B. Singer, S. Manzanera, and D. R. Williams, “Neural compensation for the eye's optical aberrations,” J. Vis. 4(4), 281–287 (2004). [CrossRef]
7. P. L. Wizinowich, D. Le Mignant, A. H. Bouchez, R. D. Campbell, J. C. Y. Chin, A. R. Contos, M. A. van Dam, S. K. Hartman, E. M. Johansson, R. E. Lafon, H. Lewis, P. J. Stomski, and D. M. Summers, “The W. M. Keck Observatory laser guide star adaptive optics system: Overview,” Publ. Astron. Soc. Pac. 118(840), 297–309 (2006). [CrossRef]
8. O. Azucena, J. Crest, S. Kotadia, W. Sullivan, X. Tao, M. Reinig, D. Gavel, S. Olivier, and J. Kubby, “Adaptive optics wide-field microscopy using direct wavefront sensing,” Opt. Lett. 36(6), 825–827 (2011). [CrossRef]
9. A. Dubra, Y. Sulai, J. L. Norris, R. F. Cooper, A. M. Dubis, D. R. Williams, and J. Carroll, “Noninvasive imaging of the human rod photoreceptor mosaic using a confocal adaptive optics scanning ophthalmoscope,” Biomed. Opt. Express 2(7), 1864–1876 (2011). [CrossRef]
10. M. J. Booth, “Adaptive optical microscopy: the ongoing quest for a perfect image,” Light: Sci. Appl. 3(4), e165 (2014). [CrossRef]
11. B. M. Levine, E. A. Martinsen, A. Wirth, A. Jankevics, M. Toledo-Quinones, F. Landers, and T. L. Bruno, “Horizontal line-of-sight turbulence over near-ground paths and implications for adaptive optics corrections in laser communications,” Appl. Opt. 37(21), 4553–4560 (1998). [CrossRef]
12. C. Forest, C. Canizares, D. Neal, M. McGuirk, and M. Schattenburg, “Metrology of thin transparent optics using Shack-Hartmann wavefront sensing,” Opt. Eng. 43(3), 742–753 (2004). [CrossRef]
13. B. Dörband, H. Müller, and H. Gross, “Handbook of Optical Systems,” Volume 5: Metrology of Optical Components and Systems, 1st ed. (John Wiley & Sons, 2012).
14. M. Mrochen, M. Kaemmerer, and T. Seiler, “Wavefront-guided laser in situ keratomileusis: early results in three eyes,” J. Refract. Surg. 16(2), 116–121 (2000).
15. M. Mrochen, M. Kaemmerer, and T. Seiler, “Clinical results of wavefront-guided laser in situ keratomileusis 3 months after surgery,” J. Cataract Refract. Surg. 27(2), 201–207 (2001). [CrossRef]
16. S. Schallhorn, M. Brown, J. Venter, D. Teenan, K. Hettinger, and H. Yamamoto, “Early clinical outcomes of wavefront-guided myopic LASIK treatments using a new-generation Hartmann-Shack aberrometer,” J Refract Surg 30(1), 14–21 (2014). [CrossRef]
17. M. Vinas, C. Benedi-Garcia, S. Aissati, D. Pascual, V. Akondi, C. Dorronsoro, and S. Marcos, “Visual simulators replicate vision with multifocal lenses,” Sci. Rep. 9(1), 1539 (2019). [CrossRef]
18. V. Akondi and A. Dubra, “Accounting for focal shift in the Shack–Hartmann wavefront sensor,” Opt. Lett. 44(17), 4151–4154 (2019). [CrossRef]
19. V. Akondi, S. Steven, and A. Dubra, “Centroid error due to non-uniform lenslet illumination in the Shack–Hartmann wavefront sensor,” Opt. Lett. 44(17), 4167–4170 (2019). [CrossRef]
20. V. Akondi and A. Dubra, “Average gradient of Zernike polynomials over polygons,” Opt. Express 28(13), 18876–18886 (2020). [CrossRef]
21. M. C. Roggemann and T. J. Schulz, “Algorithm to increase the largest aberration that can be reconstructed from Hartmann sensor measurements,” Appl. Opt. 37(20), 4321–4329 (1998). [CrossRef]
22. S. Groening, B. Sick, K. Donner, J. Pfund, N. Lindlein, and J. Schwider, “Wave-front reconstruction with a Shack–Hartmann sensor with an iterative spline fitting method,” Appl. Opt. 39(4), 561–567 (2000). [CrossRef]
23. V. Molebny, “Scanning Shack-Hartmann wavefront sensor,” Proc. SPIE 5412, 66–71 (2004). [CrossRef]
24. L. Seifert, H. J. Tiziani, and W. Osten, “Wavefront reconstruction with the adaptive Shack–Hartmann sensor,” Opt. Commun. 245(1-6), 255–269 (2005). [CrossRef]
25. H. Choo and R. S. Muller, “Addressable Microlens Array to Improve Dynamic Range of Shack–Hartmann Sensors,” J. Microelectromech. Syst. 15(6), 1555–1567 (2006). [CrossRef]
26. Y. Hongbin, Z. Guangya, C. F. Siong, L. Feiwen, and W. Shouhua, “A tunable Shack–Hartmann wavefront sensor based on a liquid-filled microlens array,” J. Micromech. Microeng. 18(10), 105017 (2008). [CrossRef]
27. M. Xia, C. Li, L. Hu, Z. Cao, Q. Mu, and X. Li, “Shack-Hartmann wavefront sensor with large dynamic range,” J. Biomed. Opt. 15(2), 1–10 (2010). [CrossRef]
28. R. Martínez-Cuenca, V. Durán, V. Climent, E. Tajahuerce, S. Bará, J. Ares, J. Arines, M. Martínez-Corral, and J. Lancis, “Reconfigurable Shack–Hartmann sensor without moving elements,” Opt. Lett. 35(9), 1338–1340 (2010). [CrossRef]
29. N. Kumar, A. Khare, and B. Boruah, “Enhanced dynamic range of the grating array based zonal wavefront sensor using a zone wise scanning method,” Proc. SPIE 11287, E1–E6 (2020). [CrossRef]
30. A. Carmichael Martins and B. Vohnsen, “Measuring ocular aberrations sequentially using a digital micromirror device,” Micromachines 10(2), 117 (2019). [CrossRef]
31. M. Aftab, H. Choi, R. Liang, and D. W. Kim, “Adaptive Shack-Hartmann wavefront sensor accommodating large wavefront variations,” Opt. Express 26(26), 34428–34441 (2018). [CrossRef]
32. G.-Y. Yoon, S. Pantanelli, and L. J. Nagy, “Large-dynamic-range Shack-Hartmann wavefront sensor for highly aberrated eyes,” J. Biomed. Opt. 11(3), 1–3 (2006). [CrossRef]
33. S. Pantanelli, S. MacRae, T. M. Jeong, and G. Yoon, “Characterizing the Wave Aberration in Eyes with Keratoconus or Penetrating Keratoplasty Using a High–Dynamic Range Wavefront Sensor,” Ophthalmology 114(11), 2013–2021 (2007). [CrossRef]
34. G. Yoon, “Large dynamic range Shack-Hartmann wavefront sensor,” U.S. patent, 7,414,712 (19 Aug. 2008).
35. S. Olivier, V. Laude, and J.-P. Huignard, “Liquid-crystal Hartmann wave-front scanner,” Appl. Opt. 39(22), 3838–3846 (2000). [CrossRef]
36. V. Laude, S. Olivier, C. Dirson, and J.-P. Huignard, “Hartmann wave-front scanner,” Opt. Lett. 24(24), 1796–1798 (1999). [CrossRef]
37. R. Navarro and E. Moreno-Barriuso, “Laser ray-tracing method for optical testing,” Opt. Lett. 24(14), 951–953 (1999). [CrossRef]
38. G. N. McKay, F. Mahmood, and N. J. Durr, “Large dynamic range autorefraction with a low-cost diffuser wavefront sensor,” Biomed. Opt. Express 10(4), 1718–1735 (2019). [CrossRef]
39. H. Shinto, Y. Saita, and T. Nomura, “Shack-Hartmann wavefront sensor with large dynamic range by adaptive spot search method,” Appl. Opt. 55(20), 5413–5418 (2016). [CrossRef]
40. X. J.-F. Levecq and S. H. Bucourt, “Method and device for analysing a highly dynamic wavefront,” U.S. patent, 6,750,957 (15 June 2004).
41. G. Altmann, “Method and apparatus for improving the dynamic range and accuracy of a Shack-Hartmann wavefront sensor,” U.S. patent application 10/013,565 (2003).
42. N. Lindlein and J. Pfund, “Experimental results for expanding the dynamic range of a Shack-Hartmann sensor by using astigmatic microlenses,” Opt. Eng. 41(2), 529–533 (2002). [CrossRef]
43. X. Wei, T. Van Heugten, and L. Thibos, “Validation of a Hartmann-Moiré Wavefront Sensor with Large Dynamic Range,” Opt. Express 17(16), 14180–14185 (2009). [CrossRef]
44. D. Podanchuk, V. Dan’ko, M. Kotov, J.-Y. Son, and Y.-J. Choi, “Extended-range Shack-Hartmann wavefront sensor with nonlinear holographic lenslet array,” Opt. Eng. 45(5), 053605 (2006). [CrossRef]
45. J. Ko and C. C. Davis, “Comparison of the plenoptic sensor and the Shack-Hartmann sensor,” Appl. Opt. 56(13), 3689–3698 (2017). [CrossRef]
46. Z. Gao, X. Li, and H. Ye, “Large dynamic range Shack–Hartmann wavefront measurement based on image segmentation and a neighbouring-region search algorithm,” Opt. Commun. 450, 190–201 (2019). [CrossRef]
47. C. Leroux and C. Dainty, “A simple and robust method to extend the dynamic range of an aberrometer,” Opt. Express 17(21), 19055–19061 (2009). [CrossRef]
48. L. Lundström and P. Unsbo, “Unwrapping Hartmann-Shack Images from Highly Aberrated Eyes Using an Iterative B-spline Based Extrapolation Method,” Optom. Vis. Sci. 81(5), 383–388 (2004). [CrossRef]
49. D. G. Smith and J. E. Greivenkamp, “Generalized method for sorting Shack-Hartmann spot patterns using local similarity,” Appl. Opt. 47(25), 4548–4554 (2008). [CrossRef]
50. D. G. Smith, “High dynamic range calibration for an infrared Shack-Hartmann wavefront sensor,” (University of Arizona, 2008).
51. S. Mauch and J. Reger, “Real-Time Spot Detection and Ordering for a Shack–Hartmann Wavefront Sensor With a Low-Cost FPGA,” IEEE Trans. Instrum. Meas. 63(10), 2379–2386 (2014). [CrossRef]
52. M. Ares, S. Royo, and J. Caum, “Shack-Hartmann sensor based on a cylindrical microlens array,” Opt. Lett. 32(7), 769–771 (2007). [CrossRef]
53. J. Lee, R. V. Shack, and M. R. Descour, “Sorting method to extend the dynamic range of the Shack–Hartmann wave-front sensor,” Appl. Opt. 44(23), 4838–4845 (2005). [CrossRef]
54. W.-W. Lee, J. H. Lee, and C. K. Hwangbo, “Increase of dynamic range of a Shack-Hartmann sensor by shifting detector plane,” Proc. SPIE 5639, 70–77 (2004). [CrossRef]
55. J. Pfund, N. Lindlein, and J. Schwider, “Dynamic range expansion of a Shack–Hartmann sensor by use of a modified unwrapping algorithm,” Opt. Lett. 23(13), 995–997 (1998). [CrossRef]
56. M. Rocktäschel and H. J. Tiziani, “Limitations of the Shack–Hartmann sensor for testing optical aspherics,” Opt. Laser Tech. 34(8), 631–637 (2002). [CrossRef]
57. C. E. Campbell, “The range of local wavefront curvatures measurable with Shack-Hartmann wavefront sensors,” Clin. Exp. Optom. 92(3), 187–193 (2009). [CrossRef]
58. J. W. Hardy, Adaptive Optics for Astronomical Telescopes (Oxford University Press, 1998).
59. G. Yoon, “Wavefront sensing and diagnostic uses,” in Adaptive Optics for Vision Science (Wiley, 2006), pp. 63–81.
60. A. Nikitin, J. Sheldakova, A. Kudryashov, G. Borsoni, D. Denisov, V. Karasik, and A. Sakharov, “A device based on the Shack-Hartmann wave front sensor for testing wide aperture optics,” Proc. SPIE 9754, 97540K (2016). [CrossRef]
61. Y. Saita, H. Shinto, and T. Nomura, “Holographic Shack-Hartmann wavefront sensor based on the correlation peak displacement detection method for wavefront sensing with large dynamic range,” Optica 2(5), 411–415 (2015). [CrossRef]
62. C. Curatu, G. Curatu, and J. Rolland, “Fundamental and specific steps in Shack-Hartmann wavefront sensor design,” Proc. SPIE 6288, 1–9 (2006). [CrossRef]
63. R. Rammage, D. Neal, and R. Copland, “Application of Shack-Hartmann wavefront sensing technology to transmissive optic metrology,” Proc. SPIE 4779, 161–172 (2002). [CrossRef]
64. L. N. Thibos, R. A. Applegate, J. T. Schwiegerling, and R. Webb, “Standards for reporting the optical aberrations of eyes,” J. Refract. Surg. 18(5), S652–S660 (2002).
65. J. W. Goodman, Introduction to Fourier Optics, 4th ed. (W. H. Freeman and Company, 2017).
66. C. Kittel, Introduction to Solid State Physics, 5th ed. (Wiley, 1976).
67. V. Akondi and A. Dubra, “Multi-layer Shack-Hartmann wavefront sensing in the point source regime,” Biomed. Opt. Express 12(1), 409–432 (2021). [CrossRef]