
Single-shot hybrid photoacoustic-fluorescent microendoscopy through a multimode fiber with wavefront shaping


Abstract

We present a minimally-invasive endoscope based on a multimode fiber that combines photoacoustic and fluorescence sensing. From the measurement of a transmission matrix during a prior calibration step, a focused spot is produced and raster-scanned over a sample at the distal tip of the fiber by use of a fast spatial light modulator. An ultra-sensitive fiber-optic ultrasound sensor for photoacoustic detection placed next to the fiber is combined with a photodetector to obtain both fluorescence and photoacoustic images with a distal imaging tip no larger than 250 µm. The high signal-to-noise ratio provided by wavefront-shaping-based focusing and the ultra-sensitive ultrasound sensor enables imaging with a single laser shot per pixel, demonstrating fast two-dimensional hybrid in vitro imaging of red blood cells and fluorescent beads.

Published by The Optical Society under the terms of the Creative Commons Attribution 4.0 License. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.

1. Introduction

Photoacoustic imaging is a promising modality for numerous biomedical applications [1]. It provides a contrast based on optical absorption and is either used as a label-free method to image endogenous light-absorbing molecules in tissues, such as hemoglobin, melanin and lipids, or combined with exogenous contrast agents to target specific structures. When absorbers are illuminated by light of time-varying intensity (such as provided by pulsed or modulated laser sources), photoacoustic waves are generated through the thermoelastic effect. The photoacoustic waves can then be detected by ultrasound detectors outside the sample. This modality can also provide depth information for 3D imaging. Two regimes can be distinguished for photoacoustic imaging: in the first, limited to superficial depths because of optical scattering, a focused laser beam is scanned within the sample and an image is formed point-by-point with optical resolution; in the second, the sample is excited with wide-field illumination and an image is reconstructed deep in tissue with acoustic resolution from the detection of unscattered photoacoustic waves at multiple locations. In both regimes, the spatial resolution of photoacoustic images degrades with increasing depth, due to light scattering in the optical-resolution regime and sound attenuation in the acoustic-resolution regime. In photoacoustic imaging, the depth-to-resolution ratio is limited to around 200; e.g., a 1 µm resolution can only be obtained at depths no greater than about 200 µm.

In this context, photoacoustic endoscopy appears as an alternative approach to retain high imaging resolution at depth, at the cost of an invasiveness which should thus be minimized. A number of photoacoustic endoscopic probes have been reported, providing either acoustic-resolution [2–6] or optical-resolution [7–14] imaging. They generally include a fiber to bring the laser light to the imaging tip, a scanning mirror, an ultrasonic transducer, and, in the case of optical-resolution systems, a graded-index (GRIN) lens and/or fiber bundle to focus the excitation light. The footprint of such probes is usually in the millimeter range. Although this is adequate for a number of biomedical applications (such as intravascular [5,6,9], gastrointestinal tract [2,3] or ovarian [4] imaging), the insertion of a mm-sized probe would induce detrimental tissue damage in other cases, e.g., when studying neuro-vascular activity in the brain of small animals. In such cases, reducing the footprint of photoacoustic endoscope probes is an important requirement.

Fiber bundles are most commonly used to perform endoscopy with optical resolution, but their footprint remains on the order of a millimeter. For an equivalent information content, multimode fibers (MMF) have a much smaller diameter (typically 20 to 30 times smaller in area) than fiber bundles and have been implemented as ultrathin endoscope probes in various all-optical modalities, such as reflection [15,16] and confocal [17] microscopy, optical trapping [18], fluorescence [19–25], Raman [26], two-photon [27] and light-sheet [28] microscopy. MMF have also been used in photoacoustic imaging to deliver excitation light, combined with an external ultrasound transducer [29,30]. The guided modes in MMF have different phase velocities, and consequently coherent light injected into a MMF generally emerges as a speckle pattern. Although speckle illumination may be exploited directly in some cases [16,31], most applications require imaging the sample by scanning a focused excitation spot at the MMF output, which can be obtained by wavefront shaping. Initially developed to focus light through scattering media [32], wavefront shaping consists in manipulating the complex wavefront injected into a MMF in order to obtain a given output field pattern. It relies on the existence of a linear and deterministic relation between the incident and transmitted fields, which needs to be evaluated. Several techniques have been employed to implement wavefront shaping, such as iterative optimization [33–35], digital phase conjugation [27,36], or transmission matrix measurement [15,17,37]. The input wavefront phase modulation may be performed using either liquid-crystal spatial light modulators (LC-SLM) or, more recently, digital micro-mirror devices (DMD) [21], which offer higher speed but at the cost of light efficiency, as they are binary amplitude modulators [38].

Recently, ultrathin optical-resolution photoacoustic endoscopy has been demonstrated with both a capillary waveguide [39] and a MMF [31]. In Ref. [39], the excitation light guided through a 350 µm-wide silica capillary tube is focused by digital phase conjugation at the endoscope distal tip, while the capillary core, filled with water, carries the acoustic waves back towards the endoscope input where a transducer is placed. However, only relatively large objects (such as nylon threads) have been imaged, and the length of the device is limited to a few cm by sound attenuation in the water core. In Ref. [31], a hybrid photoacoustic-fluorescence endoscope based on a MMF for light excitation and a fiber ultrasound sensor for acoustic detection was presented. The proposed approach involved illuminating the sample with a set of ‘natural’ speckles obtained at the MMF output by projecting random intensity patterns at its input with a digital micromirror device (DMD). Photoacoustic and fluorescence signals are detected for each of these known speckles and an image is reconstructed by solving an inverse problem under a sparsity constraint. This strategy is simple to implement and can be combined with compressive sensing to reduce the number of measurements relative to the number of pixels in the final image. However, since the excitation power is spread over the whole field of view, the detected signals have a low signal-to-noise ratio (SNR) and a large number of laser pulses needs to be averaged, which limits the imaging speed. In addition, compressive sensing methods are only applicable to sparse samples.

Previous ultrathin photoacoustic endoscopes presented relatively low SNR, requiring the acoustic signals to be averaged over many laser pulses, at the cost of long acquisition times and consequently a very low temporal resolution. Here we present an alternative MMF-based photoacoustic endoscopy approach that improves the measurement SNR and the imaging speed. Using wavefront shaping, the excitation laser is focused at the distal tip of the MMF, which enhances the power density by several orders of magnitude compared to speckle illumination [31]. This focus can then be raster-scanned to image the sample. The transmission matrix (TM) method is used to determine the MMF light-transport properties (as opposed to digital phase conjugation in Ref. [39]): this technique is both easy to implement and versatile since, once measured, the TM enables one to create any pattern at the fiber output. Another crucial aspect is the acoustic detection: we use a fiber-optic ultrasound sensor (FOS) that consists of a single-mode optical fiber with a polymer microcavity, acting as a Fabry-Pérot interferometer, at its tip. Such a FOS is particularly attractive for our approach as its footprint is that of a single-mode fiber. As the FOS and the MMF are set next to each other with the sample close to the MMF tip, the acoustic waves propagate from the sample to the FOS at large angles. Compared to the commercial FOS used in Ref. [31], the novel FOS used here relies on a plano-concave polymer microresonator and offers a much higher sensitivity and a much larger angular acceptance over its whole bandwidth.

In this work, we demonstrate that, by combining wavefront shaping to focus through a MMF with an ultra-sensitive FOS, photoacoustic imaging can be performed with a single laser pulse per pixel, through a 250$\times$125 µm$^2$ footprint endoscope. This is an important step towards fast photoacoustic endoscopy. In addition, we show that fluorescence imaging can be performed with the exact same setup, enhanced with a photomultiplier for fluorescence detection. Combining photoacoustic and fluorescence imaging into a single device is particularly attractive to offer a flexible imaging tool sensitive to different optical properties. For instance, vascular dynamics can be measured using photoacoustic imaging while neuronal activity can be monitored using the fluorescence signal emitted by calcium indicators. As a proof of concept, this bimodal endoscope is used here to image red blood cells and fluorescent beads.

2. Experimental methods

2.1 Hybrid photoacoustic-fluorescence probe

The endoscope probe consists of a multimode fiber (MMF), used both to guide the excitation light and to collect the fluorescence signal, and a fiber-optic ultrasound sensor (FOS), placed side-by-side, used to measure the photoacoustic signals from the sample. The MMF and the FOS are held together inside a cannula, with about 15 mm of both fibers emerging out of the cannula. The total footprint of the tip is 250$\times$125 µm$^2$, as shown in Fig. 1(b-c). The MMF is a $\sim$8 cm-long segment of 62.5 µm-core and 125 µm-cladding graded-index (GRIN) fiber. GRIN fibers, due to weaker mode coupling compared to step-index fibers, exhibit light transport properties that are more robust to deformation/bending [22,40]. The focus created through such a fiber is thus expected to be more stable.


Fig. 1. (a) Experimental setup. SLM: Spatial Light Modulator, (P)BS: (Polarizing) Beam Splitter. L: Lens, P: Polarizer, QWP: Quarter Wave Plate. F: Filter. MMF: Multimode Fiber. FOS: Fiber-Optic ultrasound Sensor. DM: Dichroic Mirror. (b) Zoom of the setup at the endoscope tip in the presence of the sample. PBS: Phosphate-buffered saline. Dimensions in micrometers. (c) Photograph of the endoscope tip.


The FOS is based on a single-mode fiber (10 µm core diameter and 125 µm cladding) with a plano-concave polymer microresonator on its tip [41]. Acoustic waves incident on the cavity modulate its optical thickness and thus its reflectivity, which is detected using an interrogation laser beam coupled into the fiber. The sensor provides a wide acoustic bandwidth of 1–40 MHz. This new design offers a strong optical confinement which leads, compared to a planar cavity sensor, to a significantly larger sensitivity as well as a wider directivity [41,42]. For a 65 µm-wide sample located 100 µm away from the endoscope tip and aligned with the MMF axis (as in the following experiments), acoustic waves are incident on the microcavity with angles up to $\sim$60$^{\circ }$ from normal incidence. This novel plano-concave cavity exhibits effective omnidirectionality with high sensitivity up to $\pm$90$^{\circ }$ over the whole 1–40 MHz bandwidth, whereas the response of a planar cavity sensor is non-uniform for frequencies above 10 MHz (with drops of up to 25 dB) [42]. The combination of high sensitivity and omnidirectionality provided by the plano-concave cavity design results in a high acoustic SNR, thus enabling single-shot photoacoustic detection at each illumination point without signal averaging.
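As a rough consistency check of this geometry, the short sketch below (an illustration only, assuming the MMF and FOS axes are separated by one cladding diameter, i.e., 125 µm center to center, and that both fiber tips lie in the same plane) estimates the incidence angles on the sensor for acoustic sources spanning the 65 µm field of view located 100 µm below the tip.

```python
import numpy as np

# Assumed geometry: MMF and FOS (125 um cladding each) held side by side,
# so their axes are taken to be ~125 um apart; the sample plane lies
# 100 um below the fiber tips and is centered on the MMF axis.
fiber_spacing_um = 125.0    # assumed center-to-center distance MMF <-> FOS
standoff_um = 100.0         # distance from the tip to the imaging plane
fov_radius_um = 65.0 / 2.0  # 65 um-wide field of view around the MMF axis

# Lateral offsets of the near and far edges of the FOV from the FOS axis
near_edge_um = fiber_spacing_um - fov_radius_um
far_edge_um = fiber_spacing_um + fov_radius_um

theta_near = np.degrees(np.arctan2(near_edge_um, standoff_um))
theta_far = np.degrees(np.arctan2(far_edge_um, standoff_um))
print(f"incidence angles on the FOS: {theta_near:.0f} to {theta_far:.0f} deg")
# -> roughly 43 to 58 deg, consistent with the ~60 deg quoted above
```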

2.2 Experimental setup

The experimental setup is presented in Fig. 1(a). Light pulses delivered by a $\lambda =532$ nm Q-switched laser (Cobolt Tor 400), with a repetition rate of $f_{\textrm {rep}}=7$ kHz and a pulse duration of $\sim$5 ns, are divided into two beams. One beam is used as the reference beam for calibration (see ‘calibration part’ in Fig. 1(a)). The other beam provides the excitation light for photoacoustic and fluorescence imaging. The excitation beam is first expanded (through lenses $L1$ and $L2$, Fig. 1) and directed at normal incidence, via a beamsplitter cube (50:50), onto a fast LC-SLM for phase modulation (Meadowlark HSP512L-532, 512$\times$512 pixels, 380 Hz refresh rate). The beam transmitted by the cube is reflected by a dichroic mirror. Then, after going through a telescope ($f_{L3}$=100 mm and $f_{L4}$=50 mm focal-length achromatic doublets), the beam is circularly polarized by a quarter-wave plate, as circular polarization is well preserved when propagating through a straight fiber. It is then focused by an aspheric lens (Thorlabs AL12106-A, $f_{L5}$=10 mm) into an 8 cm-long section of multimode fiber (Thorlabs GIF625-10). The output facet of the MMF and the FOS tip, held next to each other, are immersed in water or phosphate-buffered saline (PBS), and the sample surface is placed at a distance of $\sim$100 µm below the probe. This distance was chosen as a good compromise between reducing the angle between the acoustic waves generated by the RBCs and the axis of the FOS, and staying close enough to maintain a high numerical aperture (NA) for good focusing (see Sec. 3.1).

The sample consists of red blood cells (RBCs) and fluorescent beads (1 µm diameter, Invitrogen F8851) in a glass-bottom well and is prepared in the following way. First, an aqueous solution of fluorescent beads is deposited on the glass slide at the bottom of a clean well. After a few minutes, the solution is air-dried, leaving only the beads adsorbed on the glass surface. Then a solution of RBCs (blood diluted $\sim$6000 times) in PBS is deposited in the well and the RBCs are left to sediment before acquisition. Hence, both fluorescent beads and RBCs lie on the bottom surface of the well (see Fig. 1(b)).

Fluorescence light emitted by the beads is collected by the MMF and travels back along the same path until it is transmitted by the dichroic mirror and detected by a photomultiplier tube (PMT, Hamamatsu R647) through a long-pass filter (Chroma ET575lp). The signal from the PMT is then sent to a current preamplifier (Stanford Research Systems SR570) before acquisition. The photoacoustic signal detection is performed as detailed in Ref. [41]. The signals from the PMT and the FOS are then acquired by a USB oscilloscope (TiePie Handyscope HS6) on a computer.

In order to measure the transmission matrix used for wavefront shaping, one needs to measure the complex optical field at the MMF output. The amplitude and phase of the speckle in the output plane can be extracted through off-axis holography [43], i.e., by measuring the interference between the fiber output field and a reference beam. To that end, the output field of the MMF is imaged by an objective lens (20x Mitutoyo Plan Apochromat, NA=0.42) and an achromatic doublet ($f_{L6}$=200 mm). Its circular polarization is converted back into linear polarization. It is then merged with the reference beam by a non-polarizing beamsplitter. The interference intensity pattern is recorded by a CMOS camera (Basler acA1300-200um, see ‘calibration part’ in Fig. 1(a)). The output field is retrieved, in amplitude and phase, from the interferogram in the Fourier domain by extracting the spatial frequencies around the first diffraction order. During the calibration procedure, the sample is translated so that the output light from the MMF illuminates an area with neither beads nor RBCs (right side of the sample in Fig. 1(b)).
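The field-extraction step can be summarized by the following sketch (a minimal illustration of standard off-axis holography processing, not the exact code used here; the carrier position and window size are assumptions): the interferogram is Fourier transformed, a window around the first diffraction order (the carrier set by the reference-beam tilt) is cropped and re-centered, and an inverse transform returns the complex output field.

```python
import numpy as np

def extract_field(interferogram, carrier_px, window_px):
    """Recover the complex field from an off-axis hologram.

    interferogram : 2D array, camera image of |E_out + E_ref|^2
    carrier_px    : (row, col) position of the first diffraction order in the
                    centered FFT (set by the reference-beam tilt); assumed known
    window_px     : half-size of the square window kept around that order
    """
    spectrum = np.fft.fftshift(np.fft.fft2(interferogram))
    r0, c0 = carrier_px
    # Keep only the +1 order, which carries E_out multiplied by conj(E_ref)
    cropped = spectrum[r0 - window_px:r0 + window_px, c0 - window_px:c0 + window_px]
    # Re-center the order and transform back: amplitude and phase of E_out
    return np.fft.ifft2(np.fft.ifftshift(cropped))

# Hypothetical usage on one camera frame:
# field = extract_field(holo, carrier_px=(380, 520), window_px=64)
# amplitude, phase = np.abs(field), np.angle(field)
```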

2.3 Calibration procedure

The MMF is calibrated using the transmission matrix (TM) approach [44]. It consists in measuring the deterministic relation that connects the optical field in some imaging plane at the distal tip of the MMF (expressed in a set of output modes) to the input field at the proximal tip, decomposed on a set of input modes. In our case, the input modes are defined as plane waves in the SLM plane with various $\vec {k}$ directions. This set of input modes corresponds to focused spots at the input facet of the MMF, defined across a 60$\times$60 orthogonal grid (with 1.17 µm spacing). Retaining only the input spots that fall onto the MMF core reduces the number of effective input modes to approximately $N_{in}=2500$. This roughly corresponds to the number of guided modes in the GRIN MMF if we take, to first approximation, this number to be $N_{\textrm {MMF}}=(1/4)\cdot (\pi D \textrm {NA} / \lambda )^2=2600$, with $D$ the MMF core diameter (62.5 µm). The output modes are defined over a grid with $\sim$1.3 µm spacing in the imaging plane, which is set 100 µm from the fiber distal end (named ‘output plane’ in the following). The image field of view is limited to a circular region of $\sim$65 µm diameter, leading to a number of output modes of about $N_{out}=1800$. The size of this region was chosen to be small enough to limit the TM acquisition and computation time and to ensure a good focusing quality (see Sec. 3.1). Prior to the acquisition of the transmission matrix, the MMF input facet is centered by an automatic procedure, consisting of iteratively scanning a focused spot along two orthogonal directions of the MMF entrance plane, using the SLM, while monitoring the integrated intensity at the MMF output. The input modes are then defined around this center position. Note that the MMF is positioned so that the specular reflection from the SLM, corresponding to unmodulated light, is focused outside the fiber core and is not guided to the sample, i.e., the MMF axis is slightly shifted from the main optical axis.

Each column of the transmission matrix $K$ is acquired by displaying on the SLM a linear phase ramp that produces a plane wave with a given $\vec {k}$, corresponding to an input mode, and by extracting the associated amplitude and phase in the output plane through off-axis holography (see Sec. 2.2) [43]. The complex optical field values at each of the pixels defined as output modes then form one column of the TM. Once this is repeated for all input modes, the TM acquisition is complete, and the relation between the input field $E_{in}$ and the output field $E_{out}$ can be expressed as $E_{out}=K\cdot E_{in}$. By setting

$$E_{in}=K^\dagger\cdot E_{out},$$
where $^\dagger$ represents the transpose conjugate [44], one can design input fields to obtain any desired output pattern. To raster scan the sample, one needs to generate individual output modes (focused spots). In this case, $E_{out}$ is a basis vector with only one non-zero element. The computed input fields, which are combinations of the input modes (in the tilted-plane-wave basis), are defined in amplitude and phase, but only the phase patterns are stored to be displayed on the SLM, since the latter is a phase-only modulator. This entire calibration procedure is performed within 2-3 minutes. In principle, a calibration stays valid as long as the MMF position/deformation and the ambient conditions are unchanged. In practice, we found that the TM stays valid for more than an hour under our experimental conditions. After calibration, in order to perform imaging, the sample is translated back so that the MMF illuminates a region with fluorescent beads and red blood cells (RBCs).
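As an illustration of this step, the sketch below (with stand-in arrays and toy dimensions; the real experiment uses the measured TM with ~2500 input and ~1800 output modes) computes the phase-only SLM mask that focuses light onto one chosen output pixel, by applying $K^\dagger$ to a basis vector and superposing the corresponding tilted plane waves.

```python
import numpy as np

def slm_mask_for_focus(K, target_pixel, kx, ky, X, Y):
    """Phase-only SLM mask focusing light onto one output pixel.

    K            : (N_out, N_in) measured transmission matrix; the input basis
                   consists of tilted plane waves in the SLM plane
    target_pixel : index of the desired output mode (focus position)
    kx, ky       : (N_in,) transverse wavevectors of the input plane waves
    X, Y         : 2D coordinate grids over the SLM aperture
    """
    # E_out is a basis vector with a single non-zero element, so
    # E_in = K^dagger . E_out reduces to the conjugate of one row of K.
    weights = np.conj(K[target_pixel, :])
    field = np.zeros(X.shape, dtype=complex)
    for w, kxi, kyi in zip(weights, kx, ky):   # superpose the weighted plane waves
        field += w * np.exp(1j * (kxi * X + kyi * Y))
    return np.angle(field)                     # phase-only SLM: discard amplitude

# Toy usage with random stand-ins for the measured quantities:
rng = np.random.default_rng(0)
N_in, N_out, n_slm = 50, 40, 64                # far fewer modes than the experiment
K = rng.standard_normal((N_out, N_in)) + 1j * rng.standard_normal((N_out, N_in))
kx, ky = rng.uniform(-1, 1, N_in), rng.uniform(-1, 1, N_in)
X, Y = np.meshgrid(np.arange(n_slm), np.arange(n_slm))
mask = slm_mask_for_focus(K, target_pixel=10, kx=kx, ky=ky, X=X, Y=Y)
```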

3. Results

3.1 Characterization of the focused spots

We first provide a characterization of the focused spots obtained in the imaging plane (100 µm away from the endoscope tip, where the sample is placed). This characterization is important because, due to the point-by-point imaging approach, the resolution of both the photoacoustic and fluorescence images is governed by the optical resolution at each focus point. Unlike the fluorescence modality, photoacoustics can also provide 3D images, as the temporal resolution of the PA signals yields an axial resolution dictated by the acoustic bandwidth. The axial resolution was, however, not characterized in this work, which is limited to 2D samples. The endoscope tip is immersed in a glass-bottom well, identical to the one used for the imaging experiments (Sec. 3.2), filled with distilled water but containing neither fluorescent beads nor RBCs. An image of the spot is acquired with the CMOS camera for every position in the scanned region (about 68 µm in diameter with a 1.3 µm spacing). For each of the $\sim$2000 points, the experimental image of the focus is fitted to the following Gaussian-like profile:

$$F(x,y)=A \exp\left(-\frac{\left[\cos(\theta)x+\sin(\theta)y-x_0\right]^2}{2{\sigma_1}^2} -\frac{\left[-\sin(\theta)x+\cos(\theta)y-y_0\right]^2}{2{\sigma_2}^2}\right),$$
where $x$ and $y$ are the two axes of the scan (see Fig. 2(a)), $A$ is an amplitude term, $\theta$ is the orientation angle of the focus spot, $(x_0,y_0)$ are the coordinates of its fitted center, and $\sigma _{1,2}$ are the two (fitted) Gaussian RMS widths. The full width at half maximum (FWHM) of the focused spot in the two directions is then calculated: $a_{1,2}=2\sqrt {2\textrm {ln}(2)}\sigma _{1,2}$. For each spot, the smaller and larger widths are identified and labeled $a_s$ and $a_\ell$, respectively. Figure 2 presents the maps of $a_s$ (Fig. 2(a)) and $a_\ell$ (Fig. 2(b)), as well as the spot orientation ($\theta$, Fig. 2(c)), for every point of the scanned area, together with three typical examples of the beam profile in the center and at the rim of the area. In the orientation map, an intensity scale, proportional to the degree of anisotropy (estimated by $r=(a_\ell -a_s)/(a_\ell +a_s)$), is added to the colormap to darken the areas where the spot is more isotropic (i.e., when $r\to 0$).
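A minimal fitting routine along these lines (a sketch using scipy with placeholder names, not the authors' analysis code) is:

```python
import numpy as np
from scipy.optimize import curve_fit

def rotated_gaussian(coords, A, theta, x0, y0, sigma1, sigma2):
    """Elliptical Gaussian of Eq. (2): rotated axes, fitted center (x0, y0)."""
    x, y = coords
    u = np.cos(theta) * x + np.sin(theta) * y - x0
    v = -np.sin(theta) * x + np.cos(theta) * y - y0
    return A * np.exp(-u**2 / (2 * sigma1**2) - v**2 / (2 * sigma2**2))

def spot_widths(image, pixel_um):
    """Fit one focus image and return its two FWHMs (a_s, a_l) in micrometers."""
    ny, nx = image.shape
    x, y = np.meshgrid(np.arange(nx) * pixel_um, np.arange(ny) * pixel_um)
    coords = np.vstack([x.ravel(), y.ravel()])
    p0 = [image.max(), 0.0, x.mean(), y.mean(), 1.0, 1.0]   # crude initial guess
    popt, _ = curve_fit(rotated_gaussian, coords, image.ravel(), p0=p0)
    fwhm = 2 * np.sqrt(2 * np.log(2)) * np.abs(popt[4:6])   # a_{1,2} from sigma_{1,2}
    return np.sort(fwhm)                                    # (a_s, a_l)

# Hypothetical usage: a_s, a_l = spot_widths(spot_image, pixel_um=0.13)
```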


Fig. 2. (a-c) Mapping of the FWHM and beam orientation of the focused spots as a function of their position. Values extracted from 2D Gaussian-like fit with (a) the small axis $a_s$, (b) the large axis $a_\ell$, (c) the orientation angle combined with a contrast scale bar proportional to $r=(a_\ell -a_s)/(a_\ell +a_s)$. (d-f) Experimental beam profile measured at positions marked in (a-c). The blue lines show the Gaussian fits to the experimental values (crosses).


From Fig. 2(a-b), it appears that the beam profile is homogeneous and isotropic over a $\sim$50 µm diameter area, with $a_s=1.6\pm 0.1$ µm and $a_\ell=1.7\pm 0.1$ µm. While the smaller width remains roughly constant over the whole scanned area, the larger one increases noticeably at the rim of the area, as can be seen in Fig. 2(b) as well as in Fig. 2(h,l), where $a_\ell /a_s$ reaches values up to 1.5. The orientation map (Fig. 2(c)) reveals that the spot is elongated along the radial direction away from the center of the field of view (FOV). This anisotropic elongation is caused by a decrease of the effective NA away from the optical axis, which occurs because the imaging plane lies at some distance beyond the MMF end face (in our case 100 µm). Indeed, when the focus is on the optical axis, the rays emerging from the fiber tip converge onto it in a symmetrical cone, but, when the focus moves away from the axis, this cone is cropped on one side by the edge of the fiber core. Hence, the effective NA is reduced in the radial plane and the spot becomes elongated. Near the center of the field of view, the smallest spot size measured is 1.5 µm. However, the NA of the incident beam (NA$\sim$0.25), lower than the nominal NA of the GRIN fiber (0.29), leads in first approximation to a diffraction-limited spot of FWHM $\lambda /(2\textrm {NA})\simeq 1.1$ µm. Even on the optical axis, the effective NA decreases with distance from the MMF tip. Since the NA decreases with radial position in GRIN fibers, light rays emerging from the fiber at larger distances from the center subtend a narrower range of angles. Considering a focus on the optical axis at some distance beyond the fiber end face, the effective NA is limited by the most tilted rays emerging from the fiber that can reach the focus. For a plane at a distance of 100 µm, a maximum NA of 0.25 and a fiber with a perfectly parabolic index profile, the effective NA is estimated to be 0.19, leading to a spot FWHM of $\sim$1.4 µm, which is close to our experimental value. In the imaging experiments of the next section, the diameter of the imaged area is limited to $\sim$64 µm to avoid the more anisotropic cases at the rim of the disc. The intensity of the focal spot was highest around the fiber axis, and decreased by about a factor of 2 towards the periphery of the FOV. The maximum focusing efficiency, defined as the ratio between the energy in the focal spot and the total energy in the focal plane, was on the order of $30 \%$. The enhancement factor, defined as the ratio between the focal intensity and the mean background value, was typically on the order of 500.
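The two focusing metrics quoted above follow directly from a focal-plane camera frame; the sketch below (an illustration with a placeholder image, assuming the focal spot is taken as a small disc around the intensity maximum) applies the definitions given in the text, together with the diffraction-limited FWHM estimate.

```python
import numpy as np

def focusing_metrics(focal_plane, spot_radius_px):
    """Focusing efficiency and enhancement factor of a focal-plane image.

    efficiency  = energy inside the focal spot / total energy in the plane
    enhancement = peak intensity / mean background intensity outside the spot
    """
    peak = np.unravel_index(np.argmax(focal_plane), focal_plane.shape)
    yy, xx = np.indices(focal_plane.shape)
    in_spot = (yy - peak[0])**2 + (xx - peak[1])**2 <= spot_radius_px**2
    efficiency = focal_plane[in_spot].sum() / focal_plane.sum()
    enhancement = focal_plane[peak] / focal_plane[~in_spot].mean()
    return efficiency, enhancement

# Diffraction-limited FWHM for the estimated effective NA of 0.19 at 532 nm:
print(0.532 / (2 * 0.19))   # ~1.4 um, as quoted above
```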

3.2 Imaging results

Once the calibration is complete, one can focus on a RBC or a fluorescent bead to acquire a photoacoustic or a fluorescence signal, respectively, upon absorption of the 532 nm laser pulse. Figures 3(a) and 3(b) present a typical example of both of these signals, each generated by a single laser pulse. (The time $t=0$ indicates when the laser trigger is detected by the USB oscilloscope, but a delay is observed on both the photoacoustic and fluorescence signals due to the amplifiers used in each channel. Moreover, the fluorescence pulse is also lengthened by the amplifier at the output of the photomultiplier tube.) Figure 3(c) is a bright-field image of the raster-scanned area, acquired by the CMOS camera (see Fig. 1) with an incoherent light source. This proof-of-principle sample contains RBCs of $\sim$6 µm diameter (two in the middle and one partially included in the bottom-right part) and about twenty 1 µm fluorescent beads. The sample is scanned twice over $\sim$1800 points spaced by 1.3 µm: a first time at low laser pulse energy for fluorescence imaging (total energy per pulse transmitted through the fiber $E_{\mathrm {pulse}}\sim 20$ nJ), and a second time at higher energy per pulse for photoacoustic imaging ($E_{\mathrm {pulse}}\simeq 400$ nJ). The corresponding fluence at the focal spots was on the order of a few hundred mJ/cm$^2$ for fluorescence imaging and a few J/cm$^2$ for photoacoustic imaging. This double acquisition is required because of the difference between the excitation energy suited for fluorescence, which should be low enough to avoid optical saturation of the fluorophores and photobleaching, and that required for a single RBC to generate acoustic waves strong enough to be detected. Optical saturation of the fluorophores was observed but no photobleaching was induced, and photoacoustic scanning could be performed after fluorescence scanning.
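As a rough consistency check (an estimate only, assuming a ~1.7 µm FWHM spot and the ~30% focusing efficiency reported in Sec. 3.1), the fluences quoted above can be recovered from the pulse energies:

```python
import numpy as np

def focal_fluence(pulse_energy_J, fwhm_um, focusing_efficiency=0.3):
    """Approximate focal fluence: energy delivered to the spot / FWHM-disc area."""
    energy_in_spot_J = focusing_efficiency * pulse_energy_J
    area_cm2 = np.pi * (0.5 * fwhm_um * 1e-4) ** 2   # disc of diameter = FWHM, in cm^2
    return energy_in_spot_J / area_cm2               # J/cm^2

print(focal_fluence(20e-9, 1.7))    # ~0.26 J/cm^2: a few hundred mJ/cm^2 (fluorescence)
print(focal_fluence(400e-9, 1.7))   # ~5.3 J/cm^2: a few J/cm^2 (photoacoustics)
```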


Fig. 3. (a) Fluorescence signal acquired on a fluorescent bead, together with the measured background. (b) Acoustic signal acquired on a red blood cell. (a-b) $t=0$ corresponds to the laser trigger detection. (c) Bright-field image of the sample. RBC: Red Blood Cell, FB: Fluorescent Bead. The dotted line delimits the scanned area. (d) Fluorescence, (e) photoacoustic and (f) composite maximum-value images acquired with single-shot-per-pixel illumination.


We can observe that, thanks to the high power density in the focused spot and the high sensitivity to large angles afforded by the FOS detector, both the fluorescence and photoacoustic signals display a high SNR. This confirms the possibility of performing fluorescence and photoacoustic endoscopy with a single shot per pixel, as shown by the images of Fig. 3(d)-(f).

Figure 3(d) displays the image obtained by taking the maximum of the fluorescence signal acquired at each scan position after background subtraction. The background (shown in Fig. 3(a)) is estimated by averaging the signal in regions with no beads. Its origin is likely fluorescence from the fibers’ polymer coating. Although we removed the coating of the MMF over its entire length, the single-mode fiber (of the FOS) that is adjacent to the MMF was only stripped over 1-2 centimeters, and the remaining polymer coating could be excited by light leaking from the MMF. It was checked that this background disappears in the absence of the FOS. Thus, this background signal was subtracted from each signal before extracting the maximum value in Fig. 3(d). Figure 3(e) shows the photoacoustic image obtained by displaying the maximum absolute value of the signal measured by the FOS at each scan position. Single RBCs are clearly detected with a large SNR, and the absence of background in this image is noticeable. Obtaining a photoacoustic image of single red blood cells with such a high SNR and a single shot per pixel demonstrates the potential of our new endoscope and is very promising for medical applications. Finally, in Fig. 3(f), the fluorescence and photoacoustic signals are combined in an image showing the two modalities. The comparison with the bright-field image (Fig. 3(c)) shows good agreement over the whole image.
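Schematically, the image formation used for Figs. 3(d)-(e) amounts to the following sketch (operating on hypothetical arrays of per-pixel time traces stacked in scan order, not the authors' processing code):

```python
import numpy as np

def make_images(fluo_traces, pa_traces, background, scan_shape):
    """Form fluorescence and photoacoustic images from per-pixel time traces.

    fluo_traces, pa_traces : (N_pixels, N_samples) recorded signals, one row per
                             scan position (assumed ordered row by row)
    background             : (N_samples,) mean fluorescence trace measured on
                             bead-free positions
    scan_shape             : (rows, cols) of the raster scan; in the experiment
                             the positions fill a disc, so a position-to-pixel
                             map would replace the plain reshape used here
    """
    fluo = np.max(fluo_traces - background, axis=1)   # background-subtracted peak
    pa = np.max(np.abs(pa_traces), axis=1)            # peak absolute acoustic amplitude
    return fluo.reshape(scan_shape), pa.reshape(scan_shape)
```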

From these results, the apparent sizes of the fluorescent beads and of the RBCs can be evaluated. As the former (1 µm diameter) are smaller than the expected spatial resolution ($1.7\pm 0.1$ µm for the larger width, see Sec. 3.1), single-bead images are fitted with a 2D Gaussian function. This yields an estimated mean bead FWHM of $2.2$ µm, in qualitative agreement with the expected value, although slightly larger. This difference could be attributed to a small defocus of the beads relative to the calibration plane, which was set for the RBCs. The experimental FWHM of the RBCs can also be measured from this experiment; it yields diameters varying from 5.8 to 6.1 µm, in agreement with the expected size of RBCs.
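If both the focal spot and the bead are approximated as Gaussians, their FWHMs add roughly in quadrature, which gives a quick plausibility check on the measured value (an approximation for illustration, not the analysis performed in the paper):

```python
import numpy as np

spot_fwhm_um = 1.7   # larger width of the focal spot (Sec. 3.1)
bead_fwhm_um = 1.0   # nominal bead diameter

# Gaussian-Gaussian convolution: FWHMs add in quadrature
apparent_um = np.hypot(spot_fwhm_um, bead_fwhm_um)
print(f"expected apparent bead FWHM ~ {apparent_um:.1f} um")  # ~2.0 um vs 2.2 um measured
```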

4. Conclusion

A microendoscope with a footprint of only 250$\times$125 µm$^2$, performing both photoacoustic and fluorescence imaging with a single-shot illumination per pixel, has been demonstrated. It relies on a MMF to guide the excitation light and collect the fluorescence signal, while an adjacent fiber-optic ultrasound sensor detects the photoacoustic waves. Using the transmission matrix approach and a fast LC-SLM, the output field pattern can be controlled through wavefront shaping. We exploit this method to scan a focused spot in an imaging plane located 100 µm beyond the MMF end face. This configuration necessarily induces inhomogeneities of the spot size and shape across the illuminated region. Nevertheless, we managed to focus into a $1.7\pm 0.3$ µm FWHM spot throughout a circular area of $\sim$65 µm diameter, allowing raster-scan imaging. For each focus point, a single-shot laser excitation is enough to provide fluorescence and photoacoustic signals with a large SNR. We demonstrate imaging in both modalities by detecting multiple 1 µm fluorescent beads and $\sim$6 µm red blood cells, which could both be resolved individually.

To our knowledge, these results report the first realization of single-shot-per-pixel photoacoustic imaging in minimally invasive endoscopy. This is obtained by combining wavefront shaping techniques and a highly sensitive fiber-optic ultrasound sensor. Single-shot acquisition brings a substantial improvement in terms of acquisition time. In the present setup, the latter was limited by the acquisition system (USB oscilloscope), which can sustain a maximum trigger rate of about 60 Hz, corresponding to the acquisition of one image in about 30 seconds. This system already exhibits the fastest acquisition rate attained in photoacoustic endoscopy, compared with the current literature ($\sim$3 Hz in Ref. [39], and 22 Hz in Ref. [31]). A better acquisition card would allow performing similar imaging at 380 Hz (the LC-SLM refresh rate). At such a speed, one image would be obtained in 5 seconds. A similar approach could also be implemented with a DMD, which offers a 20 kHz switching rate. The acquisition would then be limited by the laser repetition rate only, which is 7 kHz in the present setup, leading to 4 images per second. A comparison of the images obtained with both modulator devices would also be of interest. An image with fewer points ($\sim$1800 in the present experiment), obtained by reducing the scanned area and/or the image resolution, would obviously offer an even faster frame rate if required by the targeted application.
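The frame times discussed above follow directly from the number of scan points and the per-pixel rate; the arithmetic is reproduced below for clarity.

```python
n_pixels = 1800   # scan positions per image in the present experiment
for rate_hz, limiting_element in [(60, "USB oscilloscope trigger rate"),
                                  (380, "LC-SLM refresh rate"),
                                  (7000, "laser repetition rate (DMD case)")]:
    print(f"{limiting_element}: {n_pixels / rate_hz:.2f} s per image")
# -> ~30 s, ~4.7 s and ~0.26 s per image, i.e., up to ~4 images per second
```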

This type of ultrathin endoscope, capable of fast photoacoustic and fluorescence imaging, is promising for in vivo applications such as deep-brain functional imaging, as it enables simultaneous, real-time monitoring of vascular dynamics (through the photoacoustic signal) and of neuronal activity (through fluorescence reporters).

Funding

H2020 European Research Council (ERC Consolidator Grant 681514, ERC Advanced Grant 741149); H2020 Marie Skłodowska-Curie Actions (DARWIN, 750420); Engineering and Physical Sciences Research Council (NS/A000050/1).

Acknowledgments

This project has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (ERC Consolidator Grant 681514 and ERC Advanced Grant 741149) and Wellcome/EPSRC Centre for Surgical and Interventional Sciences (Ref. NS/A000050/1). A.M. Caravaca-Aguirre was supported by the Marie Skłodowska-Curie Individual Fellowship (Project DARWIN, Grant Agreement No. 750420).

Disclosures

The authors declare no conflicts of interest.

References

1. L. V. Wang and J. Yao, “A practical guide to photoacoustic tomography in the life sciences,” Nat. Methods 13(8), 627–638 (2016). [CrossRef]  

2. J.-M. Yang, R. Chen, C. Favazza, J. Yao, C. Li, Z. Hu, Q. Zhou, K. K. Shung, and L. V. Wang, “A 2.5-mm diameter probe for photoacoustic and ultrasonic endoscopy,” Opt. Express 20(21), 23944–23953 (2012). [CrossRef]  

3. J.-M. Yang, C. Li, R. Chen, Q. Zhou, K. K. Shung, and L. V. Wang, “Catheter-based photoacoustic endoscope,” J. Biomed. Opt. 19(6), 066001 (2014). [CrossRef]  

4. Y. Yang, X. Li, T. Wang, P. D. Kumavor, A. Aguirre, K. K. Shung, Q. Zhou, M. Sanders, M. Brewer, and Q. Zhu, “Integrated optical coherence tomography, ultrasound and photoacoustic imaging for ovarian tissue characterization,” Biomed. Opt. Express 2(9), 2551–2561 (2011). [CrossRef]  

5. X. Ji, K. Xiong, S. Yang, and D. Xing, “Intravascular confocal photoacoustic endoscope with dual-element ultrasonic transducer,” Opt. Express 23(7), 9130–9136 (2015). [CrossRef]  

6. J. Hui, Y. Cao, Y. Zhang, A. Kole, P. Wang, G. Yu, G. Eakins, M. Sturek, W. Chen, and J.-X. Cheng, “Real-time intravascular photoacoustic-ultrasound imaging of lipid-laden plaque in human coronary artery at 16 frames per second,” Sci. Rep. 7(1), 1417 (2017). [CrossRef]  

7. P. Hajireza, W. Shi, and R. J. Zemp, “Label-free in vivo fiber-based optical-resolution photoacoustic microscopy,” Opt. Lett. 36(20), 4107 (2011). [CrossRef]  

8. P. Hajireza, W. Shi, and R. Zemp, “Label-free in vivo GRIN-lens optical resolution photoacoustic micro-endoscopy,” Laser Phys. Lett. 10(5), 055603 (2013). [CrossRef]  

9. X. Bai, X. Gong, W. Hau, R. Lin, J. Zheng, C. Liu, C. Zeng, X. Zou, H. Zheng, and L. Song, “Intravascular optical-resolution photoacoustic tomography with a 1.1 mm diameter catheter,” PLoS One 9(3), e92463 (2014). [CrossRef]  

10. B. Dong, S. Chen, Z. Zhang, C. Sun, and H. F. Zhang, “Photoacoustic probe using a microring resonator ultrasonic sensor for endoscopic applications,” Opt. Lett. 39(15), 4372–4375 (2014). [CrossRef]  

11. H. Guo, C. Song, H. Xie, and L. Xi, “Photoacoustic endomicroscopy based on a MEMS scanning mirror,” Opt. Lett. 42(22), 4615–4618 (2017). [CrossRef]  

12. N. Liu, S. Yang, and D. Xing, “Photoacoustic and hyperspectral dual-modality endoscope,” Opt. Lett. 43(1), 138–141 (2018). [CrossRef]  

13. L. Xi, C. Duan, H. Xie, and H. Jiang, “Miniature probe combining optical-resolution photoacoustic microscopy and optical coherence tomography for in vivo microcirculation study,” Appl. Opt. 52(9), 1928–1931 (2013). [CrossRef]  

14. J.-M. Yang, C. Li, R. Chen, B. Rao, J. Yao, C.-H. Yeh, A. Danielli, K. Maslov, Q. Zhou, K. K. Shung, and L. V. Wang, “Optical-resolution photoacoustic endomicroscopy in vivo,” Biomed. Opt. Express 6(3), 918–932 (2015). [CrossRef]  

15. Y. Choi, C. Yoon, M. Kim, T. D. Yang, C. Fang-Yen, R. R. Dasari, K. J. Lee, and W. Choi, “Scanner-free and wide-field endoscopic imaging by using a single multimode optical fiber,” Phys. Rev. Lett. 109(20), 203901 (2012). [CrossRef]  

16. R. N. Mahalati, R. Y. Gu, and J. M. Kahn, “Resolution limits for imaging through multi-mode fiber,” Opt. Express 21(2), 1656–1668 (2013). [CrossRef]  

17. D. Loterie, S. Farahi, I. Papadopoulos, A. Goy, D. Psaltis, and C. Moser, “Digital confocal microscopy through a multimode fiber,” Opt. Express 23(18), 23845–23858 (2015). [CrossRef]  

18. I. T. Leite, S. Turtaev, X. Jiang, M. Šiler, A. Cuschieri, P. S. J. Russell, and T. Čižmár, “Three-dimensional holographic optical manipulation through a high-numerical-aperture soft-glass multimode fibre,” Nat. Photonics 12(1), 33–39 (2018). [CrossRef]  

19. T. Čižmár and K. Dholakia, “Exploiting multimode waveguides for pure fibre-based imaging,” Nat. Commun. 3(1), 1027 (2012). [CrossRef]  

20. I. N. Papadopoulos, S. Farahi, C. Moser, and D. Psaltis, “High-resolution, lensless endoscope based on digital scanning through a multimode optical fiber,” Biomed. Opt. Express 4(2), 260–270 (2013). [CrossRef]  

21. A. M. Caravaca-Aguirre, E. Niv, D. B. Conkey, and R. Piestun, “Real-time resilient focusing through a bending multimode fiber,” Opt. Express 21(10), 12881–12887 (2013). [CrossRef]  

22. A. M. Caravaca-Aguirre and R. Piestun, “Single multimode fiber endoscope,” Opt. Express 25(3), 1656 (2017). [CrossRef]  

23. S. Ohayon, A. M. Caravaca-Aguirre, R. Piestun, and J. J. Di Carlo, “Minimally invasive multimode optical fiber microendoscope for deep brain fluorescence imaging,” Biomed. Opt. Express 9(4), 1492–1509 (2018). [CrossRef]  

24. S. Turtaev, I. T. Leite, T. Altwegg-Boussac, J. M. P. Pakan, N. L. Rochefort, and T. Čižmár, “High-fidelity multimode fibre-based endoscopy for deep brain in vivo imaging,” Light: Sci. Appl. 7(1), 92 (2018). [CrossRef]  

25. S. A. Vasquez-Lopez, R. Turcotte, V. Koren, M. Plöschner, Z. Padamsey, M. J. Booth, T. Čižmár, and N. J. Emptage, “Subcellular spatial resolution achieved for deep-brain imaging in vivo using a minimally invasive multimode fiber,” Light: Sci. Appl. 7(1), 110 (2018). [CrossRef]  

26. I. Gusachenko, M. Chen, and K. Dholakia, “Raman imaging through a single multimode fibre,” Opt. Express 25(12), 13782–13798 (2017). [CrossRef]  

27. E. E. Morales-Delgado, D. Psaltis, and C. Moser, “Two-photon imaging through a multimode fiber,” Opt. Express 23(25), 32158–32170 (2015). [CrossRef]  

28. M. Plöschner, V. Kollárová, Z. Dostál, J. Nylk, T. Barton-Owen, D. E. K. Ferrier, R. Chmelík, K. Dholakia, and T. Čižmár, “Multimode fibre light-sheet microscopy at the tip of a needle,” Sci. Rep. 5(1), 18050 (2015). [CrossRef]  

29. I. N. Papadopoulos, O. Simandoux, S. Farahi, J.-P. Huignard, E. Bossy, D. Psaltis, and C. Moser, “Optical-resolution photoacoustic microscopy by use of a multimode fiber,” Appl. Phys. Lett. 102(21), 211106 (2013). [CrossRef]  

30. O. Simandoux, N. Stasio, J. Gateau, J.-P. Huignard, C. Moser, D. Psaltis, and E. Bossy, “Optical-resolution photoacoustic imaging through thick tissue with a thin capillary as a dual optical-in acoustic-out waveguide,” Appl. Phys. Lett. 106(9), 094102 (2015). [CrossRef]  

31. A. M. Caravaca-Aguirre, S. Singh, S. Labouesse, M. V. Baratta, R. Piestun, and E. Bossy, “Hybrid photoacoustic-fluorescence microendoscopy through a multimode fiber using speckle illumination,” APL Photonics 4(9), 096103 (2019). [CrossRef]  

32. I. M. Vellekoop and A. P. Mosk, “Focusing coherent light through opaque strongly scattering media,” Opt. Lett. 32(16), 2309–2311 (2007). [CrossRef]  

33. R. Di Leonardo and S. Bianchi, “Hologram transmission through multi-mode optical fibers,” Opt. Express 19(1), 247–254 (2011). [CrossRef]  

34. T. Čižmár and K. Dholakia, “Shaping the light transmission through a multimode optical fibre: complex transformation analysis and applications in biophotonics,” Opt. Express 19(20), 18871–18884 (2011). [CrossRef]  

35. R. Y. Gu, R. N. Mahalati, and J. M. Kahn, “Noise-reduction algorithms for optimization-based imaging through multi-mode fiber,” Opt. Express 22(12), 15118–15132 (2014). [CrossRef]  

36. I. N. Papadopoulos, S. Farahi, C. Moser, and D. Psaltis, “Focusing and scanning light through a multimode optical fiber using digital phase conjugation,” Opt. Express 20(10), 10583 (2012). [CrossRef]  

37. M. Plöschner, B. Straka, K. Dholakia, and T. Čižmár, “GPU accelerated toolbox for real-time beam-shaping in multimode fibres,” Opt. Express 22(3), 2933–2947 (2014). [CrossRef]  

38. S. Turtaev, I. T. Leite, K. J. Mitchell, M. J. Padgett, D. B. Phillips, and T. Čižmár, “Comparison of nematic liquid-crystal and DMD based spatial light modulation in complex photonics,” Opt. Express 25(24), 29874–29884 (2017). [CrossRef]  

39. N. Stasio, A. Shibukawa, I. N. Papadopoulos, S. Farahi, O. Simandoux, J.-P. Huignard, E. Bossy, C. Moser, and D. Psaltis, “Towards new applications using capillary waveguides,” Biomed. Opt. Express 6(12), 4619–4631 (2015). [CrossRef]  

40. D. E. Boonzajer Flaes, J. Stopka, S. Turtaev, J. F. de Boer, T. Tyc, and T. Čižmár, “Robustness of light-transport processes to bending deformations in graded-index multimode waveguides,” Phys. Rev. Lett. 120(23), 233901 (2018). [CrossRef]  

41. J. A. Guggenheim, J. Li, T. J. Allen, R. J. Colchester, S. Noimark, O. Ogunlade, I. P. Parkin, I. Papakonstantinou, A. E. Desjardins, E. Z. Zhang, and P. C. Beard, “Ultrasensitive plano-concave optical microresonators for ultrasound sensing,” Nat. Photonics 11(11), 714–719 (2017). [CrossRef]  

42. P. Morris, A. Hurell, A. Shaw, E. Z. Zhang, and P. C. Beard, “A Fabry–Pérot fiber-optic ultrasonic hydrophone for the simultaneous measurement of temperature and acoustic pressure,” J. Acoust. Soc. Am. 125(6), 3611–3622 (2009). [CrossRef]  

43. E. Cuche and C. Depeursinge, “Spatial filtering for zero-order and twin-image elimination in digital off-axis holography,” Appl. Opt. 39(23), 4070–4075 (2000). [CrossRef]  

44. S. M. Popoff, G. Lerosey, M. Fink, A. C. Boccara, and S. Gigan, “Measuring the transmission matrix in optics: An approach to the study and control of light propagation in disordered media,” Phys. Rev. Lett. 104(10), 100601 (2010). [CrossRef]  
