Optica Publishing Group

Development of a blurred spectral images restoration technology for CTIS imaging spectrometer

Open Access

Abstract

Platform vibration blurs the spectral images when a computed tomographic imaging spectrometer (CTIS) is operated on an unstable platform, especially in space-borne or ship-based applications. The resulting blur-induced spectral distortion greatly degrades the accuracy of target classification. However, because the image motion kernel is not readily available, the degraded images are difficult to restore accurately. To solve for the motion function, this paper proposes a motion detection method that uses short-exposure images from the zeroth order beam of the 2D grating. With the accurate motion kernel thus obtained, the blurred spectral images can be reliably restored and the spectral accuracy improved. Laboratory experiments demonstrate the robust spectral restoration capability of this technique.

© 2016 Optical Society of America

1. Introduction

The computed tomographic imaging spectrometer (CTIS) has recently been applied successfully to astronomical remote sensing [1] and is becoming increasingly important. With its capability of snapshot hyperspectral imaging, CTIS also shows great potential for space applications: from a single snapshot it can acquire the full hyperspectral cube of the observed region. However, the attitude of a spaceborne platform is not stable due to satellite jitter, which blurs the spectral images.

To increase the signal-to-noise ratio (SNR) of the hyperspectral datacube, a CTIS spectrometer often takes several hundred milliseconds to capture a single snapshot. On-orbit data from the Landsat satellite in 1993 show three main disturbances: solar array motion (0.005° rms at 1 Hz) and the reaction-wheel fundamental and second harmonic (0.0002° rms at 100 Hz and 3 × 10−5° rms at 200 Hz, respectively) [2]. The image blur kernel is therefore likely to be a one-dimensional Gaussian pattern, whose direction is mainly determined by the position of the solar array on the satellite [3]. Although the long-term attitude stability of many satellite platforms is higher, often on the order of 0.001°, a stable attitude remains important for a CTIS spectrometer observing nadir objects on the Earth's surface. In general, a high orbit suits a staring imaging system because of the low relative motion between the satellite and the ground. For an application with a ground sample distance (GSD) of 10 meters in geosynchronous orbit, image blur of several pixels will appear; in lower orbits the scene moves much faster, producing image motion of tens of pixels.

Figure 1 shows two images from a CTIS imaging spectrometer with a color CMOS sensor, captured while the object is stationary and moving, respectively. The spectra of the same region in the two images are also presented.


Fig. 1 Spectral distortion resulting from platform jitter. (a) The sharp zeroth order image of a painting captured on a motionless platform. (b) The blurred zeroth order image of the painting captured on a moving platform. (c) The true spectrum and blurred spectrum of the pixel at coordinate (79, 100).


Figure 1 shows that blurring of the spectral images degrades not only the image resolution but also the spectral accuracy, owing to the greatly increased spectral mixing in the blurred images. Experiments show that image motion of more than 10 pixels produces noticeable spectral distortion. Restoration of blurred spectral images is therefore essential for a CTIS spectrometer to acquire accurate spectra in practical applications.

Many classic image restoration algorithms exist, such as iterative methods [4], Wiener filtering [5], and inverse filtering [6]. However, these effective methods usually require knowledge of the image motion kernel. An attitude sensor on the satellite can measure attitude-induced image motion, but the precision of star sensors or gyro systems is insufficient for high-orbit applications, where the instantaneous field of view of the spectrometer is less than 3 × 10−5 degree. Furthermore, an attitude sensor cannot measure the relative image motion caused by Earth rotation and satellite movement. Restoring the blurred spectral images with conventional methods therefore becomes difficult.

The objective of this work is to restore blurred CTIS spectral images without external motion sensing. Section 2 describes the principle of CTIS and the properties of its diffraction pattern, particularly the zeroth order beam. Section 3 presents a motion detection technique based on the zeroth order beam. Section 4 reports experiments demonstrating that the method enables accurate and robust spectral image restoration.

2. CTIS Principle and diffraction pattern

The principle of CTIS is shown in Fig. 2. The instrument consists of an objective lens, a field stop, a collimator, a disperser, a re-imaging lens, and a 2D focal plane array (FPA). The objective lens forms an image of the scene on the field stop. This intermediate image is then collimated, passed through the 2D disperser, and re-imaged onto the focal plane, where a CCD or CMOS detector is placed.


Fig. 2 (a) Optical layout of the CTIS imaging spectrometer. (b) The diffraction pattern of CTIS on the FPA.


The 2D disperser is usually either a pair of cosine gratings crossed at 90° or a computer-generated holographic (CGH) disperser. It diffracts several projections onto the image plane. At the center, the zeroth order diffraction beam forms a panchromatic image of the scene. The other diffraction orders form multiple monochromatic copies of the zeroth order image, whose wavelength varies with spatial position. A tomographic reconstruction procedure, of the kind commonly used in medical computed tomography, can then be applied to estimate the spectral data cube from the multiple projections. CTIS can thus determine the spectrum at each pixel of the scene from a single snapshot.

According to the principle of CTIS, the diffraction footprint has several noteworthy properties, as shown in Fig. 3.


Fig. 3 (a) The diffraction pattern of CTIS. (b) Image shift consistency and energy distribution among the different diffracted order beams.


Firstly, the projections of the other diffraction orders have the same field of view as the zeroth order image, since the monochromatic images in each order can be regarded as the zeroth order image shifted in an order-specific direction. Therefore, when platform jitter moves the scene, the zeroth order image undergoes the same motion as the other orders. Secondly, the panchromatic zeroth order image is the superposition of many monochromatic images. In an experiment with a white LED target, the gray level of the zeroth order image diffracted by the CGH was about 180, while that of the other orders was about 8; the energy of the zeroth order image is thus roughly 20 times higher. With a pair of crossed cosine gratings as the disperser, the ratio can exceed 100 (estimated roughly by assuming the diffraction efficiency of the zeroth order equals that of the other orders, for a CTIS with 5 nm spectral resolution over a 500 nm wavelength range). Thirdly, unlike the higher-order monochromatic images, the zeroth order image does not overlap the other orders, so image motion can be resolved without interference from them. Consequently, the zeroth order region is a natural short-exposure motion detection channel: using multiple sequential short-exposure zeroth order images, image motion can be measured accurately and reliably thanks to the high SNR.
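The rough "more than 100 times" estimate for crossed cosine gratings follows from a simple energy budget; the sketch below reproduces it under the stated assumptions (equal diffraction efficiency per order, 5 nm resolution over a 500 nm range):

```python
# Rough energy budget behind the ">100x" estimate for crossed cosine gratings.
# Assumes equal diffraction efficiency for every order, as stated in the text.
wavelength_range_nm = 500.0
spectral_resolution_nm = 5.0
n_bands = wavelength_range_nm / spectral_resolution_nm   # 100 spectral bands

# The zeroth order stacks all bands into one panchromatic image, while each
# higher order spreads the same light into n_bands monochromatic copies, so
# per pixel the zeroth order is about n_bands times brighter.
ratio = n_bands
print(ratio)  # 100.0
```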

3. Motion detection method

Benefiting from these unique characteristics of the zeroth order beam, a novel blur-free CTIS imaging spectrometer based on a customized detector is proposed and schematically illustrated in Fig. 4(a). The optical configuration is the same as the traditional layout in Fig. 2, but here the FPA is one in which different areas can operate at different frame rates. As shown in Fig. 4(a), with a suitably designed FPA, the zeroth order region and the higher-order region can operate at high and low frame rates, respectively, over the same period. The motion kernel of the blurred spectral image, which is reconstructed from the long-exposure diffraction pattern, can then be resolved from the multiple high-frame-rate zeroth order images.


Fig. 4 (a) Optical layout of the blur-free CTIS imaging spectrometer. (b) The architecture of the customized CMOS detector with independent exposure time at different regions.


According to existing research, designing such an FPA is feasible. A CMOS sensor in which each pixel has an independent exposure time [7,8] has already been developed, based mainly on an individual-pixel-reset architecture. Each pixel in a conventional CMOS sensor includes a photogate, a reset transistor, and readout circuitry. Because all reset gates in a row are connected together, the entire row is reset simultaneously; the integration time of a pixel is set by the reset frequency, so all pixels in a row share the same integration time. As shown in Fig. 4(b), by placing a second transistor in series with the row reset transistor, a pixel is reset only when both RRST (row reset signal) and CRST (column reset signal) are active. For instance, if RRST1 is always active, the integration time of the pixels in the first row is determined by the reset frequency of the CRST signals, which can differ from column to column; pixels in the same row can therefore have different integration times, and likewise for pixels in the same column. With appropriate RRST and CRST settings, the integration times of different rectangular regions of the sensor can be set independently. In addition, high-frame-rate CMOS detectors with a windowing readout function, which can read out different regions at different frame rates, are commercially available; this is realized by setting the RS (row select readout) and column select readout signals in Fig. 4(b) appropriately.

Combining these two technologies, a CMOS sensor can be realized whose center area uses a short exposure time with high-frame-rate readout while the remaining area uses a longer exposure time with lower-frame-rate readout. In the same period, it can thus acquire a series of short-exposure images of the zeroth order area together with one long-exposure image of the other diffraction orders. Our method uses the short-exposure zeroth order image sequence to estimate the image motion kernel caused by platform jitter and motion, while the zeroth order and other-order images are used together to reconstruct the blurred spectral data cube. The high energy of the panchromatic image ensures that the short-exposure zeroth order channel has sufficient SNR for motion detection. From the displacements between consecutive zeroth order images, the blur kernel for deblurring the long-exposure spectral image can be constructed.

The customized-detector scheme has the advantages of a small data volume and fast acquisition. However, a detector with an independently settable exposure time per pixel is not easy to obtain, so it is worth considering a commercial CMOS sensor as an alternative. The following discussion is based on a commercial off-the-shelf global shutter CMOS detector.

As shown in Fig. 5, we assume the global shutter CMOS detector operates in integrate-while-read mode with a short exposure time Δt. The imaging spectrometer then captures N frames on the moving platform. After extracting the sequential zeroth order sub-images from the N frames, their displacements can be computed with classical image registration algorithms such as phase correlation [9], mutual information [10], and joint transform correlation (JTC) [11].
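As an illustration of this registration step, integer-pixel phase correlation [9], one of the methods listed above, can be sketched with NumPy alone. This is a minimal sketch: sub-pixel refinement and the JTC method actually used later in the paper are omitted, and the shift value below is synthetic.

```python
import numpy as np

def phase_correlation_shift(ref, img):
    """Estimate the integer-pixel shift of `img` relative to `ref`
    from the phase of the cross-power spectrum."""
    F_ref = np.fft.fft2(ref)
    F_img = np.fft.fft2(img)
    cross = F_img * np.conj(F_ref)
    cross /= np.abs(cross) + 1e-12            # keep only the phase
    corr = np.real(np.fft.ifft2(cross))       # peaks at the displacement
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peak coordinates to signed shifts
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

# Synthetic check: shift a random "zeroth order sub-image" by (3, -5) pixels
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
img = np.roll(ref, (3, -5), axis=(0, 1))
print(phase_correlation_shift(ref, img))  # (3, -5)
```

In practice each of the N zeroth order sub-images would be registered against the first frame in this way to obtain the displacement sequence.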


Fig. 5 (a) The image preprocessing for motion detection and image restoration. (b) The pixel schematic of a CMOS sensor. (c) The mode of integration while read of CMOS detector. (d) The relationship between adding multiple short-exposure-time frames and a long-exposure-time frame.


On the other hand, the N sequential original images are summed to form a blurred high-energy image. Figure 5(b) shows the pixel schematic of a CMOS sensor [8]. In global shutter mode, all pixels are exposed simultaneously. At the end of the integration time, a TG pulse triggers the charge transfer for all rows at once, moving the charge from the photodiodes to the floating diffusion nodes (FD). Using the RS (row select readout) and column select readout signals, the FD voltages are read out. Once the charge transfer is complete, all photodiodes can begin the next exposure cycle. Moreover, if the voltage readout time of all pixels is shorter than the exposure time Δt, the next frame can be exposed while the previous one is being read out, as shown in Fig. 5(c). In this mode, the gap between two frames is determined only by the charge transfer time, which in some high-frame-rate CMOS sensors is below 2 μs [12], negligible compared with exposure times of several milliseconds. The charge integration process is shown in Fig. 5(d), where e(t) is the rate of charge generation in a given pixel at time t. The charge produced over a long exposure from t1 to t6 equals $\int_{t_1}^{t_6} e(t)\,dt$. If this long exposure is divided into multiple short exposures separated only by the negligible charge transfer time, the sum of the charges integrated over the short exposures approximately equals the charge integrated from t1 to t6. The equation in Fig. 5(d) therefore expresses the equivalence between summing multiple short-exposure frames and a single long-exposure frame. When the short exposure time is 10 ms and the charge transfer time is 2 μs, the relative error of this equivalent method is less than 0.02% if detector noise is ignored.
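The 0.02% bound can be checked directly from the stated timing. The frame count below is an illustrative assumption (it matches the 50-frame sequence used in the experiment), and a roughly constant photon flux is assumed:

```python
# Relative error of approximating one long exposure by the sum of N short
# exposures, when each inter-frame charge-transfer gap loses signal.
exposure_s = 10e-3      # short exposure time, 10 ms
gap_s = 2e-6            # charge-transfer (inter-frame) gap, 2 us
n_frames = 50           # emulating a ~500 ms long exposure

long_total = n_frames * exposure_s + (n_frames - 1) * gap_s  # elapsed time
collected = n_frames * exposure_s                            # time integrated
relative_error = 1 - collected / long_total
print(f"{relative_error:.4%}")  # just under the 0.02% bound quoted above
```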

Based on the above principles, the equivalent exposure time of the formed blurred image is Δt × N. The blur kernel of the high-energy blurred image can then be determined from the N image displacements. Deconvolving the high-energy blurred image with the constructed kernel yields a sharper diffraction pattern, from which the accurate spectral data cube can finally be rebuilt. To reduce data size and increase transmission speed, the sub-image extraction and image accumulation can be processed on board with a DSP or FPGA.

4. Experiment Setup

A motion-blur-free CTIS imaging spectrometer based on a high-frame-rate CMOS sensor has been built; its experimental configuration is illustrated in Fig. 6.


Fig. 6 (a) The prototype of blur free CTIS system. (b) The experiment platform to investigate spectral image restoration.


The CTIS system is mounted above a moving platform at an object distance of about 0.4 m. Two stepper motors drive the platform in two directions. A target with nonuniform spectral characteristics is carried by the platform to simulate motion-induced spectral image blur. The route and speed of the platform are programmable, allowing various motion kernels to be produced. The detector is a CMOS sensor (Andor Neo 5.5) with a 2560 × 2160 pixel format, a 6.5 × 6.5 μm pitch, and a high frame rate of 100 Hz. The center area onto which the zeroth order beam is imaged is constrained by the field stop to 324 × 235 pixels. The prototype's focal length is 33 mm, with a field of view of 4.8°. The 2D dispersing component of our CTIS is a computer-generated holographic disperser that produces 23 diffraction orders; its pattern, recorded with a color camera, is shown in Fig. 7.


Fig. 7 The diffraction pattern of our CTIS recorded by a color camera.


The choice of demonstration target is flexible: multicolor book covers or oil paintings illuminated by a uniform LED array are suitable, as is a large LCD screen in a tablet, which allows user-defined images and flexible brightness adjustment. In our experiment, we placed an iPad on the platform and preset the motion route. A sequence of 50 frames with a short exposure time of 10 ms was captured successively; a selection is shown in Fig. 8. The CMOS detector operated in integrate-while-read mode with a negligible inter-frame gap of 2 μs, so a motion-blurred image with a long exposure time of 500 ms can be formed by summing the 50 short-exposure frames. This accumulated image serves as the blurred image for restoration. We also captured one 500 ms long-exposure image with the platform motionless, to generate the true spectral data for evaluating the restoration performance.


Fig. 8 (a)-(f) The zeroth order images of the 1st, 10th, 20th, 30th, 40th, and 50th frames, respectively. (These images are cropped to enlarge the target.)


To calculate the displacement between adjacent images, we apply the joint transform correlation method, which is widely used in image registration for its high accuracy. JTC also offers high-speed processing when implemented on a DSP or with an optical joint transform correlator. The sequential image displacements in the X and Y directions calculated by JTC are shown in Fig. 9(a), with the first image as the reference.


Fig. 9 (a) The sequential image displacements in X and Y directions. (b) The constructed motion kernel using 2D displacements. (c) The zeroth order image area of combined diffraction pattern using 50 short-exposure-time images.


After obtaining the sequential image displacements, several methods can be used to construct the long-exposure blur kernel [13]. The motion kernel h(x, y) is calculated as

h(x,y) = \sum_{i=1}^{N-1} \frac{1}{\Delta t \times v_i(x,y)}    (1)

where Δt is the short exposure time and v_i(x, y) is the velocity vector between two frames. When the exposure time is short, Δt × v_i(x, y) equals the inter-frame displacement vector. The blur kernel calculated with Eq. (1) is shown in Fig. 9(b). Meanwhile, the 50 short-exposure images are summed to form a blurred diffraction pattern, whose zeroth order area is presented in Fig. 9(c).
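A possible discrete implementation of Eq. (1) is sketched below. The kernel size and the drift trajectory are illustrative assumptions; each trajectory point receives a weight proportional to its dwell time, i.e. inversely proportional to the local speed, as in Eq. (1):

```python
import numpy as np

def motion_kernel(displacements, size=15):
    """Build a normalized blur kernel from per-frame 2D displacements
    (in pixels, relative to the first frame), following Eq. (1): the
    weight at each trajectory point is inversely proportional to the
    local speed (proportional to the dwell time there)."""
    h = np.zeros((size, size))
    c = size // 2
    d = np.asarray(displacements, dtype=float)
    v = np.diff(d, axis=0)                 # inter-frame displacement vectors
    speed = np.linalg.norm(v, axis=1)      # ~ dt * |v_i|
    for (dy, dx), s in zip(d[1:], speed):
        y, x = c + int(round(dy)), c + int(round(dx))
        if 0 <= y < size and 0 <= x < size:
            h[y, x] += 1.0 / max(s, 1e-6)  # dwell time ~ 1 / speed
    return h / h.sum()                     # normalize to unit energy

# Hypothetical diagonal drift of ~0.5 px/frame over 10 frames
disp = [(0.5 * i, 0.5 * i) for i in range(10)]
k = motion_kernel(disp)
```

In the actual system the displacement list would come from the JTC registration of the 50 zeroth order sub-images.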

Many algorithms can restore the diffraction pattern image given the PSF, such as inverse filtering, Wiener filtering, and iterative methods. We use an iterative method to restore the accumulated blurred diffraction pattern with the motion kernel, implemented in MATLAB with the Lucy-Richardson algorithm for convenience. The threshold deviation for suppressing damping is set to 0.01, with all other parameters at their default values; after 20 iterations the result converges. The restored zeroth order area is shown in Fig. 10(c), alongside the motionless and blurred zeroth order images in Fig. 10.
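The restoration step can be reproduced outside MATLAB. The sketch below is a plain Richardson-Lucy iteration in NumPy on synthetic data, not the deconvlucy call used in the paper; the damping threshold mentioned above and edge handling are omitted for brevity:

```python
import numpy as np

def psf_to_otf(psf, shape):
    """Zero-pad the PSF to the image shape and circularly shift its center
    to the origin, so FFT multiplication implements circular convolution."""
    pad = np.zeros(shape)
    pad[:psf.shape[0], :psf.shape[1]] = psf
    pad = np.roll(pad, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
    return np.fft.fft2(pad)

def richardson_lucy(blurred, psf, n_iter=20, eps=1e-12):
    """Plain Richardson-Lucy deconvolution: est <- est * H^T(b / (H est))."""
    otf = psf_to_otf(psf / psf.sum(), blurred.shape)
    est = np.maximum(blurred, eps).astype(float)
    for _ in range(n_iter):
        conv = np.real(np.fft.ifft2(np.fft.fft2(est) * otf))
        ratio = blurred / np.maximum(conv, eps)
        est = est * np.real(np.fft.ifft2(np.fft.fft2(ratio) * np.conj(otf)))
    return est

# Demo on synthetic data: blur a bright square with a 3x3 box PSF, restore it.
img = np.zeros((32, 32)); img[10:20, 12:22] = 1.0
psf = np.ones((3, 3)) / 9.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * psf_to_otf(psf, img.shape)))
restored = richardson_lucy(blurred, psf, n_iter=30)
```

In the paper's pipeline, `blurred` would be the accumulated diffraction pattern and `psf` the motion kernel constructed from the displacement sequence.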


Fig. 10 (a) The motionless zeroth order image. (b) The blurred zeroth order image. (c) The restored zeroth order image. (The ROI of the target is marked red and blue in these images)


The recovered image clearly shows a great improvement in resolution, but the improvement in spectral accuracy is more important. For this purpose, we reconstruct the data cube from the recovered diffraction pattern using a tomographic reconstruction method [14]. The spectral data cube of the long-exposure motionless image is also reconstructed and regarded as the true spectrum. Figure 11 compares the true, blurred, and recovered spectra at two ROIs, at coordinates (144, 115) and (140, 90). In Fig. 11(a), the blurred spectrum of the red object is distorted by energy from nearby blue pixels, so the restored spectrum improves most in the red and blue bands. Similarly, in Fig. 11(b) the spectrum of the green object is distorted by nearby red pixels, so the restored spectrum improves most in the red and green bands.


Fig. 11 A comparison between the original spectrum, blurred spectrum and restored spectrum. (a) The coordinate of ROI is chosen at (144, 115). (b) The coordinate of ROI is chosen at (140, 90).


The comparison shows that the true and recovered spectra agree closely in both curve shape and gray level, confirming that our method improves spectral accuracy. To investigate the influence of displacement accuracy on spectral restoration, we also added random artificial errors with a standard deviation of 0.1 pixel to the calculated displacements to generate a new motion kernel, and restored the blurred diffraction pattern with this kernel. After spectral reconstruction, the average gray-level deviation between the two restored spectra was less than 2.5. Since the motion detection accuracy of the JTC method is better than 0.1 pixel [15], this indicates the robustness of our spectral restoration method.
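The error-injection part of this robustness test can be sketched as follows. The linear trajectory and seed are illustrative; only the injected error statistics (zero-mean Gaussian, sigma = 0.1 px) match the experiment:

```python
import numpy as np

# Perturb a measured displacement sequence with Gaussian errors of
# standard deviation 0.1 pixel, as in the robustness test.
rng = np.random.default_rng(42)
disp = np.stack([np.linspace(0, 5, 50), np.linspace(0, 2, 50)], axis=1)
noisy = disp + rng.normal(0.0, 0.1, size=disp.shape)

# The injected RMS error should be close to the 0.1 px JTC accuracy bound.
rms = np.sqrt(np.mean((noisy - disp) ** 2))
print(f"RMS displacement error: {rms:.3f} px")
```

The perturbed sequence would then feed the kernel construction of Eq. (1), and the two restored spectra would be compared as described above.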

5. Conclusions

In this paper, we have shown that the proposed method for restoring the spectral images of a CTIS spectrometer using the zeroth order beam area is robust. We first analyzed the characteristics of the CTIS diffraction pattern and found the zeroth order beam to be a natural motion detection channel. We then described a high-speed CMOS configuration for detecting the motion of the zeroth order beam. Finally, spectral restoration experiments showed that the method substantially improves the reliability and accuracy of CTIS hyperspectral image restoration. We believe this study will promote further high-spatial-resolution applications of spaceborne CTIS.

Acknowledgments

The research leading to these results has received funding from the Zhejiang Provincial Natural Science Foundation of China (LQ15F050004) and the National Natural Science Foundation of China (61505192).

References and links

1. J. F. Scholl, E. K. Hege, D. G. O'Connell, and E. L. Dereniak, "Hyperspectral datacube estimations of binary stars with the computed tomographic imaging spectrometer (CTIS)," Proc. SPIE 7812, 78120I (2010).

2. W. L. Hayden, T. McCullough, A. Reth, and D. Kaufman, "Wide-band precision two-axis beam steerer tracking servo design and test results," Proc. SPIE 1866, 271–279 (1993).

3. B. Xue, X. Chen, and G. Ni, "Image quality degradation analysis induced by satellite platform harmonic vibration," Proc. SPIE 7513, 75130N (2009).

4. M. Elad and A. Feuer, "Restoration of a single superresolution image from several blurred, noisy, and undersampled measured images," IEEE Trans. Image Process. 6(12), 1646–1658 (1997).

5. T. P. Costello and W. B. Mikhael, "Efficient restoration of space-variant blurs from physical optics by sectioning with modified Wiener filtering," Digit. Signal Process. 13(1), 1–22 (2003).

6. M. A. Klompenhouwer and L. J. Velthoven, "Motion blur reduction for liquid crystal displays: motion-compensated inverse filtering," Proc. SPIE 5308, 690–699 (2004).

7. O. Yadid-Pecht and A. Belenky, "In-pixel autoexposure CMOS APS," IEEE J. Solid-State Circuits 38(8), 1425–1428 (2003).

8. O. Yadid-Pecht, B. Pain, C. Staller, C. Clark, and E. Fossum, "CMOS active pixel sensor star tracker with regional electronic shutter," IEEE J. Solid-State Circuits 32(2), 285–288 (1997).

9. H. Foroosh, J. B. Zerubia, and M. Berthod, "Extension of phase correlation to subpixel registration," IEEE Trans. Image Process. 11(3), 188–200 (2002).

10. J. P. W. Pluim, J. B. A. Maintz, and M. A. Viergever, "Mutual-information-based registration of medical images: a survey," IEEE Trans. Med. Imaging 22(8), 986–1004 (2003).

11. Y. Qian, Y. Li, J. Shao, and H. Miao, "Real-time image stabilization for arbitrary motion blurred image based on opto-electronic hybrid joint transform correlator," Opt. Express 19(11), 10762–10768 (2011).

12. Andor Company, "PIV Mode for Neo and Zyla," http://www.andor.com/learning-academy/piv-mode-for-neo-and-zyla-particle-imaging-velocimetry.

13. G. Boracchi and A. Foi, "Modeling the performance of image restoration from motion blur," IEEE Trans. Image Process. 21(8), 3502–3517 (2012).

14. H. C. Schau, R. E. Soulon, G. L. Hall, and D. J. Garrood, "Restoration, target recognition, and countermeasure removal using the computed tomographic imaging spectrometer (CTIS)," Proc. SPIE 4725, 521–539 (2002).

15. K. Janschek, V. Tchernykh, and S. Dyblenko, "Performance analysis of opto-mechatronic image stabilization for a compact space camera," Control Eng. Pract. 15(3), 333–347 (2007).
