
Toward omnidirectional and automated imaging system for measuring oceanic whitecap coverage

Open Access

Abstract

Accurate measurements of oceanic whitecap coverage from whitecap images are required for a better understanding of air–sea gas transfer and aerosol production processes. However, this is a challenging task because whitecap patches form immediately after a wave breaks and are spread over a wide area. The main challenges in designing a whitecap-imaging instrument are the small field of view of the camera lens, processing large numbers of images, recording data over long time periods, and deployment difficulties in stormy conditions. This paper describes the design of a novel high-resolution optical instrument for imaging oceanic whitecaps and the automated algorithm for processing the collected images. The instrument was successfully deployed in 2013 as part of the HiWINGS campaign in the North Atlantic Ocean. It uses a fish-eye camera lens to image the whitecaps over a wide (180°) angle of view.

Published by The Optical Society under the terms of the Creative Commons Attribution 4.0 License. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.

1. INTRODUCTION

Whitecaps formed on the ocean surface after the passage of breaking waves play a significant role in many marine and atmospheric processes, such as air–sea gas exchange [1–3], marine aerosol production [4–6], and the global radiation balance [7]. To investigate these processes, precise quantification of oceanic whitecaps is required, with adequate temporal and spatial resolution, from the first instants after a wave breaks. The most common way to quantify whitecaps is the whitecap coverage (or foam fraction), which is the percentage of the ocean surface area covered by whitecaps [2,8]. This whitecap coverage can be measured from ocean surface images.

In general, there are two ways to image oceanic whitecaps: satellite and marine imaging. The goal of satellite imaging is to measure the energy emitted or reflected by the whitecaps from space. Whitecaps cover more than 1% of the ocean surface [9], and the reflectance of recently formed whitecaps in the visible spectrum is approximately 10 times larger than that of the adjacent whitecap-free sea surface [9]. This strong contrast makes the whitecap radiance readily detectable by satellite sensors [9]. Wentz [10] used a microwave radiometer to measure the microwave emission coming from the ocean surface. He found that the microwave emissivity of the ocean surface is proportional to the whitecap coverage, and the retrieval of both wind speed and direction depends on this emissivity. Anguelova and Webster [2] developed a method to measure whitecap coverage on the global scale from satellite measurements. Satellite imaging techniques require sophisticated instruments and image-processing tools to analyze the data. The main drawback of IR and optical imaging is that they are highly susceptible to weather conditions; for instance, clouds can occlude some of the solar radiation. Microwave radiometry provides all-weather, day-and-night observations of sea state and whitecaps, and only extreme meteorological conditions limit its performance at some frequencies. The main disadvantage of microwave imaging is its large footprint (low resolution).

Marine imaging is based on using ships, research platforms, or buoys to collect whitecap images. For instance, Frouin et al. [11] measured the spectral reflectance of sea whitecaps at the Scripps Institution of Oceanography Pier using radiometers that detect visible and near-IR energy. Ships or buoys can be used for direct measurements in the open sea. However, very few measurements can be obtained from a ship in extremely severe conditions. Therefore, an autonomous buoy with onboard control and data-logging systems is a viable way to conduct measurements in such conditions [12]. Brooks et al. [12] used a still camera fixed to an autonomous free-floating spar buoy to verify breaking waves detected by the wave wires. The quality of the captured images is high; however, the limited view angle of the camera has important implications for the calculation of whitecap coverage. Many cameras with standard view angles would be required to image whitecap coverage over a wide area, but such a system requires additional hardware that substantially increases the economic and technical challenges, and it would be more difficult to analyze and combine the collected data. This paper describes the design of a novel instrument for whitecap imaging and the automated extraction algorithm used to analyze the collected images. The instrument is based on a fish-eye lens camera that has a large view angle (180°). This instrument could be further developed in the future to measure whitecap coverage from all directions.

2. OPTICAL INSTRUMENT FOR IMAGING WHITECAPS IN THE OCEAN

A. Overview

The optical instrument was fitted within a transparent waterproof housing at the top of the spar buoy (dome) as shown in Figs. 1 and 2. The instrument components were mounted on a steel disk to fix them firmly inside the dome and isolate them from other instruments. The camera was fixed using a ball-head adapter and positioned to look down at the ocean surface. The power to the individual components of the instrument was supplied by a power management board. The power input to this board was controlled by a programmable timer. Prior to deployment, the timer was programmed to synchronize the whitecap camera with a bubble camera that was mounted below the ocean surface. The instrument operates during daylight to conserve power and data storage.

Fig. 1. Hardware components of the whitecap instrument. (a) shows the JAI camera and the single-board computer assembly inside the buoy dome. The single-board assembly in (b) consists of the following components: power management board, single-board computer, two solid-state drives, and the LCD display module.

Fig. 2. Spar buoy during deployment in the ocean. The length of the spar buoy is 11 m. The whitecap-imaging instrument discussed in this paper is fixed inside the spar buoy.

The electronic components of both whitecap and bubble imaging instruments were chosen to be identical to simplify design and provide flexibility in deployment. The only difference between the instruments was in the camera lens and the light source. A fish-eye camera lens was coupled to the camera in the whitecap-imaging instrument to provide a large field of view. The whitecap camera does not need a light source for illumination, as it depends on the ambient light. A brief description of the electronic components is given in the next section, but full details about the design and choice of components can be found in Al-Lashi et al. [13].

B. Hardware Architecture

The imaging hardware consists of three main components, as shown in Fig. 3. These are the power management board to supply the necessary power to other components, the machine vision camera coupled with a fish-eye lens, and the single-board computer that controls image streaming to the storage devices.

Fig. 3. Block diagram of the whitecap-imaging instrument architecture. The light lines correspond to power transfer from the power management board to the other electronic components, the thick bold lines correspond to data transfer to and from the single-board computer, and the broken lines represent the single-board computer's control of other devices.

The power requirements of the individual components of the instrument differ. Therefore, a power management board was designed to provide separate isolated supplies from the sea-battery power source (24 V, 40 A h). The board consists of 24 V–12 V and 24 V–5 V DC–DC converters, a light-emitting diode (LED) driver to illuminate the backlight of the LCD module, four connectors supplying 5 and 12 V, a camera connector, and a power metal-oxide-semiconductor field-effect transistor (MOSFET) switch circuit that conserves power by powering down the camera when not in use. The camera and single-board computer are supplied with 12 and 5 V, respectively.

A JAI (BM-500 GE) machine vision camera was used to image the ocean surface. The camera has a 2/3-in. CCD sensor with 2048 × 2048 resolution and a 3.45 μm × 3.45 μm cell size, and it operates in progressive scan mode. The camera frame rate is 15 frames/s, and its minimum exposure time is 64 μs. A high-resolution 5-megapixel fish-eye lens (Fujinon FE185C057HA-1) is mounted on the camera.

Crucial M500 solid-state drives with 960 GB capacity were selected as data storage devices due to their robustness (vibration resistance), high read/write speed (3 Gb/s), small form factor (2.5 in.), and wide operating temperature range (0°C–70°C). The storage capacity required for an hour of camera recording at 15 frames/s with 2048 × 2048 image resolution is 226 GB. Therefore, two 960 GB M500 solid-state drives were used to extend the camera recording time to approximately 8.5 h.
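These figures are consistent with streaming uncompressed 8-bit monochrome frames; the stored pixel depth is an assumption here, since it is not stated in the text. A quick back-of-envelope check:

```cpp
#include <cstdio>

// Back-of-envelope check of the quoted storage requirement, assuming
// uncompressed 8-bit monochrome frames (the stored pixel depth is not
// stated in the text).
int main() {
    const double pixels = 2048.0 * 2048.0;  // pixels per frame
    const double bytesPerPixel = 1.0;       // assumed 8-bit mono
    const double fps = 15.0;

    double gbPerHour = pixels * bytesPerPixel * fps * 3600.0 / 1e9;
    double hours = 2.0 * 960.0 / gbPerHour; // two 960 GB drives

    std::printf("%.0f GB per hour, %.1f h on two drives\n",
                gbPerHour, hours);          // ~226 GB and ~8.5 h
    return 0;
}
```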

A single-board computer [Kontron pITX-SP 2.5 SBC (plus) with an Intel Atom Z530 processor] was used to control the operation of the JAI camera and save the captured images on the solid-state drives. It streams the images via a gigabit Ethernet interface and transfers them to the solid-state drives via Serial AT Attachment interfaces. It is characterized by a small form factor (104 mm × 78 mm), an operating temperature range of 0°C to 60°C, a passive cooling option, and a 1.6 GHz CPU clock. A passive heat sink was mounted on the single-board computer to significantly decrease the CPU temperature. A Video Graphics Array LCD display module (HDA570V-G) was used to display diagnostic messages during instrument operation and was fixed so as to be visible through the buoy dome. It was interfaced with the single-board computer by an adapter display board (KAB-ADAPT-LVDS to TTL).

3. DEPLOYMENT IN THE OCEAN

The whitecap-imaging instrument was mounted on a large free-floating buoy and deployed seven times from the R/V Knorr in the North Atlantic Ocean in 2013 as part of the HiWINGS campaign to study air–sea interactions during high wind events. The length of the deployments varied from a few hours to five days. During these deployments, the wind speed ranged from 10 to 35 m/s, and the wave height varied from 1 to 10 m. The design and performance of the spar buoy are explained in Pascal et al. [14]. The whitecap camera was fitted inside a waterproof sphere (dome) at the top of a free-drifting spar buoy. The dome was approximately 2 m above the ocean surface when the spar buoy was free floating upright. Figure 2 shows the whitecap-imaging instrument fixed within the dome of the free-drifting buoy during deployment in the sea.

Figure 4 shows a sample of the images collected by the whitecap camera during deployment in the North Atlantic Ocean. The quality of the images is high, but they contain raindrops and some parts of the buoy’s frame and ropes. The images can be classified into two main categories: whitecap images and complex images. Complex images form when a breaking wave covers the buoy’s dome, as illustrated in Figs. 4(c) and 4(d). The features observed in such complex images are not relevant to whitecap analysis.

Fig. 4. Sample of whitecap camera images collected during deployment in the North Atlantic Ocean in 2013. The images in (a) and (b) contain whitecap patches. The images in (c) and (d) result from a wave covering the buoy’s dome.

4. AUTOMATED WHITECAP EXTRACTION ALGORITHM

Developing a robust automated whitecap extraction algorithm to analyze the images shown in Fig. 4 is a nontrivial task and requires careful consideration. The distortion caused by the fish-eye lens needs to be removed, and the many features in the images that are not related to whitecaps need to be accurately identified and masked out.

Several automated algorithms have been developed in the past to analyze whitecap images. Callaghan and White [8] described a thresholding technique that separates the whitecaps from the rest of the sea surface by determining an appropriate threshold for each image; this threshold is calculated from an image structure that is a function of the features within the image. Lafon et al. [15] used an image gradient for threshold calculation and a numerical method for detecting the whitecap contours. These algorithms are usually applied to images that do not suffer from uneven illumination, raindrops, or the extraneous features seen in complex images.

This section provides details of a new automated extraction algorithm that has been developed to analyze the images collected by the whitecap-imaging instrument described in this paper.

A. Algorithm Details

1. Camera Calibration

The benefit of using a wide-angle fish-eye lens is a significantly increased observed area of the ocean. However, the fish-eye lens introduces a high level of nonlinear geometrical distortion, both radial and tangential. The extraction accuracy of the algorithm will decrease substantially if this distortion is ignored, so it is essential to correct it before any further processing.

A camera calibration technique based on the algorithm proposed by Zhang [16] has been used to obtain the calibration parameters. The technique involves an analytical closed-form solution for the intrinsic and extrinsic parameters, followed by a nonlinear refinement of these parameters based on a maximum-likelihood criterion. The intrinsic parameters are the focal lengths along the two image axes, the principal point coordinates, the skew coefficient between the two image axes, and the distortion coefficients (radial and tangential). The estimated extrinsic parameters are the rotation and translation that relate the world coordinate system to the camera coordinate system.

Images of a calibration object under different orientations are required to implement this technique. Therefore, 25 snapshots of a classical black-and-white chessboard have been taken from several views and at different distances from the camera. The intrinsic and extrinsic calibration parameters are obtained from the chessboard images. Figure 5 shows a corrected image using these calibration parameters. After calibration, the image quality is greatly improved by eliminating the geometrical distortion effects.
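A minimal sketch of this calibration step is given below, using OpenCV's cv::calibrateCamera, which implements Zhang's method [16]. The board dimensions, square size, and file names are illustrative assumptions, and whether the authors used OpenCV's calibration routines (rather than another implementation of [16]) is not stated in the text.

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Sketch of chessboard calibration in the spirit of Zhang's method,
// which cv::calibrateCamera implements. Board size, square size, and
// file names are illustrative assumptions.
int main() {
    const cv::Size boardSize(9, 6);     // inner corners (assumed)
    const float squareSize = 25.f;      // square size in mm (assumed)

    std::vector<std::vector<cv::Point3f>> objectPoints;
    std::vector<std::vector<cv::Point2f>> imagePoints;

    // Ideal 3D corner positions on the planar board (z = 0).
    std::vector<cv::Point3f> board;
    for (int r = 0; r < boardSize.height; ++r)
        for (int c = 0; c < boardSize.width; ++c)
            board.emplace_back(c * squareSize, r * squareSize, 0.f);

    cv::Size imageSize;
    for (int i = 0; i < 25; ++i) {      // 25 snapshots, as in the text
        cv::Mat img = cv::imread(cv::format("chessboard_%02d.png", i),
                                 cv::IMREAD_GRAYSCALE);
        if (img.empty()) continue;
        imageSize = img.size();
        std::vector<cv::Point2f> corners;
        if (cv::findChessboardCorners(img, boardSize, corners)) {
            cv::cornerSubPix(img, corners, cv::Size(11, 11),
                cv::Size(-1, -1),
                cv::TermCriteria(cv::TermCriteria::EPS +
                                 cv::TermCriteria::COUNT, 30, 0.01));
            imagePoints.push_back(corners);
            objectPoints.push_back(board);
        }
    }

    cv::Mat K, distCoeffs;              // intrinsics and distortion
    std::vector<cv::Mat> rvecs, tvecs;  // extrinsics per view
    cv::calibrateCamera(objectPoints, imagePoints, imageSize,
                        K, distCoeffs, rvecs, tvecs);

    // Save for later use by the extraction program (XML, as in the text).
    cv::FileStorage fs("calibration.xml", cv::FileStorage::WRITE);
    fs << "K" << K << "distCoeffs" << distCoeffs;
    return 0;
}
```

For a full 180° fish-eye, OpenCV's dedicated fisheye model (cv::fisheye::calibrate) may fit better than the pinhole-plus-distortion model sketched here; the listing follows the model described in the text.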

Fig. 5. Image correction by the calibration parameters. (a) shows the original distorted image before the calibration. The geometrical distortion is removed in (b) after performing the calibration.

2. Pre-Filtering

The pre-filtering operation aims to remove the buoy’s frame and any rain or spray drops before further image processing [Fig. 5(a)]. The removal of the buoy’s frame [as in Fig. 5(b)] is accomplished before performing the camera calibration: the location of the frame is fixed in all images, so the coordinates of this feature are identified and their intensity values are set to zero.

A median filter with a 15 × 15 mask is used to filter out the raindrops. Each intensity value in the filtered image is the median of the neighboring points within the mask placed at the same position in the original image [17,18]. This operation is performed after image calibration.
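A minimal sketch of these two operations in OpenCV follows. The precomputed frame mask (255 everywhere except the buoy-frame pixels, which are 0) is an assumed artifact, built once from the known frame coordinates.

```cpp
#include <opencv2/opencv.hpp>

// Sketch of the two pre-filtering operations. The buoy frame is masked
// before calibration (its pixel coordinates are fixed in the raw image);
// the median filter is applied after calibration.

cv::Mat maskBuoyFrame(const cv::Mat& raw, const cv::Mat& frameMask) {
    cv::Mat masked = cv::Mat::zeros(raw.size(), raw.type());
    raw.copyTo(masked, frameMask);            // frame pixels stay at zero
    return masked;
}

cv::Mat removeRaindrops(const cv::Mat& calibrated) {
    cv::Mat filtered;
    cv::medianBlur(calibrated, filtered, 15); // 15 x 15 median mask
    return filtered;
}
```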

The buoy’s ropes, visible at the horizon, are removed in the post-filtering stage (Section 4.A.5) rather than here: if the ropes at the horizon were filtered out during pre-filtering, the contour identification algorithm would not recognize the entire horizon contour, because that algorithm is based on finding connected white points belonging to a particular feature, as illustrated in Section 4.A.4 [see Figs. 6(a) and 6(b)].

Fig. 6. Whitecap extraction algorithm steps: (a) pre-filtering and adaptive thresholding, (b) contour identification, (c) after the third post-filtering stage, and (d) after the complete post-filtering operation. The red and green lines correspond to the identified contours and bounding rectangles, respectively.

3. Adaptive Thresholding

Thresholding is an image segmentation technique used to extract objects (features) from their backgrounds. There are two main types of thresholding: uniform and adaptive [17,18]. In uniform thresholding, pixel intensities above a certain brightness level (the threshold) are set to 1 (white), whereas those below it are set to 0 (black); this presumes that the object brightness is known in advance. In contrast, adaptive (optimal) thresholding is a more advanced technique that computes an optimal threshold for each region to separate an object from its background.

The automated algorithm processes each image individually to determine an optimal threshold that separates whitecaps from the ocean background. Therefore, each image has a unique threshold that reflects the change in the ambient illumination. To find the appropriate threshold, the image is first split into 64 × 64 overlapping subimages. The optimal threshold of each subimage is calculated by the Otsu method [19]. The binary subimages emerging from thresholding are regarded as foreground masks and are subtracted from white masks to obtain the corresponding background masks. The information in the foreground masks may not belong to a whitecap, so further evaluation of these foreground masks is required. This is achieved by calculating the foreground and background histograms of each subimage using the foreground and background masks. The intensity value that corresponds to the maximum number of pixels is selected in each individual histogram, giving one foreground and one background intensity value for each subimage. Evaluation on a number of images showed that the difference between the foreground and background intensities should be larger than 40 for the thresholded subimage (foreground mask) to be accepted; otherwise the thresholding is rejected and the pixels in the subimage are set to 0 (black). After completing the thresholding process, the output subimages are combined to form a single binary image. Figure 6(a) shows the binary image obtained by applying the adaptive thresholding technique to the image in Fig. 5(b).
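A hedged sketch of this per-subimage Otsu scheme is given below. It simplifies the tiling to non-overlapping 64 × 64 tiles and uses an illustrative function name; the acceptance rule (foreground peak minus background peak greater than 40) follows the text.

```cpp
#include <opencv2/opencv.hpp>
#include <algorithm>

// Sketch of the per-subimage adaptive thresholding. Each tile gets an
// Otsu threshold, and the tile is accepted only when the dominant
// (histogram-peak) foreground and background intensities differ by
// more than 40, as in the text. Non-overlapping tiles are used here
// for brevity; the paper uses overlapping subimages.
cv::Mat adaptiveWhitecapThreshold(const cv::Mat& gray) {
    cv::Mat out = cv::Mat::zeros(gray.size(), CV_8U);
    const int tile = 64;

    for (int y = 0; y + tile <= gray.rows; y += tile) {
        for (int x = 0; x + tile <= gray.cols; x += tile) {
            cv::Rect roi(x, y, tile, tile);
            cv::Mat sub = gray(roi);

            cv::Mat fgMask, bgMask;
            cv::threshold(sub, fgMask, 0, 255,
                          cv::THRESH_BINARY | cv::THRESH_OTSU);
            cv::bitwise_not(fgMask, bgMask);   // background mask

            // Intensity at the histogram peak under a given mask.
            auto peak = [&sub](const cv::Mat& mask) {
                int hist[256] = {0};
                for (int r = 0; r < sub.rows; ++r)
                    for (int c = 0; c < sub.cols; ++c)
                        if (mask.at<uchar>(r, c))
                            ++hist[sub.at<uchar>(r, c)];
                return int(std::max_element(hist, hist + 256) - hist);
            };

            // Accept the tile only if foreground is brighter by > 40.
            if (peak(fgMask) - peak(bgMask) > 40)
                fgMask.copyTo(out(roi));
        }
    }
    return out;
}
```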

4. Contour Identification

After image segmentation and removal of undesirable features, the next step is to inspect the resulting aggregate of segmented pixels to discriminate between the whitecap and false regions. This can be accomplished by first identifying the regions’ contours (also called borders or perimeters) and then examining the internal properties of the regions that enclose these contours to remove the false regions. The latter is illustrated in Section 4.A.5.

A border-following algorithm developed by Suzuki and Abe [20] has been used to extract the region borders. The algorithm analyzes the topological structure of binary images and finds a point on the contour between a connected component of 1 and 0 pixels. Then it progresses around this contour to find the next point. Figure 6(b) shows the contours identified by the border-following algorithm. The identified contours (red lines) are enclosed by rotated rectangles (green lines). The bounding rectangle finds a minimum area for specified contour points.
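In OpenCV, which the authors used for the extraction program (Section 4.B), this step maps directly onto cv::findContours, whose border-following implementation is based on Suzuki and Abe [20]. A minimal sketch:

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Sketch of contour identification. cv::findContours implements the
// Suzuki-Abe border-following algorithm [20], and cv::minAreaRect
// returns the rotated minimum-area bounding rectangle of each contour.
void identifyContours(const cv::Mat& binary,
                      std::vector<std::vector<cv::Point>>& contours,
                      std::vector<cv::RotatedRect>& boxes) {
    cv::findContours(binary.clone(), contours,
                     cv::RETR_EXTERNAL, cv::CHAIN_APPROX_NONE);
    boxes.clear();
    for (const auto& c : contours)
        boxes.push_back(cv::minAreaRect(c));
}
```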

5. Post-Filtering

Four post-filtering stages are applied to filter out unwanted regions. The horizon contour is removed in the first stage. This is achieved by sorting the identified contours according to their areas and removing the contour with the largest area (the horizon contour). The horizon contour has the largest bounding rectangle, as shown in Fig. 6(c). Complex images [Figs. 4(c) and 4(d)] are detected by evaluating the area of the horizon-bounding rectangle; evaluation on a number of images showed that this area is less than 250,000 pixels for complex images. Contours that lie inside the horizon-bounding rectangle are also removed in this stage, which is accomplished by comparing the centroid location of each contour with the centroid of the horizon contour. The area of each region’s contour is calculated using Green’s theorem [21]. Small regions, such as remaining raindrops, are removed in the second stage; evaluation on a number of images showed that the areas of these small regions are less than 2500 pixels. Figure 6(c) shows the remaining contours after the second post-filtering stage. In the third stage, the histogram peak intensity of each contour is compared with the average histogram peak intensity of all remaining contours. Contours are removed if their peak intensities are less than both this average and the minimum histogram peak threshold (40). The fourth stage checks the number of consecutive contour-perimeter points that lie on a straight line; the contour is discarded if there are more than 15 consecutive points in the x or y direction. After post-filtering, the remaining contours should correspond to the whitecap regions in the image, as shown in Fig. 6(d).
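A minimal sketch of the first two post-filtering stages follows (the intensity and straight-line checks of stages three and four are omitted). The thresholds are the empirical values quoted above; treating "inside the horizon-bounding rectangle" as a point-in-polygon test on each centroid is one plausible reading of the comparison described in the text, not necessarily the authors' exact implementation.

```cpp
#include <opencv2/opencv.hpp>
#include <algorithm>
#include <vector>

// Sketch of post-filtering stages 1 and 2: horizon removal,
// complex-image detection, and small-region removal. Returns true
// when the image should be discarded as a complex image.
bool postFilter(std::vector<std::vector<cv::Point>>& contours) {
    if (contours.empty()) return true;   // nothing to analyze

    // Stage 1: the contour with the largest area is the horizon.
    std::sort(contours.begin(), contours.end(),
              [](const std::vector<cv::Point>& a,
                 const std::vector<cv::Point>& b) {
                  return cv::contourArea(a) > cv::contourArea(b);
              });
    cv::RotatedRect horizonBox = cv::minAreaRect(contours.front());

    // A small horizon-bounding rectangle flags a complex image.
    if (horizonBox.size.area() < 250000.f) return true;
    contours.erase(contours.begin());    // remove the horizon contour

    // Drop contours whose centroid lies inside the horizon box.
    cv::Point2f cornerArr[4];
    horizonBox.points(cornerArr);
    std::vector<cv::Point2f> box(cornerArr, cornerArr + 4);
    auto insideHorizonBox = [&](const std::vector<cv::Point>& c) {
        cv::Moments m = cv::moments(c);  // area/centroid via Green's theorem
        if (m.m00 <= 0) return true;     // degenerate contour: drop
        cv::Point2f centroid(float(m.m10 / m.m00), float(m.m01 / m.m00));
        return cv::pointPolygonTest(box, centroid, false) >= 0;
    };
    contours.erase(std::remove_if(contours.begin(), contours.end(),
                                  insideHorizonBox), contours.end());

    // Stage 2: remove small regions such as residual raindrops.
    contours.erase(std::remove_if(contours.begin(), contours.end(),
        [](const std::vector<cv::Point>& c) {
            return cv::contourArea(c) < 2500.0;
        }), contours.end());

    return false;   // a regular whitecap image
}
```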

6. Contour Information

Contour information, such as centroid, area, coordinates, and angle of rotation, is required to study the correlation between the whitecap regions and the bubble plumes under the surface of the ocean. The whitecaps are imaged by the whitecap-imaging instrument described in this paper, whereas the bubble plumes are imaged by another optical imaging instrument below the ocean surface [13]. The centroid (central moment) and area of each contour are calculated using Green’s formula [21]. The centroid is plotted as a small circle, as shown in Figs. 6(c) and 6(d). Each contour is enclosed by a minimum-area bounding rectangle to determine the contour coordinates and its angle of rotation with respect to the x axis. Details of each bounding rectangle, such as height, width, corner coordinates, and angle of rotation, are recorded. For instance, the following information is obtained for the smaller extracted contour in Fig. 6(d): size, 8526 pixels; rectangular dimensions, 62 × 267 pixels at 15.7 deg; corners, (1090, 821), (832, 748), (849, 688), (1107, 761); and hull centroid, (977, 760).

This information, in combination with the effective pixel area, can then be used to compute the total whitecap coverage of a particular image.
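As an illustration, these contour properties map onto standard OpenCV calls: cv::moments computes the area and centroid via Green's theorem, and cv::minAreaRect gives the rotated bounding rectangle. The struct and the uniform-pixel-area coverage estimate below are illustrative simplifications, since the true effective pixel area varies across the image (see Section 5).

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Sketch of the contour-information step and a simple whitecap-coverage
// estimate assuming a uniform effective pixel area.
struct ContourInfo {
    double areaPx;            // contour area (Green's theorem)
    cv::Point2f centroid;     // central moment
    cv::RotatedRect box;      // width, height, corners, rotation angle
};

ContourInfo describe(const std::vector<cv::Point>& c) {
    cv::Moments m = cv::moments(c);
    return { m.m00,
             cv::Point2f(float(m.m10 / m.m00), float(m.m01 / m.m00)),
             cv::minAreaRect(c) };
}

// Whitecap coverage as total whitecap pixel area divided by the imaged
// sea-surface pixel area (both in pixels here).
double coverage(const std::vector<std::vector<cv::Point>>& whitecaps,
                double seaSurfacePx) {
    double sum = 0.0;
    for (const auto& c : whitecaps) sum += cv::contourArea(c);
    return sum / seaSurfacePx;
}
```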

B. Implementation

The algorithm rectifies the distortion in the images using the calibration parameters, which are obtained before the whitecap extraction program is run. The camera calibration program is run only once, and its outputs are saved in an XML file; this file is loaded when the extraction program starts. The algorithm was programmed in C++ and uses the OpenCV library for computational efficiency. The computation time to process an image is less than 2 s.

The implementation of the proposed algorithm is as follows.

  • 1. Load and read the source image.
  • 2. Mask the buoy’s frame.
  • 3. Remove the distortion from the image using the calibration file.
  • 4. Apply the adaptive thresholding to extract the necessary features. The output is a binary image.
  • 5. Apply median filtering to blur raindrops.
  • 6. Identify the contour of each region.
  • 7. Apply the post-filtering operation.
  • 8. Obtain the information from each contour.

Steps 1–8 are repeated for each image in the directory; a minimal sketch of this loop is given below.
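The skeleton below wires together the illustrative helpers sketched in Section 4.A (those names and the file layout are assumptions, not the authors' actual code):

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Skeleton of the processing loop (steps 1-8). Helpers maskBuoyFrame,
// removeRaindrops, adaptiveWhitecapThreshold, identifyContours,
// postFilter, and describe are the illustrative functions from the
// previous listings; file names are assumed.
int main() {
    cv::Mat K, distCoeffs;
    cv::FileStorage fs("calibration.xml", cv::FileStorage::READ);
    fs["K"] >> K;
    fs["distCoeffs"] >> distCoeffs;
    cv::Mat frameMask = cv::imread("buoy_frame_mask.png",
                                   cv::IMREAD_GRAYSCALE);

    std::vector<cv::String> files;
    cv::glob("images/*.png", files);    // image directory (assumed)

    for (const auto& f : files) {
        cv::Mat raw = cv::imread(f, cv::IMREAD_GRAYSCALE);      // step 1
        if (raw.empty()) continue;

        cv::Mat masked = maskBuoyFrame(raw, frameMask);         // step 2
        cv::Mat undistorted;
        cv::undistort(masked, undistorted, K, distCoeffs);      // step 3
        cv::Mat binary = adaptiveWhitecapThreshold(undistorted);// step 4
        cv::Mat blurred = removeRaindrops(binary);              // step 5

        std::vector<std::vector<cv::Point>> contours;
        std::vector<cv::RotatedRect> boxes;
        identifyContours(blurred, contours, boxes);             // step 6
        if (postFilter(contours)) continue;  // step 7: complex image
        std::vector<ContourInfo> infos;
        for (const auto& c : contours)
            infos.push_back(describe(c));                       // step 8
    }
    return 0;
}
```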

5. DISCUSSION AND SUMMARY

The focus of this paper is the development of an efficient and reliable optical instrument capable of imaging and analyzing oceanic whitecaps over a large area. The essential idea of this work is to use a high-resolution wide-angle (180°) fish-eye lens camera instead of a standard-lens camera with a limited angle of view. This fish-eye lens camera has been shown to be successful in obtaining information over a wide area of the ocean surface. Moreover, the instrument was reliably deployed and operated in stormy conditions.

The quality of the collected images is high. However, there are two sources of distortion in these images. First, there is an obvious radial distortion due to the fish-eye lens. This is corrected by performing a camera calibration procedure, and the correction significantly increases the accuracy of whitecap extraction, as shown in Fig. 5. Second, the images contain some features that belong to the buoy structure, which adds complexity to the processing. Thus, careful adjustment of the whitecap camera’s position is required in future deployments to avoid imaging buoy structures.

An automated image-processing algorithm has been developed to extract the whitecaps in the collected images. The change in the ambient illumination has no impact on the extraction accuracy. This is because each image has a unique threshold determined by the adaptive thresholding approach.

Complex images [such as those shown in Figs. 4(c) and 4(d)] were successfully distinguished from whitecap images [Fig. 4(b)] by inspecting the size of the horizon-bounding rectangle and discarded from further analysis with the automated algorithm.

Notwithstanding this success, further development of the hardware design and automated processing algorithm is required to build a rigorous instrument for analyzing oceanic whitecap coverage. In the current instrument, the image acquisition is separate from the automated whitecap extraction algorithm, because the limited computing power of the single-board computer cannot cope with the computational demands of the extraction algorithm. It could therefore be advantageous to use a faster processor to allow immediate analysis of the captured images on the instrument.

There are two main limitations that degrade the extraction accuracy of the current image-processing algorithm. First, the segmentation approach (adaptive thresholding) is sensitive to spatial quantization errors, so an enhanced image segmentation algorithm is required to avoid these effects. Second, absolute pixel-count thresholds are used to filter out the false regions in the image. These thresholds were selected by trial and error, and a more advanced approach based on a combination of machine learning and image processing would be advantageous for distinguishing between false and whitecap regions.

To fully compute the overall whitecap coverage, it is also necessary to determine the effective pixel area, which will vary in the image due to the imaging geometry; pixels nearer the horizon will image a larger area. In particular, for a camera deployed on a floating platform (and therefore in constant motion due to the wave field), corrections are required for the camera position and orientation relative to the water surface at the time of each individual image. This is the focus of further work.
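As an illustration of this geometry (a flat-sea approximation, not taken from the paper): for a camera at height $h$ above a flat surface, a pixel viewing the surface at angle $\theta$ from nadir, with angular extents $\delta\theta$ in the tilt direction and $\delta\phi$ in azimuth, images a ground footprint of approximately

$$A_{\mathrm{px}} \approx \frac{h\,\delta\theta}{\cos^{2}\theta}\cdot h\tan\theta\,\delta\phi = \frac{h^{2}\sin\theta}{\cos^{3}\theta}\,\delta\theta\,\delta\phi,$$

which grows without bound as $\theta$ approaches 90° (the horizon), consistent with the statement above. The buoy's instantaneous pitch, roll, and heave must then be composed with this geometry for each individual image.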

Funding

Natural Environment Research Council (NERC) (NE/J020893/1); Advanced Manufacturing Supply Chain Initiative (AMSCI).

Acknowledgment

This work was supported by the UK’s Natural Environment Research Council (grant NE/J020893/1) at the University of Southampton. The paper writing was performed while author Raied S. Al-Lashi was a research fellow at the University of Leeds and was supported by Advanced Manufacturing Supply Chain Initiative funding of the Chariot Consortium. The authors would like to thank the scientists, officers, and crews of the R/V Knorr who took part in the HiWINGS campaign for their help during deployment of foam-imaging instruments in the North Atlantic Ocean.

REFERENCES

1. W. E. Asher, L. M. Karle, B. J. Higgins, and P. J. Farley, “The influence of bubble plumes on air-seawater gas transfer velocities,” J. Geophys. Res. 101, 12027–12041 (1996).

2. M. D. Anguelova and F. Webster, “Whitecap coverage from satellite measurements: A first step toward modeling the variability of oceanic whitecaps,” J. Geophys. Res. 111, 3017 (2006).

3. W. Asher, J. Edson, W. McGillis, R. Wanninkhof, D. Ho, and T. Litchendorf, “Fractional area whitecap coverage and air-sea gas transfer velocities measured during GasEx-98,” in Gas Transfer at Water Surfaces, M. A. Donelan, W. M. Drennan, E. S. Saltzman, and R. Wanninkhof, eds. (American Geophysical Union, 2002).

4. S. J. Norris, I. M. Brooks, B. I. Moat, M. J. Yelland, G. de Leeuw, R. W. Pascal, and B. Brooks, “Near-surface measurements of sea spray aerosol production over whitecaps in the open ocean,” Ocean Sci. 9, 133–145 (2013).

5. G. de Leeuw, F. P. Neele, M. Hill, M. H. Smith, and E. Vignati, “Production of sea spray aerosol in the surf zone,” J. Geophys. Res. 105, 29397–29409 (2000).

6. A. D. Clarke, S. R. Owens, and J. Zhou, “An ultrafine sea-salt flux from breaking waves: Implications for cloud condensation nuclei in the remote marine atmosphere,” J. Geophys. Res. 111, 6202 (2006).

7. R. Frouin, S. Iacobellis, and P.-Y. Deschamps, “Influence of oceanic whitecaps on the global radiation budget,” Geophys. Res. Lett. 28, 1523–1526 (2001).

8. A. H. Callaghan and M. White, “Automated processing of sea surface images for the determination of whitecap coverage,” J. Atmos. Ocean. Technol. 26, 383–394 (2009).

9. M. Stramska and T. Petelski, “Observations of oceanic whitecaps in the north polar waters of the Atlantic,” J. Geophys. Res. 108, 3086 (2003).

10. F. J. Wentz, “Measurement of oceanic wind vector using satellite microwave radiometers,” IEEE Trans. Geosci. Remote Sens. 30, 960–972 (1992).

11. R. Frouin, M. Schwindling, and P.-Y. Deschamps, “Spectral reflectance of sea foam in the visible and near-infrared: In situ measurements and remote sensing implications,” J. Geophys. Res. 101, 14361–14371 (1996).

12. I. M. Brooks et al., “Physical exchanges at the air-sea interface: UK-SOLAS field measurements,” Bull. Am. Meteorol. Soc. 90, 629–644 (2009).

13. R. S. Al-Lashi, S. R. Gunn, E. G. Webb, and H. Czerski, “A novel high-resolution optical instrument for imaging oceanic bubbles,” IEEE J. Ocean. Eng. 43, 72–82 (2018).

14. R. W. Pascal, M. J. Yelland, M. A. Srokosz, B. I. Moat, E. M. Waugh, D. H. Comben, A. G. Cansdale, and M. C. Hartman, “A spar buoy for high-frequency wave measurements and detection of wave breaking in the open ocean,” J. Atmos. Ocean. Technol. 28, 590–605 (2011).

15. C. Lafon, J. Piazzola, P. Forget, and S. Despiau, “Whitecap coverage in coastal environment for steady and unsteady wave field conditions,” J. Mar. Syst. 66, 38–46 (2007).

16. Z. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Mach. Intell. 22, 1330–1334 (2000).

17. R. C. Gonzalez and R. E. Woods, Digital Image Processing (Prentice-Hall, 2008).

18. M. S. Nixon and A. S. Aguado, Feature Extraction & Image Processing for Computer Vision (Newnes, 2002).

19. N. Otsu, “A threshold selection method from gray-level histograms,” IEEE Trans. Syst. Man Cybern. 9, 62–66 (1979).

20. S. Suzuki and K. Abe, “Topological structural analysis of digitized binary images by border following,” Comput. Vis. Graph. Image Process. 30, 32–46 (1985).

21. K. F. Riley, M. P. Hobson, and S. J. Bence, Mathematical Methods for Physics and Engineering (Cambridge University, 2010).
