
Detection of Burmese pythons in the near-infrared versus visible band

Open Access

Abstract

Human task performance studies are commonly used for detecting and identifying potential military threats. In this work, these principles are applied to the detection of an environmental threat: the invasive Burmese python. A qualitative detection of Burmese pythons with a visible light camera and an 850 nm near-infrared (NIR) camera was performed against natural Florida backgrounds. The results showed that the difference in reflectivity between the pythons and native foliage was much greater in the NIR, effectively circumventing the python's natural camouflage in the visible band. In this work, a comparison of detection performance in the selected near-infrared band versus the visible band was conducted. Images of foliage backgrounds with and without a python were taken in each band in daylight and at night with illumination. The intensities of these images were then calibrated and prepared for a human perception test. Participants were tasked with detecting pythons, and the human perception data were used to compare performance between the bands. The results show that the enhanced contrast in the NIR enabled participants to detect pythons at 20% longer ranges than with visible imagery.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. INTRODUCTION

Human vision detection experiments are sometimes used for quantifying the difficulty of detecting potential targets with a particular sensing system. During such studies, human perception tests are conducted by tasking participants with detecting a target in a prepared set of imagery. The difficulty of the task depends on factors such as target-to-background contrast, clutter level in the scene, and range to the target. Military laboratories often implement these perception tests using targets such as armored vehicles, insurgents, and explosive devices, which inform design choices for the optimization of systems to assist in the field. While targets with military significance are common for human perception studies, this approach to sensing system optimization is more generally applicable and also extends to environmental threats such as invasive species.

The presence of Burmese pythons in the Florida Everglades National Park has been linked to fewer sightings of native fauna in the southern Florida region since the pythons' introduction in the late 1990s. A study by Dorcas et al. reported that sightings of common native species such as raccoons, opossums, and rabbits had dropped by more than 90% in the immediate area by 2010 [1]. Our observations during an on-site study agreed with this finding: over several hours, we spotted only one mammal and one bird in the area. There is also evidence of northward expansion of the Burmese python population. If the pythons reach the northern part of the Florida peninsula, they may spread to other regions of the United States with similarly suitable climates, as far west as Texas and as far north as Virginia [2]. This invasive species has proven a serious threat to the local ecosystem, and it is imperative that as many pythons as possible be found and removed.

To date, there are no known formal studies of the detection of Burmese pythons with the aid of digital sensors in the visible to near-infrared spectrum (VisNIR). Methods for locating Burmese pythons are often limited to unaided search by the human eye or to employing canine assistance. An informal study by the United States Department of Agriculture investigated the detection of Burmese pythons using thermal infrared cameras and concluded that pythons that had been basking during the day were discernible against grass for hours after sunset, owing to slower temperature loss compared with their surroundings [3]. However, this approach is limited to conditions that produce thermal contrast between the cold-blooded snake and the background.

In a previous publication, the authors measured the VisNIR reflectivity spectra of Burmese pythons along with foliage native to the South Florida region and concluded that pythons provide more contrast against such backgrounds at wavelengths longer than 750 nm [4]. Building on this result, we designed and constructed a camera operating at 850 nm and used it to aid python detection. A preliminary qualitative test was conducted on location in Everglades National Park with a living specimen in its environment, comparing this camera with the naked human eye. During the excursion, observers noted that Burmese pythons were more easily located with the NIR camera than with the eye alone. In this work, we present a formalized study comparing the same NIR camera with a red-green-blue (RGB) visible band camera in assisting human observers in finding a Burmese python against a foliage background.


Fig. 1. Detector sensitivities (solid lines) and reflectivities of foliage and Burmese python (dashed lines) with wavelength [4–6].


2. SENSORS

Using the VisNIR hyperspectral measurements from [4], we selected a camera in the near-infrared band for detection of Burmese pythons. We also selected an RGB camera with similar sensitivity and resolution specifications for comparison. All components for both cameras were acquired from Edmund Optics. These cameras and accompanying illumination are described in this section.

The NIR camera was an EO-1312 NIR USB 3.0 CMOS camera with a 16 mm C-series VisNIR fixed focal length lens and a narrow bandpass filter centered at 850 nm with a 50 nm full width at half maximum [5]. The RGB camera was a USB 3 FLIR Blackfly S color camera with a 16 mm fixed focal length lens [6]. The NIR camera was operated using uEye Cockpit software, and the RGB camera was operated using SpinView software. The cameras were selected so that their fields of view and $f$-numbers nearly matched, providing comparable sensitivity, resolution, and magnification, as listed in Table 1.
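For context, the horizontal field of view of a rectilinear lens follows from the pinhole model as $2\arctan(w/2f)$, where $w$ is the sensor width and $f$ is the focal length. A minimal Python sketch is below; the 6.9 mm sensor width is a placeholder for illustration, not a value taken from Table 1 or [5,6]:

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal field of view of a rectilinear lens, from the pinhole model."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# Hypothetical 6.9 mm wide sensor behind the 16 mm lenses used on both cameras
print(f"{horizontal_fov_deg(16.0, 6.9):.1f} deg")  # -> 24.3 deg
```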


Table 1. NIR and Vis Camera Specifications [5,6]


Fig. 2. Mounted Vis and NIR cameras and illumination. Halogen lamps are mounted above the cameras, and NIR LED illuminators are mounted below.


The Vis camera was operated in two modes. The first was color mode with R, G, and B channels, as shown in Fig. 1. The second was monochrome mode, in which the three channels were averaged to produce gray-scale images. The spectral response of the NIR camera is also shown in Fig. 1, multiplied by the transmission of the 850 nm filter. The reflectivities of live foliage, dead foliage, and pythons are also plotted in Fig. 1 as averages of the individual measurements reported in [4].

For all three bands in the RGB camera, the reflectivities of the pythons and backgrounds were similarly low, whereas significant contrast between python and background reflectivity was present in the selected NIR band against the foliage background.

For data collection at night, illumination in both NIR and Vis was used. NIR illumination was provided by two CMVision IR110 850 nm LED arrays, each containing 114 LEDs; the arrays illuminate a 30° field of view, and each LED outputs 20 mW [7]. For Vis illumination, we used HDX 600 W portable halogen work lights, which emit 12,000 lumens. Figure 2 shows all components mounted to a tripod, with the cameras positioned so that their fields of view were nearly identical and the illumination mounted so that the field of view was always illuminated during nighttime image collection. This arrangement of illuminators and cameras supported operation at roughly 3–25 m from the cameras.


Table 2. Experimental Matrix of Images

3. EXPERIMENT

A. Data Collection

In this initial experiment, we limited our backgrounds to a combination of living and dead foliage, corresponding to the primary scenario encountered by python hunters in the Everglades. Other backgrounds, such as gravel, dirt, and sand, are left for future study. The foliage was primarily live grass with dead grass mixed in. This scenario corresponds closely to the operations of hunters working with the Florida Fish and Wildlife Conservation Commission, who drive along levees in the Florida Everglades and search the grass along the sides of the gravel roads.

A summary of the images taken for this experiment is given in Table 2. One hundred 8-bit images were collected by each sensor at various locations, half in daylight and half at night. Daytime images were taken without additional lighting, and nighttime images were taken with the Vis and NIR illuminators. At each location, four images containing a python placed somewhere in the field of view were taken with each sensor, along with one image containing no python. The pythons used in this study were deceased, which facilitated frequent relocation of the python and prevented motion blur during image collection. Reference measurements showed the reflectivity spectra of deceased pythons to be indistinguishable from those of living pythons [4].

Two different deceased pythons were used during the data collection, but only one was present in any given image containing a python. Both pythons were provided by the Florida Fish and Wildlife Conservation Commission (FFWCC). To preserve the pythons between collections, they were kept frozen and thawed for each collection. Each python was between 4 and 5 ft in length and weighed approximately 5 lb. The distance of the python from the cameras was determined from the apparent size of ground-truth objects of known dimensions within the field of view of each camera. The location of the python ranged from 3 to 17 m from the cameras.
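As an illustration of this range estimation, a minimal sketch using the pinhole projection model is given below. The 16 mm focal length matches the lenses in Table 1, but the pixel pitch and marker size are hypothetical placeholders, not values from the study:

```python
def range_from_size(true_size_m, image_size_px, focal_length_mm, pixel_pitch_um):
    """Estimate range to an object of known size from its apparent size in the image,
    using the pinhole model: range = f * (true size) / (size on the image plane).
    The pixel pitch is a sensor property; the values used here are placeholders."""
    image_size_mm = image_size_px * pixel_pitch_um * 1e-3  # extent on the sensor
    return (focal_length_mm * 1e-3) * true_size_m / (image_size_mm * 1e-3)

# e.g., a 0.5 m ground-truth marker spanning 100 px with a 16 mm lens and 5 um pixels
print(f"{range_from_size(0.5, 100, 16.0, 5.0):.1f} m")  # -> 16.0 m
```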

Following the image collection in the field, images were processed to normalize their brightness distributions. To preserve the color values of the visible images, pixels were converted from RGB to hue-saturation-intensity (HSI) color space using Eqs. (1)–(4) [8]:

$$I = \frac{1}{3}({R + G + B} ),$$
$$S = 1 - \frac{{3\min ({R,G,B} )}}{{R + G + B}},$$
$$H = \left\{{\begin{array}{*{20}{c}}{\begin{array}{*{20}{c}}\theta \\{360^\circ - \theta}\end{array}}&{\begin{array}{*{20}{c}}{\rm if}\\{\rm if}\end{array}}&{\begin{array}{*{20}{c}}{B \le G}\\{B \gt G}\end{array}}\end{array}} \right.,$$
$$\theta = {\cos ^{- 1}}\left({\frac{{\frac{1}{2}\left[{({R - G} ) + ({R - B} )} \right]}}{{\sqrt {{{({R - G} )}^2} + ({R - B} )({G - B} )}}}} \right).$$
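For illustration, a minimal Python sketch of Eqs. (1)–(4) for floating-point channels in [0, 1] follows. The authors used MATLAB scripts; this NumPy version is a sketch of the same transform, not their code:

```python
import numpy as np

def rgb_to_hsi(rgb):
    """Convert an RGB image (float array in [0, 1], shape HxWx3) to HSI per Eqs. (1)-(4)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    eps = 1e-8  # guards against division by zero on black or gray pixels

    i = (r + g + b) / 3.0                                                 # Eq. (1)
    s = 1.0 - 3.0 * np.minimum(np.minimum(r, g), b) / (r + g + b + eps)   # Eq. (2)

    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + eps
    theta = np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0)))          # Eq. (4)
    h = np.where(b <= g, theta, 360.0 - theta)                            # Eq. (3)
    return h, s, i
```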

Intensity values for all Vis and NIR images were calibrated using linear histogram stretching, shifting the mean intensity of each image to the 50% intensity value. An additional set of 100 monochrome Vis images was saved from this process.
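The exact calibration mapping is not fully specified in the text; one plausible reading, sketched below, first applies a linear stretch to the full 8-bit range and then shifts the image mean to mid-scale. The authors' actual mapping may differ:

```python
import numpy as np

def calibrate_intensity(img):
    """One plausible reading of the calibration: linear histogram stretch to the full
    8-bit range, then an offset so the mean intensity sits at the 50% value (127.5)."""
    x = img.astype(np.float64)
    x = (x - x.min()) / max(x.max() - x.min(), 1e-8) * 255.0  # linear stretch
    x = np.clip(x + (127.5 - x.mean()), 0.0, 255.0)           # shift mean to mid-scale
    return x.astype(np.uint8)
```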


Fig. 3. Sample images containing a python during daylight in (a) monochrome Vis, (b) RGB Vis, and (c) NIR as well as sample images during nighttime in (d) monochrome Vis, (e) RGB Vis, and (f) NIR. Red boxes indicate the location of a python.


Fig. 4. Sample responses to a NIR image, evaluated as (a) true positive; (b), (c) false positive; (d) false negative; and (e) true negative. Red boxes indicate the location of a python but did not appear during perception tests.


Following intensity calibration, the color space of the Vis images was converted back from HSI to RGB for display purposes using Eqs. (5)–(9) [8]:
$$\left\{{\begin{array}{*{20}{c}}{\begin{array}{*{20}{c}}{\varphi = 0^\circ ;}\\{\varphi = 120^\circ ;}\\{\varphi = 240^\circ ;}\end{array}}&{\begin{array}{*{20}{c}}{({x,y,z} ) = ({B,R,G} )}\\{({x,y,z} ) = ({R,G,B} )}\\{({x,y,z} ) = ({G,B,R} )}\end{array}}&{\rm if}&{\begin{array}{*{20}{c}}{H \in [{0^\circ ,120^\circ} ]}\\{H \in [{120^\circ ,240^\circ} ]}\\{H \in [{240^\circ ,360^\circ} ]}\end{array}}\end{array}} \right.,$$
$${H^\prime} = H - \varphi ,$$
$$x = I({1 - S} ),$$
$$y = I\left[{1 + \frac{{S\cos ({{H^\prime}} )}}{{\cos ({60^\circ - {H^\prime}} )}}} \right],$$
$$z = 1 - ({x + y} ).$$
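A corresponding Python sketch of the inverse transform is below. Note that for channels that are not normalized to sum to one, the third component is recovered as $3I - (x + y)$; the sketch uses that form (Eq. (9) as printed assumes $R + G + B = 1$):

```python
import numpy as np

def hsi_to_rgb(h, s, i):
    """Invert the HSI transform per Eqs. (5)-(9); H in degrees, S and I in [0, 1].
    The third channel is recovered as 3I - (x + y), the form for unnormalized
    channels (Eq. (9) as printed assumes R + G + B = 1)."""
    h = np.asarray(h, dtype=np.float64) % 360.0
    s = np.asarray(s, dtype=np.float64)
    i = np.asarray(i, dtype=np.float64)
    rgb = np.zeros(h.shape + (3,))
    # Each 120 deg sector maps (x, y, z) onto a different channel ordering, Eq. (5)
    sectors = [(0.0, (2, 0, 1)),    # H in [0, 120):   (x, y, z) -> (B, R, G)
               (120.0, (0, 1, 2)),  # H in [120, 240): (x, y, z) -> (R, G, B)
               (240.0, (1, 2, 0))]  # H in [240, 360): (x, y, z) -> (G, B, R)
    for phi, (cx, cy, cz) in sectors:
        m = (h >= phi) & (h < phi + 120.0)
        hp = np.radians(h[m] - phi)                        # Eq. (6)
        x = i[m] * (1.0 - s[m])                            # Eq. (7)
        y = i[m] * (1.0 + s[m] * np.cos(hp) / np.cos(np.radians(60.0) - hp))  # Eq. (8)
        z = 3.0 * i[m] - (x + y)                           # Eq. (9), unnormalized form
        rgb[m, cx], rgb[m, cy], rgb[m, cz] = x, y, z
    return np.clip(rgb, 0.0, 1.0)
```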

B. Perception Test

To quantify the efficacy of the Vis and NIR bands for detection of Burmese pythons, we administered a human perception test to determine the probability of detection as a function of range to target. There were 13 participants in total, the majority of whom were unfamiliar with the target prior to participating. Participants were trained on sample imagery from each category prior to the test to familiarize them with the task and with how the target would appear in Vis and in NIR.


Fig. 5. True positive rate for images during day with a range of python from cameras.


Fig. 6. True positive rate for images at night with a range of python from cameras.


All three categories of images (Vis monochrome, Vis RGB, and NIR) were presented in random order to each subject as 8-bit images using MATLAB scripts. Figure 3 shows day and night example images for each category. During the day, natural illumination provided clear visibility across the entire field of view. At night, the illuminators provided adequate visibility of the scene out to 17 m from the cameras. The high reflectivity of the background foliage in NIR is prominent in both day and nighttime images, providing greater contrast with the python, as opposed to the low contrast of the Vis images in daylight. Each subject was tasked with determining whether a python was present in each image, either by clicking on the image location where they believed a python was present or by clicking a button at the bottom of the image to indicate that no python was present.

The location of a python, if one was present in an image, was designated with a box of size ${{103}} \times {{208}}$ pixels, centered on a pixel located on the python. If the subject selected a pixel within this box, the response was scored as a true positive. If the subject selected a pixel not designated as belonging to a python, the response was a false positive. If the subject did not detect a python in an image that contained one, the response was a false negative. If the subject did not detect a python in an image that did not contain one, the response was a true negative.
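For concreteness, a small Python sketch of this scoring rule follows; the coordinate convention and the box orientation (103 columns by 208 rows) are assumptions for illustration:

```python
def score_response(click, python_center, box=(103, 208)):
    """Classify one perception-test response. click: (col, row) the subject selected,
    or None for a 'No Python' answer. python_center: (col, row) on the python about
    which the scoring box is centered, or None if the image contained no python."""
    if python_center is None:
        return "TN" if click is None else "FP"
    if click is None:
        return "FN"
    dc = abs(click[0] - python_center[0])
    dr = abs(click[1] - python_center[1])
    return "TP" if dc <= box[0] / 2 and dr <= box[1] / 2 else "FP"

def true_positive_rate(responses):
    """TPR as used in Section 4: true positives over true positives plus false negatives."""
    tp, fn = responses.count("TP"), responses.count("FN")
    return tp / (tp + fn) if tp + fn else float("nan")
```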

Examples of each type of response are shown in Fig. 4. Each image was presented with a button at the bottom labeled "No Python," which subjects clicked if they did not detect a python. Images in the figure containing a python are marked with a red box enclosing the python; all pixels within the red box were considered to be on the python. These red boxes did not appear during the perception tests and are shown here only for clarity. Cursor arrows in the figure indicate where the user clicked in each scenario: a cursor within an image indicates a positive response, and a cursor on the "No Python" button indicates a negative response.

4. RESULTS

The true positive rate (TPR) was determined for each image containing a python by dividing the number of true positive responses by the total of true positive and false negative responses made by human subjects. Averages of TPR over all images in each band category, for daylight and nighttime conditions, are listed in Table 3. Under daylight conditions, subjects performed well at detecting pythons in RGB visible and NIR images but poorly in monochrome visible images. Subjects also performed well with NIR images under nighttime conditions but performed significantly worse with images from both visible band categories.


Table 3. True Positive Rate Averaged for All Images in Each Band

TPR as a function of the range of the python from the cameras was fitted to a target transfer probability function using the principles delineated by Vollmerhausen et al. [9]. These principles establish a reciprocal relationship between the target task performance (TTP) metric and range to target, converting the target transfer probability function based on the TTP metric to the form shown in Eq. (10):

$${\rm TPR} = \frac{{{{\left({\frac{{{R_{50}}}}{R}} \right)}^E}}}{{1 + {{\left({\frac{{{R_{50}}}}{R}} \right)}^E}}},$$
where ${R_{50}}$ is the range at which TPR is 50%, and the exponent $E$ controls the rate of decay of the function with range.
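As an illustration, Eq. (10) can be fitted to measured TPR-versus-range data with a standard least-squares routine. The data values below are placeholders, not the study's results:

```python
import numpy as np
from scipy.optimize import curve_fit

def ttp_curve(r, r50, e):
    """Target transfer probability function of Eq. (10)."""
    ratio = (r50 / r) ** e
    return ratio / (1.0 + ratio)

# Placeholder bin centers (m) and true positive rates, not the measured results
ranges = np.array([4.0, 6.9, 9.8, 12.7, 15.6])
tpr = np.array([0.95, 0.90, 0.75, 0.55, 0.40])

(r50, e), _ = curve_fit(ttp_curve, ranges, tpr, p0=[10.0, 3.0])
print(f"R50 = {r50:.1f} m, E = {e:.2f}")
```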

The fitted results are plotted in Figs. 5 and 6, with range bins of width 2.9 m chosen to evenly divide the set of ranges (the fitted values are listed in Table 4). Error bars represent 95% binomial confidence intervals. The daylight data show similar detection performance versus python range for RGB visible and NIR, with a small reduction in performance with range. In contrast, performance in monochrome visible is much lower than in the other two bands, with TPR increasing as the python is placed farther from the camera. At night, the 50% TPR range is reduced for all band categories, with performance in NIR and RGB visible significantly higher than in monochrome visible. Performance in NIR is highest both day and night, but the difference between NIR and RGB visible is more pronounced at night.
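The paper does not state which binomial interval was used for the error bars; the sketch below, using the exact Clopper-Pearson interval together with the 2.9 m binning, is one way such error bars could be produced. The bin origin at 3 m (the shortest python range) is an assumption:

```python
from scipy.stats import beta

def binom_ci95(k, n):
    """Exact (Clopper-Pearson) 95% confidence interval for k successes in n trials."""
    lo = beta.ppf(0.025, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(0.975, k + 1, n - k) if k < n else 1.0
    return lo, hi

def range_bin(r, width=2.9, r_min=3.0):
    """Index of the 2.9 m wide range bin containing range r (origin is an assumption)."""
    return int((r - r_min) // width)
```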


Table 4. 50% TPR Range and Exponents from Fitting Performance Data to Eq. (10)


Fig. 7. Example images with glint on a python in the day for (a) monochrome Vis, (b) RGB Vis, and (c) NIR.


5. DISCUSSION

Human subjects performed better at detecting pythons in NIR than in either of the two visible band categories. However, performance in NIR is only marginally better under daytime conditions, whereas at night with controlled illumination the gap in performance is more pronounced. The daytime result is likely caused by solar glint on the python at the viewing angle of the cameras. While glint was more prominent in the images taken in Vis, the NIR images also showed glint on the python. An example of glint in all three types of daytime images is shown in Fig. 7. Most of these images were taken in midafternoon, roughly two hours before sunset, with the cameras facing west. Under normal python search conditions, glint is likely to benefit Vis detection only within about ${\pm}10^\circ$ toward the west; at all other angles, performance should be similar to the nighttime results.


Fig. 8. True positive rate for images during day with a range of python from cameras, excluding glint.


Table 5. 50% TPR Range and Exponents from Fitting Daytime Performance Data to Eq. (10), Excluding Glint

Excluding images in all bands in which there was glint on the python, we refitted the performance data to Eq. (10). The results, with range bins of width 2.9 m, are plotted in Fig. 8, and the fitted values for ${R_{50}}$ and $E$ are listed in Table 5. Removing the data with glint on the python greatly decreases ${R_{50}}$ for all band categories while proportionally widening the gap between NIR and the other two, especially monochrome Vis. This fit for monochrome Vis also makes more physical sense than the fit in Fig. 5: TPR now decreases with range to the python, and the camouflage of the python in Vis is pronounced.

With the daytime glint artifacts excluded, range performance in daytime was better than at night for both RGB Vis and NIR. The 50% detection range in NIR during the day was 1.4 times farther than at night, and the 50% detection range in RGB Vis was 1.3 times farther in daytime than at night. This suggests that improved illumination design or range-dependent contrast adjustment could further improve nighttime performance.


Fig. 9. NIR camera system mounted on the top of a vehicle.


Fig. 10. Images of a living Burmese python in grass taken by (a) a vehicle-mounted NIR camera, and (b) a handheld RGB camera. Red boxes indicate the location of the python.


6. CONCLUSION

Overall, results from this study show that the NIR camera is advantageous for the detection of Burmese pythons over an RGB Vis or monochrome Vis camera, both during daylight and at night with illumination. Results for both day and night illumination conditions show that NIR allowed participants to detect pythons at a 50% success rate at ranges 1.2 times farther than RGB Vis at night and 1.3 times farther during the day. Detection performance for monochrome Vis fell short of the other two band categories, with an ${R_{50}}$ of barely half the NIR detection range at night and one fifth of the NIR detection range during the day.

These results agree with predictions based on the reflectivities of Burmese pythons compared with native foliage, as summarized in Fig. 1. Across the Vis band, the reflectivities provided low contrast, as demonstrated by the poor detection performance in monochrome Vis. The RGB channels may have added color contrast against particularly green foliage, allowing better detection performance, but there was still very little difference between python and foliage reflectivities compared with the NIR band. The python's lack of camouflage against native foliage in the NIR band allowed for better detection at longer ranges; thus, the NIR band is valuable for aiding detection of Burmese pythons in the Florida Everglades National Park.

Going forward, we will use this information to develop more robust NIR camera systems for field use in the Florida Everglades National Park to better search for and locate Burmese pythons. To do this, additional studies in time-limited human observer search performance will be conducted, using methods established by Friedman for detection of static targets using dynamic sensors [10].

Figure 9 is an example of a preliminary vehicle-mounted system that was used for an initial field test of the NIR camera. The system consisted of an NIR camera and accompanying NIR illuminator on a pan-tilt stand, a laptop, and a large display. Additional visible LED illumination was already included on the vehicle for search with the naked eye.

Figure 10 compares an image of a live python in a patch of tall grass taken with the vehicle-mounted system to an RGB image of the same scene taken with a handheld camera. Both images were taken while the vehicle was stationary. During this test, the shading of the NIR image was inverted and the gain was increased to saturate the surrounding foliage, yielding higher contrast for the exposed sections of the python and thus allowing easy detection. In Vis RGB, however, the python was virtually undetectable due to its natural camouflage, even though it was less than 5 m from the camera. In future iterations of a vehicle-mounted NIR system, we will optimize the search area and image contrast and enable computer-aided detection of pythons from a moving vehicle. Training data for this effort will consist of imagery taken on site in Everglades National Park.

Disclosures

The authors report no conflicts of interest in this work.

Data Availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

REFERENCES

1. M. E. Dorcas, J. D. Willson, R. N. Reed, R. W. Snow, M. R. Rochford, M. A. Miller, W. E. Meshaka, P. T. Andreadis, F. J. Mazzotti, C. M. Romagosa, and K. M. Hart, “Severe mammal declines coincide with proliferation of invasive Burmese pythons in Everglades National Park,” Proc. Natl. Acad. Sci. USA 109, 2418–2422 (2012). [CrossRef]

2. G. H. Rodda, C. S. Jarnevich, and R. N. Reed, “What parts of the US mainland are climatically suitable for invasive alien pythons spreading from Everglades National Park?” Biol. Invasions 11, 241–252 (2009). [CrossRef]  

3. M. L. Avery, J. S. Humphrey, K. L. Kreacher, and W. E. Bruce, “Detection and removal of invasive Burmese pythons: methods development update,” in Proceedings of the Vertebrate Pest Conference (University of California Agriculture & Natural Resources, 2014).

4. R. Driggers, O. Furxhi, G. Vaca, V. Reumers, M. Vazimali, R. Short, P. Agrawal, A. Lambrechts, W. Charle, K. Vunckx, and C. Arvidson, “Burmese python target reflectivity compared to natural Florida foliage background reflectivity,” Appl. Opt. 58, D98–D104 (2019). [CrossRef]  

5. “EO-1312 NIR USB 3.0 Camera,” Edmund Optics, 2020 [Online]. Available: https://www.edmundoptics.com/p/eo-1312-nir-usb-30-camera/37395/.

6. “BFS-U3-13Y3C-C USB 3.1 Blackfly S, Color Camera,” Edmund Optics, 2020 [Online]. Available: https://www.edmundoptics.com/p/bfs-u3-13y3c-c-usb3-blackfly-reg-s-color-camera/37236/.

7. “CMVision CM-IR110 114 LEDS 200-300ft long range IR illuminator,” CMVision, 2020 [Online]. Available: http://www.cmvision.co/products/cmvision-cm-ir110-114-leds-200-300ft-long-range-ir-illuminator.html.

8. G. Saravanan, G. Yamuna, and S. Nandhini, “Real time implementation of RGB to HSV/HSI/HSL and its reverse color space models,” in IEEE International Conference on Communication and Signal Processing (ICCSP), Melmaruvathur, India (2016).

9. R. Vollmerhausen, E. Jacobs, and R. Driggers, “New metric for target acquisition performance,” Opt. Eng. 43 (2004), doi:10.1117/1.1799111. [CrossRef]

10. M. Friedman and S. Moyer, “Initial verification of an analytical model for target acquisition of multiple moving targets using multiple moving imaging sensors,” U.S. Army Night Vision and Electronic Sensors Directorate (NVESD), Fort Belvoir, Virginia (2014).
