
Visible light communication using dual camera on one smartphone


Abstract

Dual cameras are becoming increasingly prevalent in smartphone camera designs. This paper demonstrates a system prototype that uses the color and monochrome cameras of one smartphone simultaneously for visible light communication. To achieve this, we propose a novel dual-modulation scheme. The baseband signal is first modulated by color ratio modulation-color shift keying (CRM-CSK) to broadcast color ratio information that can be distinguished by the color camera. On top of this, gray level modulation (GLM) is applied to give each CRM symbol a gray level that can be distinguished by the monochrome camera. Our experiment shows a significant improvement in the downlink data rate of optical camera communication (OCC) using a single light source.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

With continuous demands for energy saving and environmental protection, solid state lighting (SSL) with light-emitting diodes (LEDs) is gradually replacing conventional mercury-based lamps. In addition to the high-quality lighting provided by LED devices, their high modulation bandwidth also makes visible light communication (VLC) practical [1]. VLC, as the name indicates, uses a high-speed modulated LED for short-range data transmission. The wireless optical signal can be detected by a photodiode (PD) with a trans-impedance amplifier (TIA) [1–4]. However, using a PD sensor is challenging in current VLC applications since a well-designed TIA circuit is complex and costly [5,8]. Moreover, a PD sensor is difficult to integrate seamlessly into smart devices such as mobile phones, computers and televisions [7]. As integrated VLC chips have yet to reach the market, researchers are pursuing a universal way of receiving the wireless optical signal [6]. In [7], smartphone cameras are proposed to accomplish this task. The cameras used in smartphones are manufactured with CMOS technology, in which the sensor pixels are read out by progressive scanning. In progressive scanning mode, pixels in the camera's sensing area are activated row by row until the whole image has been scanned. This characteristic makes it possible to detect a wireless optical signal. In the image captured by a CMOS camera, each row represents a one-dimensional time series whose sampling interval is defined by the column pixel scanning time. Generally speaking, the column pixel scanning time is extremely short; if the switching frequency of a light source is much lower than the column pixel scanning frequency, the switching status of the light source will be recorded as a "bar" pattern, commonly called the "rolling shutter pattern" of the CMOS camera [9–11].

Unfortunately, the CMOS camera receiving scheme still faces some issues [12–14]. The CMOS cameras in commercial smartphones have frame rates of 20 to 60 frames per second (fps) [15]. Such a low frame rate limits the maximum data rate that a CMOS camera can deliver. To overcome this issue, researchers have proposed several enhancement techniques. In [12], the authors used a red-green-blue (RGB) LED to broadcast a signal modulated by RGB wavelength-division multiplexing (RGB-WDM), increasing the data rate to 2.88 kbit/s with a 30 fps CMOS camera. A multi-level gray scale scheme was proposed in [10], in which two white LEDs were overlapped to transmit a signal with three brightness levels. With a low-pass filter, the data rate was increased to 4.32 kbit/s with the CMOS camera operating at 60 fps. A quadrichromatic LED was utilized in [16], in which color ratio modulation-color shift keying (CRM-CSK) was proposed to improve the data rate and enhance the illumination at the same time; the data rate was increased to 13.2 kbit/s under modern illumination constraints (color temperature and color rendering index). A multiple-input multiple-output (MIMO) scheme was proposed in [17], where MIMO-RGB-WDM modulation is used to improve the data rate of an RGB LED.

The improved schemes introduced above all adopt the channel model of one optical source to one camera. Meanwhile, with the advancement of image sensors, photos of high imaging quality are pursued by more and more people, so smartphones with dual cameras are becoming increasingly popular. A dual camera is usually composed of an RGB color camera and a monochrome camera; the two cameras each contribute their own strengths to produce photos with fine detail. In this paper, we demonstrate a system prototype that uses the dual camera of one smartphone to receive a dual-modulation signal. The baseband digital signal is modulated by the CRM-CSK scheme to carry "color" information that can be distinguished by the RGB image sensor. Following that, GLM is used to modify the "brightness" of each CRM-CSK symbol, which can be distinguished by the monochrome camera. Finally, a data rate test is carried out to examine the system performance.

2. System scheme

2.1 Channel models of RGB camera and monochrome camera

Dual cameras are used more and more widely in smartphone imaging. A dual camera is usually composed of an RGB color camera and a monochrome camera. As shown in Fig. 1(b), a camera consists of several components. The camera aperture size and exposure time determine the luminous flux that the camera is exposed to during scanning. For the RGB color camera, the dense pixels on the image sensor are covered by optical filters in the Bayer pattern to produce the response photocurrent. After a series of imaging algorithms, the output image is composed of image pixels with three attributes, namely the color intensities in the R, G and B sub-channels. Denote the ith pixel column as $P_i$, which can be expressed as:

$$P_i=\begin{bmatrix}p_{i,1}&p_{i,2}&\cdots&p_{i,K}\end{bmatrix}^T,$$
where $p_{i,j}$ denotes the image pixel in column i and row j. We use r, g and b to represent the color intensities captured by the camera in the red, green and blue channels, so each pixel can be expressed as:
$$p_{i,j}=\begin{bmatrix}r_{i,j}&g_{i,j}&b_{i,j}\end{bmatrix},$$
The image captured by the RGB color image sensor can be represented as:
$$I_{RGB}=\begin{bmatrix}P_1&P_2&\cdots&P_N\end{bmatrix}=\begin{bmatrix}p_{1,1}&p_{2,1}&\cdots&p_{N,1}\\p_{1,2}&p_{2,2}&\cdots&p_{N,2}\\\vdots&\vdots&\ddots&\vdots\\p_{1,K}&p_{2,K}&\cdots&p_{N,K}\end{bmatrix},$$
where the image resolution is K×N. The optical source is a commercial multicolor LED with the primary colors R, G and B, which is shown in Fig. 1(a). We further denote the normalized spectral power distribution as $S_{LED}$ and the normalized spectral response of the RGB image sensor as $R_{RGB}$:
$$S_{LED}(\lambda)=\begin{bmatrix}R(\lambda)&G(\lambda)&B(\lambda)\end{bmatrix},\qquad R_{RGB}(\lambda)=\begin{bmatrix}r(\lambda)&g(\lambda)&b(\lambda)\end{bmatrix},$$
where λ denotes the wavelength. The color intensity (CI) information on pixel pi,j can be expressed as the following equation:
$$p_{i,j}=255\times\int_{0}^{\sigma}\!\!\int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}}\eta(i,j,d)\,\delta\,R_{RGB}(\lambda)\,I(\phi)\,k_{Dimming}^{T}S_{LED}(\lambda)\,d\lambda\,dt.$$
In Eq. (5), σ and δ denote the exposure time and aperture size respectively, which are governed by the camera software. η(i,j,d) denotes the light intensity distribution at pixel $p_{i,j}$ for a distance d between the RGB camera and the light source. ϕ is the total luminous flux emitted from the RGB LED. The function I(·) describes the nonlinear characteristic of the camera, and $k_{Dimming}$ is the dimming vector that describes the dimming state of the RGB LED, which can be written as:
$$k_{Dimming}=\begin{bmatrix}k_R&k_G&k_B\end{bmatrix}^T.$$
As for the monochrome camera, since there is no color filter array, it can only resolve gray-scale information. For a given dimming vector $k_{Dimming}$, the gray level of pixel $g_{i,j}$ can be expressed as:
$$g_{i,j}=255\times\int_{0}^{\sigma}\!\!\int_{380\,\mathrm{nm}}^{780\,\mathrm{nm}}\eta(i,j,d)\,\delta\,R_{M}(\lambda)\,I(\phi)\,k_{Dimming}^{T}S_{LED}(\lambda)\,d\lambda\,dt,$$
where $R_M(\lambda)$ is the spectral response of the monochrome camera. Similarly, we use $I_{Gray}$ to represent the gray-scale image captured by the monochrome camera:
$$I_{Gray}=\begin{bmatrix}G_1&G_2&\cdots&G_N\end{bmatrix}=\begin{bmatrix}g_{1,1}&g_{2,1}&\cdots&g_{N,1}\\g_{1,2}&g_{2,2}&\cdots&g_{N,2}\\\vdots&\vdots&\ddots&\vdots\\g_{1,K}&g_{2,K}&\cdots&g_{N,K}\end{bmatrix},$$
where $G_i$ denotes the ith column of pixels and $g_{i,j}$ denotes the pixel in row j and column i.
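As an illustration of the camera models in Eqs. (5) and (7), the following Python sketch integrates a dimmed LED spectrum against RGB and monochrome spectral responses to obtain one pixel's color and gray values. The Gaussian-shaped spectra, the unit values of η, δ and σ, and the omission of the nonlinearity I(ϕ) are simplifying assumptions for illustration only; the actual curves are the measured ones shown in Fig. 5.

```python
import numpy as np

# Sketch of Eqs. (5) and (7): integrate the dimmed LED spectrum against the
# camera spectral responses. All spectra below are hypothetical placeholders.
lam = np.linspace(380, 780, 401)                                    # wavelength grid [nm]
gauss = lambda c, w: np.exp(-0.5 * ((lam - c) / w) ** 2)

S_led = np.stack([gauss(630, 15), gauss(525, 20), gauss(465, 12)])  # R(λ), G(λ), B(λ)
R_rgb = np.stack([gauss(600, 40), gauss(540, 45), gauss(460, 40)])  # r(λ), g(λ), b(λ)
R_mono = gauss(550, 120)                                            # R_M(λ)

def pixel_response(k_dimming, eta=1.0, delta=1.0, sigma=1.0):
    """Return ([r, g, b], gray) for one pixel; the nonlinearity I(φ) is ignored."""
    spectrum = k_dimming @ S_led            # k_R·R(λ) + k_G·G(λ) + k_B·B(λ)
    rgb = sigma * eta * delta * np.trapz(R_rgb * spectrum, lam, axis=1)
    gray = sigma * eta * delta * np.trapz(R_mono * spectrum, lam)
    return rgb, gray

print(pixel_response(np.array([0.5, 0.3, 0.2])))
```

In this simplified model, the color ratio of the returned rgb vector depends only on the dimming vector, while the gray value scales with the total luminous flux; this separation is what the dual-modulation scheme exploits.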

Fig. 1 System configuration using the dual camera of a smartphone as the visible light communication receiver. (a) System prototype. (b) Imaging process of the RGB camera and monochrome camera.

2.2 CRM and GLM dual modulation scheme

As Eq. (5) indicates, the color intensity of each sub-channel varies with the communication distance and the light intensity distribution of the optical source. To overcome this problem, the color constellation points are laid out in the color ratio space [16]. A color ratio symbol is obtained by normalizing a given RGB pixel:

$$C=\frac{p_{i,j}}{\lVert p_{i,j}\rVert_1}=\begin{bmatrix}\dfrac{r_{i,j}}{r_{i,j}+g_{i,j}+b_{i,j}}&\dfrac{g_{i,j}}{r_{i,j}+g_{i,j}+b_{i,j}}&\dfrac{b_{i,j}}{r_{i,j}+g_{i,j}+b_{i,j}}\end{bmatrix}.$$
In CRM-CSK, the RGB LED switches rapidly between the designed color ratio symbols. To improve the noise tolerance, the minimum Euclidean distance (MED) between pairwise symbols in the received signal space is taken as the objective function to be maximized:
$$\arg\max_{\{k_1,k_2,\ldots,k_n\}}\ \min\ \lVert C_i-C_j\rVert_2,\qquad(i\ne j;\ C_i,C_j\in \mathrm{CRM}).$$
In Eq. (10), n denotes the number of CRM-CSK symbols. By solving this optimization problem, we can obtain the collection of CRM-CSK symbols and its dimming reference respectively:
$$\mathrm{CRM}=\{C_1,C_2,\ldots,C_n\},\qquad k_{CRM}=\{k_1,k_2,\ldots,k_n\}.$$
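A simple way to visualize the max-min MED design in Eq. (10) is a random search over normalized dimming vectors in color-ratio space, as in the Python sketch below. This is only an illustrative stand-in for the MATLAB optimization used in the paper; it ignores channel crosstalk and the camera response, and the function name and trial budget are arbitrary.

```python
import numpy as np

def design_crm_constellation(n, trials=20000, seed=0):
    """Random-search sketch of Eq. (10): choose n normalized dimming vectors
    (color-ratio points) that maximize the minimum pairwise Euclidean distance."""
    rng = np.random.default_rng(seed)
    best, best_med = None, -1.0
    for _ in range(trials):
        k = rng.random((n, 3))
        k /= k.sum(axis=1, keepdims=True)            # each row is a color ratio (sums to 1)
        d = np.linalg.norm(k[:, None, :] - k[None, :, :], axis=-1)
        med = d[np.triu_indices(n, k=1)].min()       # minimum pairwise distance (MED)
        if med > best_med:
            best, best_med = k, med
    return best, best_med

crm, med = design_crm_constellation(n=4)
print(med)
```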
Note that the elements in the collection $k_{CRM}$ are normalized, so altering the total luminous flux ϕ will not affect the CRM results. However, by modulating the total luminous flux of each CRM symbol, the change in light intensity can be perceived by the monochrome camera. Hence, for gray level modulation, the total luminous flux of each CRM-CSK symbol in the collection CRM needs to be modulated to carry gray level information. The optimal design for a given CRM symbol $C_k$ can be obtained from the following equation:
$$\arg\max_{\{\phi_1,\phi_2,\ldots,\phi_m\}}\ \min\ \lVert G_i-G_j\rVert_1,\qquad(i\ne j;\ G_i,G_j\in \mathrm{GLM}\cup\{0\}),$$
where the collection of GLM levels for the CRM symbol $C_k$ and its dimming reference are obtained respectively as:
$$\mathrm{GLM}=\{G_1,G_2,\ldots,G_m\},\qquad k_{GLM}(C_k)=\{\phi_1,\phi_2,\ldots,\phi_m\}.$$
Figure 2 schematically shows the modulation procedure. Consecutive bit strings of length $\log_2 n+\log_2 m$ are extracted from the bit stream to be sent. By group mapping, each string is mapped to a CRM constellation point $C_x\ (x=1,2,\ldots,n)$ in the constellation set CRM and a GLM constellation point $G_y\ (y=1,2,\ldots,m)$ in GLM successively, by referring to $k_{CRM}$ and $k_{GLM}(C_k)$; the CRM and GLM settings are then applied to the RGB LED during one symbol duration.
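The group mapping described above can be sketched as follows; the helper below is illustrative and assumes the bit stream length is a multiple of the per-symbol bit width.

```python
import numpy as np

def map_bits_to_symbols(bits, n_crm=16, n_glm=4):
    """Sketch of the mapping in Fig. 2: every log2(n)+log2(m) bits select one
    CRM-CSK constellation index and one GLM level index for a symbol duration."""
    bc, bg = int(np.log2(n_crm)), int(np.log2(n_glm))
    step = bc + bg
    symbols = []
    for i in range(0, len(bits) - step + 1, step):
        chunk = bits[i:i + step]
        crm_idx = int("".join(map(str, chunk[:bc])), 2)   # index into CRM / k_CRM
        glm_idx = int("".join(map(str, chunk[bc:])), 2)   # index into GLM / k_GLM(C_k)
        symbols.append((crm_idx, glm_idx))
    return symbols

# 12 bits with 16-CRM-CSK and 4-GLM give two 6-bit symbols.
print(map_bits_to_symbols([1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0]))
```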

Fig. 2 CRM and GLM dual-modulation scheme for the RGB LED transmitter.

2.3 Data frame structure and illumination support

As the diagrammatic sketch of the camera scanning process in Fig. 3 shows, image frames on a smartphone platform usually do not arrive at stable intervals because of the task scheduling performed by the operating system. Between two successive image frames there is a processing time. Depending on the running context, this processing time can vary over a range and is unpredictable. Hence, it is not possible to record the whole LED transmission process. To increase system reliability, several measures have been taken. First, each data frame is transmitted three times within one image frame duration to ensure that a data frame can be recorded completely. However, due to the unstable processing time, if the smartphone system is under a heavy workload, this time interval can grow considerably. Hence, the receiver must be able to detect a frame fault. To solve this, a frame descriptor carrying the frame index is attached to the payload of each data frame. Moreover, since the LED transmits in a circulatory (repeating) mode, if a data frame is lost, the receiver can extract the missing frame in the next transmission cycle. Another problem addressed by the data frame structure is the illumination constraint. Two basic lighting functions, brightness and color, are realized by a "light trimming" method. As shown in Fig. 3, illumination support symbols are inserted at the end of the data frame. The idea is based on time-mixing of colors: during a data frame duration, the perceived light spectrum is the time integral of all symbols [16]. We use three illumination support symbols to adjust the RGB ratio dynamically so that the time integral of each data frame remains consistent. It is worth noting that, although the dimming range of an LED is limited, by extending the time duration of the illumination support symbols the device can always keep the color constant.
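The following sketch shows one possible way to assemble such a data frame; the field layout (descriptor, payload, three illumination support symbols, repetition count) is illustrative rather than the exact frame format used in the prototype.

```python
def build_frame(payload_symbols, frame_index, support_symbols, repeats=3):
    """Illustrative frame builder for Fig. 3: attach a frame descriptor carrying
    the frame index, append illumination support symbols that restore the target
    color mix, and repeat the frame so one full copy falls inside an image frame."""
    descriptor = [("DESC", frame_index)]
    one_frame = descriptor + list(payload_symbols) + list(support_symbols)
    return one_frame * repeats

frame = build_frame(payload_symbols=[(11, 1), (3, 0)],
                    frame_index=7,
                    support_symbols=[("ILLUM", c) for c in ("R", "G", "B")])
print(frame)
```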

Fig. 3 Data frame structure to handle the "processing time" and illumination support.

2.4 Experiment platform and pre-experiment

Table 1 gives the material list and several critical camera operating parameters. The experiment platform is shown in Fig. 4 and can be divided into two parts, namely the TX and RX. An RGB LED spotlight is built as the optical light source, in which the RGB LED is controlled by the MCU through a DAC analog dimming circuit via a serial peripheral interface (SPI) bus. The dimming circuit consists of three LED dimmers (PT4115) that manage the driving currents of the three color LEDs respectively. The LED is covered with a diffuser, which reduces the uneven distribution of light and mitigates the "blooming effect" in the image. The vertical distance between the image sensors and the light source is set to 3 cm to make sure that the rolling shutter pattern fully occupies the image frame.

Table 1. System parameters.

Fig. 4 Experiment platform.

An integrating sphere (HASS2000) is utilized for measuring the optoelectronic parameters of the RGB LED. The normalized SPD of the RGB LED and the spectral responses of the dual camera are given in Fig. 5.

Fig. 5 Normalized SPD of the RGB LED and the spectral response of the dual camera.

The size of the constellation set should be an integer power of 2 so that each communication symbol carries an integer number of bits. However, as shown in Fig. 5, channel crosstalk limits the maximum constellation size. In our experiment, we found that 16 and 4 are the maximum sizes for the CRM-CSK and GLM constellation sets, respectively, that balance data rate and bit error rate. Solving the optimization problem in Eq. (10) with MATLAB, the CRM-CSK constellation sets of size four, eight and sixteen are shown in Fig. 6. For the GLM constellations, it is obvious that they should be placed on equidistant points (excluding zero) in the one-dimensional gray-level space (a line) to meet the maximum pairwise MED criterion.
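With these maximum sizes, each transmitted symbol therefore carries

$$\log_2 16+\log_2 4=4+2=6\ \text{bits},$$

which is the per-symbol bit load of the highest-rate configuration reported in Section 3.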

Fig. 6 (a) Optimum layout of four, eight and sixteen CRM-CSK constellations in two-dimensional color ratio space. (b) Optimum layout of two and four GLM constellations in one-dimensional gray-level space.

For the channel model of the monochrome camera, the function I(ϕ) in Eq. (7) describes the nonlinear effect in light intensity detection. As obtaining the complete channel characteristic is tedious, we fit the gray-level response curve of the monochrome camera to the RGB LED as prior knowledge for the dimming reference. This is realized by stepping the driving current of the RGB LED in increments of ΔI, which is a simple task with the integrating sphere equipment. The results are shown in Fig. 7. We divide the gray-level response curve into three zones. In the "dead zone", the light intensity is too weak to produce photocurrent, so the gray level in the captured image is not sensitive to variations in light intensity. Similarly, in the "saturation zone" the response curve is again insensitive to the light intensity. The working zone can be adjusted by several parameters such as the communication distance, aperture size, exposure time, ISO and the luminous flux of the LED.

In the communication system, a commercial mobile phone with a dual camera acts as the optical signal receiver. Generally, the RGB camera and the monochrome camera have different column pixel scanning frequencies; hence, the signal frequency needs to be low enough that the optical signal can be successfully recorded by both cameras. In our experiment, we confirm that 5–10 pixels per symbol for the RGB camera (including the "black" gap between two adjacent symbols, which provides the synchronization signal) and 10–20 pixels per symbol for the monochrome camera are the best choices to balance data rate and system stability. Moreover, appropriate working parameters for the dual camera also play a vital role in obtaining good image quality. In an OCC system, the camera exposure time and light intake are vital parameters that affect the imaging process. According to the camera model equations established in Eq. (5) and Eq. (7), the exposure time has two impacts on the final image, namely image brightness and signal sampling rate. The exposure time determines the integration time of the pixels in each column and hence affects the image brightness; since exposure also takes time, a longer exposure time also means a lower sampling rate. Once the exposure time is fixed, the camera light intake is determined mainly by parameters such as the aperture size. However, in smartphone cameras the aperture size is usually fixed and users cannot tune it in the camera software; instead, another tunable parameter, ISO, is provided for adjusting the light exposure. ISO refers to the photosensitivity of the camera device. For the RGB camera, we adopt a short exposure time and a lower ISO setting to prevent "overflowing", the phenomenon in which the light intensity exceeds the range that the camera's digital register can represent. For the monochrome camera, we adopt a short exposure time but a larger ISO value to improve the contrast ratio (working in the linear zone).
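As a concrete illustration of how the curve in Fig. 7 can serve as a dimming reference, the Python sketch below fits only the linear working zone of a hypothetical measured gray-level response and inverts the fit to find the flux settings for equidistant GLM levels. The sample points, zone thresholds and linear fit are assumptions for illustration, not the measured data.

```python
import numpy as np

# Hypothetical measured points standing in for the curve in Fig. 7.
flux = np.array([5, 10, 20, 40, 60, 80, 100, 120, 140])        # relative luminous flux
gray = np.array([2, 6, 30, 90, 140, 185, 220, 245, 252])       # captured gray level

work = (gray > 20) & (gray < 230)                               # exclude dead/saturation zones
a, b = np.polyfit(flux[work], gray[work], deg=1)                # gray ≈ a·flux + b in working zone

def flux_for_gray(target):
    """Invert the working-zone fit to get the drive setting for a target gray level."""
    return (target - b) / a

glm_levels = np.linspace(60, 220, 4)                            # four equidistant GLM levels
print([round(flux_for_gray(g), 1) for g in glm_levels])
```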

Fig. 7 Gray-level tuning reference for the monochrome camera.

2.5 Demodulation

The demodulation algorithm can be divided into two separate parts that handle the "color" information and the gray level information respectively. Android phones with a system version higher than 8.0 can obtain the raw image directly, without the post-processing applied by the camera software. As image post-processing can alter the relative ratios of the RGB elements in the captured images, the decoding algorithm for the RGB camera should be performed on the raw image to ensure the best bit error ratio (BER) performance. This principle also applies to the monochrome camera. The demodulation procedure for the RGB image is shown in Fig. 8(a). Multi-threading is used to process the images concurrently and improve the real-time ability of the system. In each image processing thread, image frames are first extracted from the corresponding camera to create a continuous image stream. This process is managed by dedicated hardware, which controls the camera hardware interface. Image acquisition takes time; once a full image frame has been captured, a hardware interrupt is triggered to notify the camera user that an image frame needs to be processed. The decoding algorithm is performed in the time slot between two image frames. Decoding an image frame starts with row pixel extraction, which yields a one-dimensional time series from the RGB image stream and from the gray-scale image respectively. Research [15–17] shows that the light intensity distribution function η(i,j,d) of the "rolling shutter" pattern is a second-order exponential function of i and j, so extracting the middle row of pixels ensures a higher contrast ratio. We define the one-dimensional time series extracted from the RGB color image and the gray-scale image as:

$$t_{RGB}=[I_{RGB}]_{:,K/2}=\begin{bmatrix}p_{1,K/2}&p_{2,K/2}&\cdots&p_{N,K/2}\end{bmatrix},\qquad t_{Gray}=[I_{Gray}]_{:,K/2}=\begin{bmatrix}g_{1,K/2}&g_{2,K/2}&\cdots&g_{N,K/2}\end{bmatrix},$$
where $[\cdot]_{x,y}$ denotes the element in the xth column and yth row of a matrix. To obtain the edge positions of the signal, the forward difference algorithm [18] is applied to the one-dimensional time series defined in Eq. (14), which can be expressed as:
$$[t'_{RGB}]_i=\lVert[t_{RGB}]_{i+1}\rVert_1-\lVert[t_{RGB}]_i\rVert_1,\quad(1\le i\le N-1)$$
$$[t'_{Gray}]_i=[t_{Gray}]_{i+1}-[t_{Gray}]_i,\quad(1\le i\le N-1),$$
where N is the sequence length of $t_{RGB}$ and $t_{Gray}$. The edges of the "bar" pattern can be found using the "findpeaks" function in MATLAB, which returns the local maxima of a sequence and the corresponding indices. We use the vectors $e_{min}$ and $e_{max}$ to record the pixel indices at which local minima and local maxima appear:
$$e_{max}^{(RGB)}=\mathrm{findpeaks}(t'_{RGB}),\quad e_{min}^{(RGB)}=\mathrm{findpeaks}(-t'_{RGB}),$$
$$e_{max}^{(Gray)}=\mathrm{findpeaks}(t'_{Gray}),\quad e_{min}^{(Gray)}=\mathrm{findpeaks}(-t'_{Gray}).$$
Here $[e_{max}]_i$ and $[e_{min}]_i$ give the start and end positions of the ith "bar" pattern in the image frame, respectively. Since a random binary sequence generator is used as the data source, the whole system can be treated as a channel with equally likely input symbols. We further assume that the system noise obeys a Gaussian distribution. According to optimum reception theory, the optimal symbol detection scheme is then the maximum likelihood (ML) detector. Hence the demapping criteria for CRM and GLM symbols can be represented by:
$$\arg\min_{C\in \mathrm{CRM}}\ \sum_{i=\mathrm{start}}^{\mathrm{end}}\lVert C-p_{i,K/2}\rVert_2$$
and
$$\arg\min_{G\in \mathrm{GLM}(C)}\ \sum_{i=\mathrm{start}}^{\mathrm{end}}\lVert G-g_{i,K/2}\rVert_2$$
respectively. Note that the image streams from the two cameras are not synchronous, so data merging must refer to the frame descriptor.
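A compact Python sketch of the monochrome demodulation chain (middle-row extraction is assumed to have been done already) is given below. It uses scipy.signal.find_peaks in place of MATLAB's findpeaks and replaces the per-pixel sum in Eq. (18) with the equivalent nearest-level decision on each bar's mean gray value; the toy row, gap distance and level set are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def demodulate_gray_row(row, glm_levels, min_gap=5):
    """Sketch of Eqs. (15)-(18) for the gray-scale stream: forward difference,
    edge detection on the derivative, then ML (nearest-level) demapping per bar."""
    diff = np.diff(row.astype(float))                  # forward difference, Eq. (15)
    starts, _ = find_peaks(diff, distance=min_gap)     # rising edges  -> e_max
    ends, _ = find_peaks(-diff, distance=min_gap)      # falling edges -> e_min
    symbols = []
    for s in starts:
        later = ends[ends > s]
        if later.size == 0:
            break
        e = later[0]
        level = row[s + 1:e + 1].mean()                # mean gray level of this bar
        symbols.append(int(np.argmin(np.abs(glm_levels - level))))  # nearest GLM level
    return symbols

# Toy row: two bright bars (gray ≈ 200 and ≈ 90) separated by dark synchronization gaps.
row = np.array([0] * 10 + [200] * 15 + [0] * 10 + [90] * 15 + [0] * 10)
print(demodulate_gray_row(row, glm_levels=np.array([60, 113, 167, 220])))
```

The RGB stream would be handled analogously, with each bar's pixels normalized to color ratios (Eq. (9)) before matching against the CRM constellation (Eq. (17)).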

Fig. 8 Diagram of receiving the visible light signal using the dual camera on one smartphone. (a) Demodulation procedure. (b) Algorithm.

3. Experiment

In the experiment design, random binary data generated by MATLAB is loaded into the MCU in advance. Each transmission lasts 2 minutes to obtain transmission results. We investigate the data rate and BER performance under different illumination levels. The transmission experiment was done six times using six modulation levels: 4-CRM-CSK&2-GLM, 8-CRM-CSK&2-GLM, 16-CRM-CSK&2-GLM, 4-CRM-CSK&4-GLM, 8-CRM-CSK&4-GLM and 16-CRM-CSK&4-GLM. The results are shown in Fig. 9(a). We further give the average BER curves of CRM-CSK and GLM at a given modulation level in Fig. 9(b) and Fig. 9(c), respectively. It is obvious that the BER curves of the RGB camera and the monochrome camera show diametrically opposite trends as the luminance increases. For the RGB camera, as the binary information is borne in the color ratio of the communication symbols, higher luminance may lead to pixel overflow. Consequently, clock synchronization failure caused by adjacent pixel overflow can occur. Moreover, overflowed pixels lose their color ratio information. An example of an "overflowed" pattern is shown in Fig. 10(a). For the monochrome camera, however, things are different. For the demodulation of the GLM signal, the absolute gray level is the only criterion, but the light intensity distribution is not even because of the "blooming effect" caused by the LED directivity. As shown in Fig. 10(b), which compares the patterns under two luminance levels, the image edge is not clear under low luminance compared with higher luminance. In this case, the edge pixels work in the dead zone, where they are not sensitive to the light intensity. A higher luminance brings a higher contrast ratio, which mitigates the intensity attenuation from the "blooming effect".

Fig. 9 (a) BER curves of different modulation levels under different illumination levels. (b) Average BER of 4-, 8- and 16-CRM-CSK under different illumination levels in the transmission experiment. (c) Average BER of 2- and 4-GLM under different illumination levels in the transmission experiment.

Fig. 10 (a) In the captured RGB image, higher luminance may lead to charge overflow. (b) In the captured gray-scale image, higher luminance results in a higher contrast ratio, which mitigates the "blooming effect" caused by uneven light.

Table 2 shows the real-time data rates measured in the experiment. To our knowledge, among single-LED OCC systems, our dual-modulation scheme achieves the highest data rate (Table 3).

Table 2. Data rate of different modulation levels

Table 3. Data rate comparison of existing single-LED OCC systems

4. Conclusion

We propose the use of a dual camera for the first time for receiving wireless visible light signals. As the dual camera consists of an RGB image sensor and a monochrome image sensor, CRM and GLM are applied simultaneously to the communication symbols. At the receiver, the CRM and GLM signals can be demodulated independently from the image streams of the RGB camera and the monochrome camera, respectively. Experiments show that the real-time downlink data rate can reach 17.1 kbit/s with a single RGB LED light source using 16-CRM-CSK and 4-GLM. However, since the experiment relies on a pre-experiment to adjust the dual-camera parameters, a dedicated adaptive algorithm should be researched in the future.

Funding

National Key Research and Development Program of China (2016YFB0400103); GDAS' Project of Science and Technology Development (2017GDASCX-0703, 2017GDASCX-0862, 2017GDASCX-0112, 2018GDASCX-0112); Industry-University-Research Cooperation Innovation Major project of Guangzhou (201604046027); Key Projects of Guangdong (2017A070701025); Pearl River Nova Program of Guangzhou (201806010087); The Technical transfer project of Zhongshan and Guangdong Academy of Sciences (2016G1FC0012, 2017G1FC0005, 2017G1FC0010).

References

1. S. Rajagopal, R. D. Roberts, and S. K. Lim, “IEEE 802.15.7 visible light communication: modulation schemes and dimming support,” IEEE Commun. Mag. 50(3), 72–82 (2012). [CrossRef]  

2. A. Jovicic, J. Li, and T. Richardson, “Visible light communication: opportunities, challenges and the path to market,” IEEE Commun. Mag. 51(12), 26–32 (2013). [CrossRef]  

3. L. Grobe, A. Paraskevopoulos, J. Hilt, D. Schulz, F. Lassak, F. Hartlieb, C. Kottke, V. Jungnickel, and K.-D. Langer, “High-speed visible light communication systems,” IEEE Commun. Mag. 51(12), 60–66 (2013). [CrossRef]  

4. T. Komine and M. Nakagawa, “Integrated system of white LED visible-light communication and power-line communication,” IEEE Trans. Consum. Electron. 49(1), 71–79 (2003). [CrossRef]  

5. O. Ergul, E. Dinc, and O. B. Akan, “Communicate to illuminate: State-of-the-art and research challenges for visible light communications,” Phys. Commun. 17, 72–85 (2015). [CrossRef]  

6. S. Rajbhandari, H. Chun, G. Faulkner, K. Cameron, A. V. N. Jalajakumari, R. Henderson, D. Tsonev, M. Ijaz, Z. Chen, H. Haas, E. Xie, J. J. D. McKendry, J. Herrnsdorf, E. Gu, M. D. Dawson, and D. O’Brien, “High-speed integrated visible light communication system: device constraints and design considerations,” IEEE J. Sel. Areas Comm. 33(9), 1750–1757 (2015). [CrossRef]  

7. I. Takai, S. Ito, K. Yasutomi, K. Kagawa, M. Andoh, and S. Kawahito, “LED and CMOS image sensor based optical wireless communication system for automotive applications,” IEEE Photonics J. 5(5), 6801418 (2013). [CrossRef]  

8. L. Zhang, D. Chitnis, H. Chun, S. Rajbhandari, G. Faulkner, D. O’Brien, and S. Collins, “A comparison of APD and SPAD based receivers for visible light communications,” J. Lightwave Technol. 36(12), 2435–2442 (2018). [CrossRef]  

9. C. W. Chen, C. W. Chow, Y. Liu, and C. H. Yeh, “Efficient demodulation scheme for rolling-shutter-patterning of CMOS image sensor based visible light communications,” Opt. Express 25(20), 24362–24367 (2017). [CrossRef]   [PubMed]  

10. J. Shi, J. He, J. He, R. Deng, Y. Wei, F. Long, Y. Cheng, and L. Chen, “Multilevel modulation scheme using the overlapping of two light sources for visible light communication with mobile phone camera,” Opt. Express 25(14), 15905–15912 (2017). [CrossRef]   [PubMed]  

11. H. C. N. Premachandra, T. Yendo, and M. P. Tehrani, “High-speed-camera image processing based LED traffic light detection for road-to-vehicle visible light communication,” Intelligent Vehicles Symposium, 793–798 (2010). [CrossRef]  

12. S. H. Chen and C. W. Chow, “Color-filter-free spatial visible light communication using RGB-LED and mobile-phone camera,” Opt. Express 22(25), 30713–30718 (2014). [CrossRef]   [PubMed]  

13. S. H. Chen and C. W. Chow, “Color-shift keying and code-division multiple-access transmission for RGB-LED visible light communications using mobile phone camera,” IEEE Photonics J. 6(6), 1–6 (2017).

14. G. Corbellini, K. Aksit, S. Schmid, S. Mangold, and T. R. Gross, “Connecting networks of toys and smartphones with visible light communication,” IEEE Commun. Mag. 52(7), 72–78 (2014). [CrossRef]  

15. C. W. Chow, C. Y. Chen, and S. H. Chen, “Visible light communication using mobile-phone camera with data rate higher than frame rate,” Opt. Express 23(20), 26080–26085 (2015). [CrossRef]   [PubMed]  

16. H. Chen, X. Z. Lai, P. Chen, Y. T. Liu, M. Y. Yu, Z. H. Liu, and Z. J. Zhu, “Quadrichromatic LED based mobile phone camera visible light communication,” Opt. Express 26(13), 17132–17144 (2018). [CrossRef]   [PubMed]  

17. K. Liang, C. W. Chow, and Y. Liu, “RGB visible light communication using mobile-phone camera and multi-input multi-output,” Opt. Express 24(9), 9383–9388 (2016). [CrossRef]   [PubMed]  

18. Y. Liu, “Decoding mobile-phone image sensor rolling shutter effect for visible light communications,” Opt. Eng. 55(1), 016103 (2016). [CrossRef]  

19. T. H. Do and M. Yoo, “Performance Analysis of Visible Light Communication Using CMOS Sensors,” Sensors (Basel) 16(3), 309 (2016). [CrossRef]   [PubMed]  
