Optica Publishing Group

Objective visual comfort evaluation method based on disparity information and motion for stereoscopic video

Open Access

Abstract

In this paper, we propose an objective evaluation method that considers three major factors affecting the visual comfort of viewing a stereoscopic video: the horizontal disparity, the vertical disparity, and the motion. Three experiments are conducted to investigate the effect of these factors on visual comfort. The experimental results show that visual comfort decreases as the disparity magnitude and the motion velocity increase, and that both fast planar motion and fast depth motion have a negative impact on visual comfort. Finally, we verify the performance of the proposed overall evaluation measure. The experimental results show that the proposed objective evaluation method exhibits good consistency with the subjective evaluation results.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

With the development of industries related to stereoscopic content, it is now possible to play stereoscopic video on TV, the Internet, and mobile devices, and the field changes with each passing day [1, 2]. However, unwanted side effects in the form of different types of visual discomfort can occur while one is participating in the stereoscopic experience [3]. A stereoscopic video with excessive disparity or depth inconsistency can induce unwanted visual discomfort, such as asthenopia, eye strain, headache, and other phenomena leading to a bad viewing experience [4]. Visual discomfort has a negative impact on the development of stereoscopic display technology and related industries, so the demand for improving the visual comfort of stereoscopic video is becoming more and more urgent.

Many researchers have investigated factors that decrease visual comfort, including the conflict between accommodation and vergence [5], disparity distribution [6], vertical disparity [7,8], prolonged viewing [9], viewing distance [10], depth inconsistencies, and perceptual and cognitive inconsistencies [11]. Among these factors, the conflict between vergence and accommodation is the main one inducing visual discomfort when watching stereoscopic video [4,12]. Binocular horizontal disparity provides a depth cue for human visual perception. However, excessive binocular horizontal disparity aggravates the conflict between vergence and accommodation, which in turn causes visual discomfort. Therefore, binocular horizontal disparity is a very important factor for quantifying the visual discomfort of stereoscopic video.

Several studies have focused on the problems of visual discomfort arising from horizontal disparity. Kooi and Toet [13] and Speranza et al. [14] investigated the effect of excessive horizontal disparity on visual comfort. Studies for determining the comfortable limit of horizontal disparity are reported in [12,15]. Shibata et al. [12] measured the comfortable range of horizontal angular disparity at different viewing distances through subjective experiments. The 3D Consortium [15] established safety guidelines for creating comfortable stereoscopic content and preventing visual discomfort, recommending that disparity be kept within +/−1 degree. Figure 1 shows the comfortable range plotted as horizontal disparity in degrees as a function of the viewing distance in meters. The dotted lines indicate +/−1 degree, and the solid lines represent the comfortable limit in [12]. It can be seen from Fig. 1 that the comfortable limit in [12] is greater than +/−1 degree, and that the effects of uncrossed disparity (negative values) and crossed disparity (positive values) on visual comfort are not strictly symmetrical. When the viewing distance is taken into account, crossed disparity is more likely to cause visual discomfort than uncrossed disparity at a short distance (less than 1 m), whereas at a long distance (more than 1 m), uncrossed disparity is more prone to cause visual discomfort than crossed disparity. Compared with the “classical” +/−1 degree, the comfortable range in [12] is more informative for discussing positive disparity, negative disparity, and the experimental conditions (various viewing distances). Several visual comfort evaluation methods based on horizontal disparity are proposed in [16, 17]. Kim and Sohn [16] proposed three metrics to predict visual discomfort, including the range of pixel disparities and the maximum angular disparities.
The results in [16] indicated that the range of maximum angular disparities had a high correlation with the outcomes of the subjective assessments. Jung et al. [17] proposed a disparity feature vector consisting of the saliency-weighted disparity and the maximum disparity to predict visual comfort scores for stereoscopic images. Experimental results demonstrated that the method correlated closely with the subjective mean opinion scores (MOS).

Fig. 1. Comfortable ranges for horizontal disparity.

There have been several studies investigating the effect of vertical disparity on visual discomfort. Vertical disparity mostly appears when stereoscopic images are generated using a converged camera configuration; other causes include bracket deformation, inconsistencies of the internal photosensitive elements of the cameras, etc. The authors in [7,8] showed that vertical disparities are tightly correlated with visual discomfort. Kooi and Toet [13] provided a threshold value for vertical disparity, indicating that a vertical shift should be less than 0.57 degrees. Three visual discomfort prediction metrics based on excessive vertical disparities were proposed in [16]. The results in [16] indicated that the metric utilizing keystone distortion showed the best performance. Sasaki et al. [18] evaluated the visual discomfort caused by viewing three-dimensional images with vertical disparity based on a simulator sickness questionnaire and a critical flicker frequency test. Experimental results showed that vertical disparity has an adverse effect on human vision.

In analyzing the factors that affect the visual comfort of stereoscopic video, it is well known that the motion of objects is related to visual discomfort [14,19,20]. Speranza et al. [14] reported that a large amount of motion activity can induce visual discomfort even if the object is within the comfortable viewing zone. Motion in stereoscopic videos can be classified into planar motion and in-depth motion [20]. Planar motion means that the object moves within a plane parallel to the screen plane, so the disparity does not change temporally. In-depth motion, on the other hand, is defined as the object moving towards or away from the observer; it can be understood as disparity variation in the temporal domain. Intense motion causes a large disparity change in a short time, which exacerbates the burden on the human eyes and leads to visual discomfort.

In the past few years, many evaluation methods based on the concept of perception have been proposed to measure the degree of visual discomfort. The perception of visual comfort for stereoscopic content is a complicated process that combines various psychological, physiological, and cognitive factors. Ren et al. investigated the effect of region contrast on visual comfort [21]. Kim et al. investigated the effect of binocular disparity on the visibility threshold of asymmetric noise via subjective assessments [22]. Xue et al. proposed a disparity-based just-noticeable-difference model (DJND) to improve visual comfort in stereoscopic videos [23]. Shi et al. proposed a visual comfort model based on the Weber-Fechner law from a psychological point of view [24], and the authors in [25] used functional magnetic resonance imaging (fMRI) to evaluate the visual comfort of stereoscopic video objectively from a physiological point of view.

Choi et al. selected depth position, scene movement, depth gradient, and brightness as visual discomfort factors and proposed an objective evaluation measure as a linear combination of the selected factors [26]. Their method demonstrated good performance in terms of correlation with subjective visual discomfort. However, their model did not consider the effect of vertical disparity on visual discomfort, and all of their visual discomfort factors are expressed in pixel values rather than angular values, which may not maintain a good correlation with subjective evaluation under different viewing conditions (including display size and viewing distance).

In this paper, we select the horizontal disparity, the vertical disparity, and the motion as visual comfort factors and focus on their comprehensive effect on visual comfort. All three factors are calculated in angular units. For the horizontal and vertical disparities, visual comfort evaluation models are established that take the comfortable range of each factor into account. A visual comfort evaluation model for motion is also established, which distinguishes planar motion, depth motion, and composite motion. Two overall visual comfort measures are proposed, for stereoscopic images and videos, respectively.

The rest of the paper is organized as follows. The proposed method is introduced in Section 2. In Section 3, we describe the subjective assessment experiments and present the experimental results obtained. Finally, in Section 4 we conclude our work on the proposed method for 3D video and point out some aspects that can be improved in the future.

2. Proposed evaluation method

We propose an objective evaluation method for visual comfort that consists of three steps. First, we utilize the scale-invariant feature transform (SIFT) [27, 28] to extract feature points and calculate disparity values. Second, the visual comfort factors for the horizontal disparity, the vertical disparity, and the motion are defined. Finally, we establish an objective evaluation measure to estimate the degree of visual comfort quantitatively.

2.1 Disparity extraction

In general, in order to obtain more accurate disparity information, the disparity is calculated for each pixel in the image. However, pixels cannot always be reliably matched in areas of low texture or occlusion. The authors in [16] performed experiments on both sparse and dense matching to extract disparities and measured the correlation of the predicted results with the subjective results. They found that sparse features showed a higher correlation than dense features. Therefore, we extract disparities based on SIFT, which is a sparse matching algorithm. In addition, we apply the random sample consensus (RANSAC) algorithm to eliminate falsely matched feature points, which yields reliable corresponding features and disparities. The horizontal and vertical disparities in pixels are denoted by

$$d_h = X_r - X_l,$$
$$d_v = Y_r - Y_l,$$
where Xr and Yr represent the horizontal and vertical positions of the feature point in the right viewing image, respectively; Xl and Yl represent the horizontal and vertical positions of the feature point in the left viewing image, respectively.
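Given matched feature points (in the paper these come from SIFT matching with RANSAC outlier rejection; here the matched coordinates are simply assumed given), the two disparity components are a direct per-point subtraction. A minimal sketch, with function and argument names of our own choosing:

```python
import numpy as np

def disparities(pts_left, pts_right):
    """Per-feature horizontal and vertical disparities d_h, d_v.

    pts_left, pts_right: (N, 2) arrays of matched (X, Y) positions in the
    left and right views. In the paper the matches come from SIFT with
    RANSAC outlier rejection; here they are assumed already computed.
    """
    pts_left = np.asarray(pts_left, dtype=float)
    pts_right = np.asarray(pts_right, dtype=float)
    d_h = pts_right[:, 0] - pts_left[:, 0]   # d_h = X_r - X_l
    d_v = pts_right[:, 1] - pts_left[:, 1]   # d_v = Y_r - Y_l
    return d_h, d_v
```

In practice the matches could be produced, for example, with OpenCV's SIFT implementation followed by RANSAC-based outlier rejection.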

2.2 Visual comfort factors

In order to predict visual comfort for stereoscopic videos, we present an objective evaluation method using three visual comfort factors: the horizontal disparity, the vertical disparity and the motion.

1) Horizontal angular disparity factor

As shown in Fig. 2, $P_{el}$ and $P_{er}$ represent the positions of the left eye and the right eye of the viewer, respectively, and $P_d(X_d, Y_d, Z_d)$ represents the object in the stereoscopic display. We assume that $P_i(X_i, Y_i)$ and $P_s(X_s, Y_s)$ represent the positions corresponding to $P_d(X_d, Y_d, Z_d)$ in the image and on the display, respectively. The relationship between $P_i(X_i, Y_i)$ and $P_s(X_s, Y_s)$ is calculated by:

$$X_s = \left(X_i - \frac{R_h}{2}\right)\frac{\mathrm{width}}{R_h}, \qquad Y_s = \left(Y_i - \frac{R_v}{2}\right)\frac{\mathrm{height}}{R_v},$$
where width and height represent the size of the display, and $R_h$ and $R_v$ represent the resolution of the image.

Fig. 2. Geometric description of stereoscopic display.

The relationship between $P_d(X_d, Y_d, Z_d)$ and $P_s(X_s, Y_s)$ is calculated by:

$$X_d = \frac{I X_s}{I - d_{sh}(x,y)}, \qquad Y_d = \frac{I Y_s}{I - d_{sh}(x,y)}, \qquad Z_d = \frac{d_{sh}(x,y)\, D}{d_{sh}(x,y) - I},$$
where
$$d_{sh}(x,y) = d_h(x,y)\,\frac{\mathrm{width}}{R_h},$$
and $Z_d$ is the distance between an object and the display. $d_h(x,y)$ and $d_{sh}(x,y)$ represent the horizontal disparity in pixels and the horizontal screen disparity in millimeters of a given feature point in a stereoscopic image pair, respectively. As shown in Fig. 2, $Z_d$ is $Z_{d1}$ when the object is behind the screen (uncrossed disparity); in that case $Z_{d1}$ is negative and $d_{sh}(x,y)$ is positive. $Z_d$ is $Z_{d2}$ when the object is in front of the screen (crossed disparity); in that case $Z_{d2}$ is positive and $d_{sh}(x,y)$ is negative. $D$ and $I$ are the viewing distance and the inter-ocular distance, respectively.

The horizontal angular disparity factor of a given feature point in a stereoscopic image is calculated by

$$d_{ah}(x,y) = 2\left[\arctan\!\left(\frac{I}{2(D - Z_d)}\right) - \arctan\!\left(\frac{I}{2D}\right)\right].$$
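The chain from pixel disparity to screen disparity, perceived depth, and angular disparity can be sketched as follows. The 63 mm inter-ocular distance is our assumption (a commonly used value, not stated in this section), and the function name is ours:

```python
import math

def horizontal_angular_disparity(d_h_px, width_mm, R_h, D_mm, I_mm=63.0):
    """Horizontal angular disparity d_ah in degrees for one feature point.

    d_h_px: pixel disparity d_h; width_mm / R_h converts pixels to screen
    millimetres; D_mm: viewing distance; I_mm: inter-ocular distance
    (assumed 63 mm here, not stated in this section of the paper).
    """
    d_sh = d_h_px * width_mm / R_h            # screen disparity (mm)
    Z_d = d_sh * D_mm / (d_sh - I_mm)         # depth relative to the screen
    d_ah = 2.0 * (math.atan(I_mm / (2.0 * (D_mm - Z_d)))
                  - math.atan(I_mm / (2.0 * D_mm)))
    return math.degrees(d_ah)
```

With the sign convention above, a negative pixel disparity (crossed) yields a positive angular disparity, and vice versa.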

2) Vertical angular disparity factor

In order to obtain the vertical angular disparity, we use the method in [16]. The vertical angular disparity factor of a given feature point in a stereoscopic image can be calculated by Eq. (7), which computes only the angle of the Y component of the three-dimensional space vectors.

$$d_{av}(x,y) = \cos^{-1}\!\left(\frac{\mathbf{a}\cdot\mathbf{b}}{|\mathbf{a}||\mathbf{b}|}\right) - \cos^{-1}\!\left(\frac{\mathbf{c}\cdot\mathbf{d}}{|\mathbf{c}||\mathbf{d}|}\right) = \cos^{-1}\!\left(\frac{Y_{dl} Y_{dr} + D^2}{\sqrt{Y_{dl}^2 + D^2}\,\sqrt{Y_{dr}^2 + D^2}}\right),$$
where

$$\mathbf{a} = \overrightarrow{P_{el} P_{dl}} = \left(X_e - I/2 - X_{dl},\; Y_e - Y_{dl},\; Z_e - Z_{dl}\right),$$
$$\mathbf{b} = \overrightarrow{P_{er} P_{dr}} = \left(X_e + I/2 - X_{dr},\; Y_e - Y_{dr},\; Z_e - Z_{dr}\right),$$
$$\mathbf{c} = P_{el} - (P_{dl} + P_{dr})/2, \qquad \mathbf{d} = P_{er} - (P_{dl} + P_{dr})/2.$$
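The closed-form Y-component expression of Eq. (7) can be evaluated directly. A small sketch (lengths in any consistent unit; the function and argument names are ours):

```python
import math

def vertical_angular_disparity(Y_dl, Y_dr, D):
    """Vertical angular disparity d_av in degrees via the Y-component
    closed form of Eq. (7).

    Y_dl, Y_dr: vertical positions of the point as seen by the left and
    right eyes; D: viewing distance (same length unit).
    """
    cos_t = (Y_dl * Y_dr + D**2) / (math.sqrt(Y_dl**2 + D**2)
                                    * math.sqrt(Y_dr**2 + D**2))
    # guard against floating-point rounding pushing the cosine out of [-1, 1]
    cos_t = max(-1.0, min(1.0, cos_t))
    return math.degrees(math.acos(cos_t))
```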

3) Motion factors

The motion velocity can be expressed as the change of the motion characteristic vector per second. We calculate the motion velocities in three directions (horizontal, vertical, and depth) and define planar motion and depth motion. The calculation of the motion follows Jing Li's method in [29]. The main idea is briefly described as follows:

The geometric description of calculating the motion is shown in Fig. 3. Assuming there is an object moving from $Obj_k$ to $Obj_{k+1}$, the coordinates of the left and right views of the object in the $k$th frame are $(x_l^k, y_l^k)$ and $(x_r^k, y_r^k)$, and in the $(k+1)$th frame they are $(x_l^{k+1}, y_l^{k+1})$ and $(x_r^{k+1}, y_r^{k+1})$. $Obj_k^c$ is created as a compensated object, which has the same depth as $Obj_{k+1}$. $Obj_k^c$ and $Obj_k$ are on the same line, which passes through the center of $(x_l^k, y_l^k)$ and $(x_r^k, y_r^k)$. The coordinates of the left and right views of $Obj_k^c$ on the screen are $(x_l^c, y_l^c)$ and $(x_r^c, y_r^c)$.

Fig. 3. Geometric description of calculating the motion (Ref [29], Fig. 10.3).

Let's take the right view as an example. 2D motion is described by the moving distance of the object between adjacent frames of the same viewpoint. For instance,

$$M_{2D}^{x}(x_r^k, y_r^k, k) = x_r^{k+1} - x_r^k,$$
where $M_{2D}^{x}(x_r^k, y_r^k, k)$ represents the 2D motion map in the x direction. However, the distance between $x_r^k$ and $x_r^c$ is induced by depth motion; the “real” 2D motion component is between $x_r^c$ and $x_r^{k+1}$. Therefore, the “real” planar motion maps in the x and y directions are:
$$M_{3D}^{x}(x_r^k, y_r^k, k) = M_{2D}^{x}(x_r^k, y_r^k, k) - \frac{1}{2} M_{3D}^{d}(x_r^k, y_r^k, k),$$
$$M_{3D}^{y}(x_r^k, y_r^k, k) = M_{2D}^{y}(x_r^k, y_r^k, k) = y_r^{k+1} - y_r^k,$$
where
$$M_{3D}^{d}(x_r^k, y_r^k, k) = d\!\left(x_r^k + M_{2D}^{x}(x_r^k, y_r^k, k),\; y_r^k + M_{2D}^{y}(x_r^k, y_r^k, k),\; k+1\right) - d(x_r^k, y_r^k, k) = d(x_r^{k+1}, y_r^{k+1}, k+1) - d(x_r^k, y_r^k, k),$$
and $M_{3D}^{x}(x_r^k, y_r^k, k)$ and $M_{3D}^{y}(x_r^k, y_r^k, k)$ represent the “real” planar motion maps in the x and y directions of a 3D motion map, respectively. $M_{3D}^{d}(x_r^k, y_r^k, k)$ represents the depth motion, which measures the disparity amplitude difference of the object between adjacent frames of the left and right views. $d(x_r^k, y_r^k, k)$ represents the horizontal disparity in the $k$th frame.
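For a single tracked feature in the right view, this decomposition of raw 2D motion into “real” planar motion and depth motion reduces to a few subtractions. A minimal sketch (positions and disparities in pixels; the names are ours):

```python
def motion_maps(x_k, y_k, x_k1, y_k1, d_k, d_k1):
    """Planar/depth motion decomposition for one tracked feature.

    (x_k, y_k) -> (x_k1, y_k1): positions in frames k and k+1 (right view);
    d_k, d_k1: horizontal disparities at those positions (pixels).
    Returns (M3D_x, M3D_y, M3D_d).
    """
    m2d_x = x_k1 - x_k              # raw 2D motion, x direction
    m3d_d = d_k1 - d_k              # depth motion: disparity change
    m3d_x = m2d_x - 0.5 * m3d_d     # "real" planar motion, x direction
    m3d_y = y_k1 - y_k              # planar motion, y direction
    return m3d_x, m3d_y, m3d_d
```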

Based on the above calculation of the motion, we define the motion factors in units of degree/s by:

$$m_{k,x} = f\left(\cos^{-1}\!\left(\frac{\mathbf{e}_{k+1}^{x,z} \cdot \mathbf{e}_{k}^{x,z}}{|\mathbf{e}_{k+1}^{x,z}||\mathbf{e}_{k}^{x,z}|}\right) - \frac{1}{2f}\, m_{k,z}\right),$$
$$m_{k,y} = f \cos^{-1}\!\left(\frac{\mathbf{e}_{k+1}^{y,z} \cdot \mathbf{e}_{k}^{y,z}}{|\mathbf{e}_{k+1}^{y,z}||\mathbf{e}_{k}^{y,z}|}\right),$$
$$m_{k,z} = 2f\left[\arctan\!\left(\frac{I}{2(D - Z_d^{k+1})}\right) - \arctan\!\left(\frac{I}{2(D - Z_d^{k})}\right)\right],$$
where $m_{k,x}$, $m_{k,y}$, and $m_{k,z}$ are the motion velocities of a given feature point in the $k$th frame in the horizontal, vertical, and depth directions, respectively; $f$ is the number of video frames per second; and $\mathbf{e}_k^{x,z}$ and $\mathbf{e}_k^{y,z}$ are the projections of $\mathbf{e}_k$ onto the x-z plane and the y-z plane, respectively. And

$$\mathbf{e}_k = P_r^k - P_{er} = \left(x_r^k - (x_e - I/2),\; y_r^k - y_e,\; z_r^k - z_e\right),$$
$$\mathbf{e}_k^{x,z} = \left(x_r^k - (x_e - I/2),\; z_r^k - z_e\right),$$
$$\mathbf{e}_k^{y,z} = \left(y_r^k - y_e,\; z_r^k - z_e\right),$$
$$Z_d^k = \frac{d_{sh}^k(x,y)\, D}{d_{sh}^k(x,y) - I}.$$

Then the planar motion velocity can be expressed as the amount of change of the motion characteristic vector in the x and y directions per second, defined by

$$m_{planar}(x,y) = \sqrt{(m_{k,x})^2 + (m_{k,y})^2}.$$

The depth motion velocity can be interpreted as the amount of change of the motion characteristic vector in the z direction per second, defined by

$$m_{depth}(x,y) = |m_{k,z}|.$$
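The depth-motion velocity $m_{k,z}$ is the frame-rate-scaled change of the vergence angle, and the planar and depth velocities follow directly from it. A sketch, where distances are in millimetres and I = 63 mm is an assumed inter-ocular distance (not stated in this section):

```python
import math

def depth_motion_velocity(Z_d_k, Z_d_k1, f, D, I=63.0):
    """m_{k,z} in degrees/s: change of the vergence angle per second.

    Z_d_k, Z_d_k1: perceived depths in frames k and k+1 (mm); f: frame
    rate; D: viewing distance (mm); I: assumed 63 mm inter-ocular distance.
    """
    m_kz = 2.0 * f * (math.atan(I / (2.0 * (D - Z_d_k1)))
                      - math.atan(I / (2.0 * (D - Z_d_k))))
    return math.degrees(m_kz)

def planar_velocity(m_kx, m_ky):
    """m_planar = sqrt(m_kx^2 + m_ky^2)."""
    return math.hypot(m_kx, m_ky)

def depth_velocity(m_kz):
    """m_depth = |m_kz|."""
    return abs(m_kz)
```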

2.3 Objective evaluation of visual comfort

After determining the visual comfort factors for the horizontal disparity, the vertical disparity, and the motion, we use their values to establish mathematical models that reflect the degree of visual comfort. Then two overall objective evaluation measures of visual comfort, for stereoscopic images and videos respectively, are presented.

  • 1) Horizontal angular disparity model: For horizontal angular disparity, the comfortable range is often considered to be within +/−1 degree of visual angle, and the authors in [12] provided a comfortable limit that varies with viewing distance. We set up evaluation models with these two different comfortable ranges and compare their performance later.

    The model1 is defined by:

    $$w_{dh1}(x,y) = \begin{cases} \exp\left(t_h - |d_{ah}(x,y)|\right), & |d_{ah}(x,y)| \ge t_h \\ 1, & |d_{ah}(x,y)| < t_h, \end{cases}$$

    The model2 is defined by:

    $$w_{dh2}(x,y) = \begin{cases} \exp\left(d_{ah}(x,y) - th_n\right), & d_{ah}(x,y) \le th_n \\ \exp\left(th_p - d_{ah}(x,y)\right), & d_{ah}(x,y) \ge th_p \\ 1, & th_n < d_{ah}(x,y) < th_p, \end{cases}$$

    where

    $$th_n = 2\left[\arctan\!\left(\frac{I(1 - T_n D)}{2\, m_n D}\right) - \arctan\!\left(\frac{I}{2D}\right)\right],$$
    $$th_p = 2\left[\arctan\!\left(\frac{I(1 + T_p D)}{2\, m_p D}\right) - \arctan\!\left(\frac{I}{2D}\right)\right],$$

    where $t_h$ is 1 degree, and $th_n$ and $th_p$ are the thresholds of the comfortable zone for negative (uncrossed) and positive (crossed) horizontal angular disparity, respectively [12]. $D$ is the viewing distance. As shown in Fig. 4, $m_n = 1.129$ and $m_p = 1.035$ are the slopes of the negative and positive lines, respectively, and $T_n = 0.442$ and $T_p = 0.626$ are the ordinate intercepts of the negative and positive lines, respectively [12]. The black diagonal line represents natural viewing.

Fig. 4. The zone of comfort in diopters (Ref [12], Fig. 23).
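The thresholds of Eqs. (26) and (27) can be evaluated numerically. In the sketch below, I = 63 mm is an assumed inter-ocular distance (not stated in this section); with it, a viewing distance of 1.5 m reproduces the comfortable range of [−1.6878°, 2.0998°] quoted in Section 3.2:

```python
import math

def comfort_thresholds(D, I=0.063, m_n=1.129, T_n=0.442,
                       m_p=1.035, T_p=0.626):
    """th_n and th_p in degrees for a viewing distance D in metres.

    I = 63 mm is an assumption; m_n, m_p, T_n, T_p are the slopes and
    intercepts of the comfort-zone lines from [12].
    """
    base = math.atan(I / (2.0 * D))
    th_n = 2.0 * (math.atan(I * (1.0 - T_n * D) / (2.0 * m_n * D)) - base)
    th_p = 2.0 * (math.atan(I * (1.0 + T_p * D) / (2.0 * m_p * D)) - base)
    return math.degrees(th_n), math.degrees(th_p)
```

For D = 1.5 m this yields approximately (−1.6878, 2.0998).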

We follow the method of Oh and Ham in [30] and use exponential decay functions to establish the objective evaluation models. For the model1 and the model2, the exponential is used for normalization: a value closer to one represents a more comfortable viewing experience. Taking Eq. (24) as an example, if $d_{ah}(x,y)$ is within the threshold of the comfortable zone, the visual comfort score is one, which means the degree of visual comfort is assumed to be very comfortable. Otherwise, $w_{dh1}(x,y)$ approaches zero as $d_{ah}(x,y)$ moves away from the threshold of the comfortable zone, and the degree of visual comfort is predicted to be extremely uncomfortable.

Figure 5 shows the model1 and the model2 as functions of the horizontal disparity in degrees. Four different viewing conditions are considered for the model2: the viewing distance (D) is approximately 1, 2, 3, or 4 times the screen width (W). In order to reflect the exponential decay characteristics of these models more clearly, the horizontal angular disparity is varied from −6° to +6° in increments of 0.2 degree, so the curve for each model is drawn from 61 data points. It should be noted that a horizontal angular disparity of 3 degrees already causes extreme discomfort to human eyes in actual stereoscopic video production or experimental research. Therefore, the range from −6° to +6° is used only to describe the curve shape of each model and is not used in the actual subjective experiments. As shown in Fig. 5, the shape of the model1 is left-right symmetric about zero disparity, whereas the shape of the model2 is not bilaterally symmetrical and varies with the viewing distance.

Fig. 5. Shapes of the model1 and the model2 for horizontal disparity.

  • 2) Vertical angular disparity model: For vertical angular disparity, two models are considered. The model1 considers only the magnitude of the vertical angular disparity. The model2 adds a comfortable threshold of vertical angular disparity to the model1. Kooi and Toet [13] provided a threshold of vertical angular disparity, indicating that it should be less than 0.57 degree to prevent visual comfort from decreasing. In this paper, considering the worst case, we choose 0.57 degree as the threshold of the comfortable zone of vertical angular disparity and suppose that vertical angular disparities exceeding this threshold cause visual discomfort.

    The model is defined by:

    $$w_{dv1}(x,y) = \exp\left(-|d_{av}(x,y)|\right),$$

    The model2 is defined by:

    $$w_{dv2}(x,y) = \begin{cases} \exp\left(th_v - |d_{av}(x,y)|\right), & |d_{av}(x,y)| \ge th_v \\ 1, & |d_{av}(x,y)| < th_v, \end{cases}$$

    where $th_v = 0.57$ deg is the threshold of the comfortable zone of vertical angular disparity. If $d_{av}(x,y)$ is within the threshold of the comfortable zone, the visual comfort score is one, which means the degree of visual comfort is predicted to be very comfortable. $w_{dv2}(x,y)$ approaches zero as $d_{av}(x,y)$ exceeds the threshold of the comfortable zone, which means that the degree of visual discomfort is predicted to be severe.

  • 3) Motion model: the motion characteristics are classified into planar motion, depth motion, and composite motion:
    $$w_{m}(x,y) = \begin{cases} \exp\left(-\dfrac{m_{planar}(x,y)}{\sigma_1}\right), & \text{planar motion} \\ \exp\left(-\dfrac{m_{depth}(x,y)}{\sigma_2}\right), & \text{depth motion} \\ \exp\left(-\dfrac{m_{planar}(x,y) + m_{depth}(x,y)}{\sigma_3}\right), & \text{composite motion,} \end{cases}$$

    where $\sigma_1$, $\sigma_2$, and $\sigma_3$ are control parameters. The larger the motion characteristic is, the closer the value of $w_m(x,y)$ is to zero; that is, the degree of visual discomfort is predicted to be more severe as the motion velocity becomes faster.
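The piecewise motion model can be sketched as below. The σ values here are placeholders, since their trained values are not given in this section:

```python
import math

def motion_comfort(m_planar, m_depth, sigma1=10.0, sigma2=10.0,
                   sigma3=10.0, kind="composite"):
    """w_m(x, y): exponential-decay comfort score for motion in degrees/s.

    sigma1..sigma3 are control parameters; the defaults are placeholders,
    not values from the paper.
    """
    if kind == "planar":
        return math.exp(-m_planar / sigma1)
    if kind == "depth":
        return math.exp(-m_depth / sigma2)
    # composite motion: planar and depth components combined
    return math.exp(-(m_planar + m_depth) / sigma3)
```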

  • 4) Overall objective evaluation measure of visual comfort:

    The overall objective evaluation measure of visual comfort which is proposed in the paper for one frame is defined by

    $$vc_k = \frac{1}{N}\sum_{i=1}^{N}\left[a\, w_{dh}^{i}(x,y) + b\, w_{dv}^{i}(x,y) + c\right],$$

    where $N$ represents the number of feature points in the $k$th frame, and $a$, $b$, and $c$ are coefficients. The evaluation measure is the average of a linear combination of the visual comfort factors.

    The proposed evaluation method for multiple frames is given by

    $$VC = \frac{1}{M}\sum_{k=1}^{M} vc_k,$$

    where

    $$vc_k = \frac{1}{N}\sum_{i=1}^{N}\left[\alpha\, w_{dh}^{i}(x,y) + \beta\, w_{dv}^{i}(x,y) + \gamma\, w_{m}^{i}(x,y) + \delta\right],$$

    where $M$ represents the number of frames, and $\alpha$, $\beta$, $\gamma$, and $\delta$ are coefficients obtained by training on the test sequences. Compared with Eq. (31), Eq. (32) incorporates the motion factor, which reflects the dynamic content change between adjacent images of the stereoscopic video.
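Putting the pieces together, the per-frame score $vc_k$ and the whole-video score $VC$ are simple averages of a linear combination. In this sketch the coefficient values are placeholders standing in for the trained α, β, γ, δ:

```python
def frame_score(w_dh, w_dv, w_m, alpha, beta, gamma, delta):
    """vc_k: average linear combination over the N feature points of
    frame k. The coefficient values must come from training."""
    N = len(w_dh)
    return sum(alpha * h + beta * v + gamma * m + delta
               for h, v, m in zip(w_dh, w_dv, w_m)) / N

def video_score(frame_scores):
    """VC: mean of the per-frame scores vc_k over M frames."""
    return sum(frame_scores) / len(frame_scores)
```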

3. Experimental results and analysis

3.1 Subjective experiments

3.1.1 Apparatus

The experimental display platform is NVIDIA 3D Vision with an Intel Core 2 Duo 3 GHz processor and a GeForce GTX 240 graphics card. Three-dimensional video player software (Stereoscopic Player) is used to play the 3D videos, and a Sony VPL-HW30ES stereoscopic video projector is used to show them. The width and height of the projection screen are 886 mm and 498 mm, respectively, and the resolution of the projector is 1920 × 1080. The viewing distance is approximately three times the height of the screen, i.e., 1.5 m. Sony active shutter glasses with a 3D-TDJ1 transmitter are used in the experiments. The design of the experimental environment follows the recommendations of ITU-R BT.500-13 [31]. The environmental luminance on the screen is 200 lux.

3.1.2 Subjects

Twenty subjects (eight males and twelve females), aged between 20 and 35 years with an average age of 25 years, participated in the experiments. All subjects are inexperienced assessors, and all passed a medical check confirming neither stereo blindness nor color blindness. Before the test procedure, instructions for the experiment and the grading task are provided to ensure that we obtain reasonable evaluation results.

3.1.3 Video source

In order to verify the actual performance of the proposed objective measure for visual comfort, we test standard stereoscopic video sequences from the IEEE Standards Association (IEEE-SA) stereo video database [32] and the IVY LAB stereoscopic video database [33]. The IEEE-SA stereo video database contains 64 stereo videos with a resolution of 1920 × 1080 pixels, divided into two parts: Version1 and Version2. 25 stereoscopic videos captured by the twin-lens PANASONIC AG-3DA1 3D camcorder are used in our experiments. Figure 6(a) shows that the overall horizontal disparity distribution of the 25 stereoscopic videos ranges from −3° to +3° with a concentration between −1° and +1°. Figure 6(b) shows that the overall vertical disparity distribution of the 25 stereoscopic videos has a mean near zero. Since the proposed objective evaluation method considers the influence of vertical disparity on visual comfort, we need to adjust the vertical disparity of the stereoscopic videos to test the whole evaluation method. Because the optical axes of the two cameras tend to be approximately horizontal and the angle between the optical axes is not large during actual shooting, the test range of the vertical disparity is limited to within 1 degree, under four different conditions: 0.25°, 0.5°, 0.75°, and 1°.

Fig. 6. Disparity distribution of 25 stereoscopic videos in the IEEE-SA stereo video database: (a) horizontal disparity and (b) vertical disparity.

Before verifying the overall objective method, we first conduct single-factor subjective visual comfort tests for the horizontal disparity, the vertical disparity, and the motion. We examine the disparity distribution of each stereoscopic video pair and try to make the scene of each experimental object different from the others. According to the evaluation scores of the observers and the different characteristics of the experimental scenes, Car1 and Restaurant1 (1920 × 1080 pixels) from the IEEE-SA stereo video database are chosen as the original materials for producing the experimental sequences. The vertical disparities of the two experimental sequences are zero, and the horizontal disparities are within the comfortable range; most observers confirm them as stereoscopic videos with good visual comfort. We generate different video sequences by adjusting the horizontal disparity from −3° to +3°. Video processing software (After Effects, AE) [34] is used to adjust the disparity and generate the test video sets under varying conditions. As the adjusted horizontal disparity increases, the overlapping area of the two images is significantly reduced. In order to keep the overlapping rectangular area as large as possible, we split the horizontal disparity equally between the left and right images. For instance, to obtain an experimental object with a horizontal disparity of 2° for a 40-inch display at a viewing distance of 1.5 m, the horizontal disparity of each frame needs to be adjusted to −113 pixels: the left image is moved 56.5 pixels to the right, and the right image is moved 56.5 pixels to the left. Examples of two frames of experimental sequences with a horizontal disparity of 2° are shown in Fig. 7. Each frame includes the images of the left view and the right view simultaneously.
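The pixel-shift arithmetic above can be reproduced with the small-angle relation $d_{sh} \approx 2D\tan(\theta/2)$, which is our approximation for inverting the angular-disparity formula:

```python
import math

def disparity_pixels(angle_deg, D_m, width_m, R_h):
    """Screen disparity in pixels for a target angular disparity,
    using the small-angle approximation d_sh ~ 2 D tan(theta / 2).

    angle_deg: target angular disparity; D_m: viewing distance (m);
    width_m: screen width (m); R_h: horizontal resolution (pixels).
    """
    d_sh = 2.0 * D_m * math.tan(math.radians(angle_deg / 2.0))
    return d_sh * R_h / width_m
```

For 2° at D = 1.5 m on the 886 mm wide, 1920-pixel screen this gives roughly 113 pixels, i.e., about 56.5 pixels of shift per view.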

Fig. 7. Examples of two frames of experimental sequences with horizontal disparity of 2°: (a) Car1 and (b) Restaurant1.

We then generate different video sequences by adjusting the vertical disparity from −1° to +1°. To obtain an experimental object with a vertical disparity of 1°, the left image is moved up by 0.5° and the right image is moved down by 0.5°. Figure 8 shows examples of two frames of experimental sequences with a vertical disparity of 1°.

Fig. 8. Examples of two frames of experimental sequences with vertical disparity of 1°: (a) Car1 and (b) Restaurant1.

We select 24 stereoscopic videos from the IVY LAB stereoscopic video database to examine the influence of motion on visual comfort. The database contains 36 natural scenes captured by a dual-lens 3D digital camera (Fujifilm FinePix 3D W3). The stereoscopic videos have various movement velocities in each of the three motion directions (horizontal, vertical, and depth), and the maximum binocular horizontal disparities of all stereoscopic videos are distributed within the comfortable zone of ±1°.

3.1.4 Scoring criteria

Subjective evaluation experiments are conducted using the Single Stimulus (SS) method of ITU-R BT.500-11 [35]. The subjective visual comfort score ranges from 1 to 5, corresponding to five visual comfort levels: 1 = extremely uncomfortable, 2 = uncomfortable, 3 = mildly comfortable, 4 = comfortable, and 5 = very comfortable [36]. Each test video is presented to the subjects for 20 s, and the following 10 s are used to determine the score; during this time, the display shows a mid-gray image. ANOVA (analysis of variance) is used as the main statistical method to detect significance. Because the subjective scores show significant individual differences, it is necessary to examine the distribution of the evaluation scores of each subject, and for a subject whose scores are abnormal, all of that subject's experimental data are excluded. A corresponding elimination rule based on ITU-R BT.500-11 is established. Assume that the average subjective evaluation score for a stereoscopic video is $\mu_i$, the standard deviation of the entire evaluation score is $\sigma_i$, and the interval for each stereoscopic video is $[\mu_i - \sigma_i, \mu_i + \sigma_i]$. For the $i$th subject, the number of times that his or her subjective evaluation score is greater than $\mu_i + \sigma_i$ is recorded as $P_i$, and the number of times that it is less than $\mu_i - \sigma_i$ is recorded as $Q_i$. Assume that the number of stereoscopic videos evaluated by the subject is $T_i$; the elimination rule for the subjective evaluation scores is then defined by:

$$\begin{cases} (P_i + Q_i)/T_i > 5\% \\ |P_i - Q_i|/(P_i + Q_i) < 30\%. \end{cases}$$

When the subjective evaluation scores of a subject satisfy the elimination rule, they are considered abnormal, and all of the subject's experimental data are removed.
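The elimination rule can be written as a simple predicate (the function name is ours):

```python
def is_outlier(P_i, Q_i, T_i):
    """Screening rule: a subject's scores are rejected when both
    conditions hold: more than 5% of scores fall outside
    [mu - sigma, mu + sigma], and the excursions are not strongly
    one-sided (|P - Q| / (P + Q) < 30%).
    """
    if P_i + Q_i == 0:
        return False  # no out-of-interval scores at all
    return (P_i + Q_i) / T_i > 0.05 and abs(P_i - Q_i) / (P_i + Q_i) < 0.30
```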

We first ask all subjects to watch the selected 25 stereoscopic videos from the IEEE-SA database. During the experiment, the test videos are presented in different orders across the series of tests. In the data analysis, we find that one of the subjects provided abnormal subjective evaluation data; we exclude that subject's data and average the remaining subjective evaluation data to obtain the final valid subjective evaluation scores.

3.2 Results for horizontal disparity

First, we present an experiment to study the effect of horizontal disparity on visual comfort and compare the performance of the model1 and the model2. The horizontal angular disparities of the test videos are changed from −3° to +3° in increments of 0.5 degree. The subjective scores of the subjects are averaged to indicate the degree of visual comfort, and the objective evaluation scores are obtained from the established horizontal angular disparity model. We use simple linear regression to acquire the best correlation between the subjective and objective evaluation scores, which is $vc_h = \frac{1}{MN}\sum_{k=1}^{M}\sum_{i=1}^{N}\left[a\, w_{dh}^{i}(x,y) + b\right]$, where $N$ represents the number of feature points in the $k$th frame and $M$ represents the number of frames. The experimental results of the subjective and objective visual comfort evaluation scores for horizontal disparity are shown in Fig. 9.


Fig. 9 Experimental results of subjective and objective visual comfort evaluation scores for horizontal disparity: (a) the model1, (b) the model2. Error bars indicate the standard deviation of all evaluation scores under a given condition.


The objective evaluation scores of the model1 are shown in Fig. 9(a), and those of the model2 in Fig. 9(b). As shown in Fig. 9, visual discomfort increases with the horizontal angular disparity magnitude (F(12,234) = 39.709, P < 0.001). Since the comfortable range of horizontal angular disparity in the model1 is ±1°, the objective evaluation scores in Fig. 9(a) remain constant from −1° to +1°. As shown in Fig. 9(b), the comfortable range of horizontal angular disparity given by Eq. (26) and Eq. (27) is [−1.6878°, 2.0998°] at a viewing distance of 1.5 m. When the horizontal angular disparity lies within this range, the value of w_dh(x,y) in the objective evaluation model is 1 and the final visual comfort evaluation score is fixed; therefore, the objective evaluation scores in Fig. 9(b) do not change from −1° to +2°. Coefficients of the two models for horizontal disparity are given in Table 1. The correlation coefficients (R square) indicate that a linear fit is sufficient to capture the relationship between subjective and objective visual comfort evaluation. As shown in Table 1, the R square of the model2 is 0.915, indicating that the model2 fits the subjective evaluation scores better than the model1.
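The behavior described above — a flat score inside the comfortable range and an exponential fall-off outside it — follows directly from the piecewise form of the model2 weight. A minimal sketch of that weight, assuming the 1.5 m thresholds quoted above as defaults (not the authors' code):

```python
import math

def w_dh2(d_ah, th_n=-1.6878, th_p=2.0998):
    """Piecewise comfort weight for horizontal angular disparity (degrees).

    Inside the comfortable range [th_n, th_p] the weight is 1; outside it,
    the weight decays exponentially with the excess disparity.
    """
    if d_ah <= th_n:
        return math.exp(d_ah - th_n)   # negative-disparity branch
    if d_ah >= th_p:
        return math.exp(th_p - d_ah)   # positive-disparity branch
    return 1.0                         # comfortable range
```

For instance, a disparity of +3.0998° lies 1° beyond the positive threshold, so the weight drops to exp(−1) ≈ 0.368.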


Table 1. Coefficients of two models for horizontal disparity.

3.3 Results for vertical disparity

Another experiment is presented to investigate the effect of vertical disparity on visual comfort. The vertical angular disparities of the test videos are varied from −1° to +1° in 0.2° increments. The experimental results of subjective and objective visual comfort evaluation scores for vertical disparity are plotted in Fig. 10. As shown in Fig. 10, visual discomfort becomes more severe as the vertical angular disparity magnitude increases. The subjective and objective visual comfort curves for vertical disparity are essentially symmetric about zero disparity, indicating that the sign of the vertical disparity has no effect on visual comfort. Figure 10(a) shows the objective evaluation scores given by the model1. Taking the right half with positive disparity as an example, the model1 decreases monotonically with increasing vertical angular disparity. Figure 10(b) shows the objective evaluation scores given by the model2, which remain constant within the comfortable range and decline once the vertical angular disparity exceeds it. Coefficients of the two models for vertical disparity are given in Table 2. As shown in Table 2, the R square of the model2 is 0.936, indicating that the model2 fits the subjective evaluation scores better than the model1.


Fig. 10 Experimental results of subjective and objective visual comfort evaluation scores for vertical disparity: (a) the model1, (b) the model2. Error bars indicate the standard deviation of all evaluation scores under a given condition.



Table 2. Coefficients of two models for vertical disparity.

3.4 Results for motion

The third experiment examines the effect of motion on visual comfort. Motion is characterized in two respects: motion velocity (slow and fast) and motion direction (planar motion, depth motion, and composite motion). Since the selected stereoscopic videos are not computer-generated, the motion velocity does not increase at a strictly fixed rate. We therefore sort the stereoscopic videos by motion velocity from low to high and assign the first four and the last four to two groups; the mean velocity of each group serves as one level of motion velocity. For planar motion, velocity levels of 4.27 deg/s and 22.57 deg/s represent slow and fast motion, respectively. For depth motion, velocity levels of 0.13 deg/s and 2.675 deg/s represent slow and fast motion, respectively. Owing to the complexity of composite motion, we do not subdivide it by velocity. In total, 24 stereoscopic videos (3 motion directions × 8 stereoscopic videos) are used for testing. The experimental results of subjective and objective visual comfort evaluation scores for motion are shown in Fig. 11.
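The grouping procedure above — sort by velocity, average the four slowest and four fastest — can be sketched in a few lines. The function name is hypothetical:

```python
def velocity_levels(velocities):
    """Sort per-video motion velocities and average the four slowest and
    four fastest videos to obtain the 'slow' and 'fast' velocity levels."""
    v = sorted(velocities)
    slow = sum(v[:4]) / 4.0   # mean of the four slowest videos
    fast = sum(v[-4:]) / 4.0  # mean of the four fastest videos
    return slow, fast
```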


Fig. 11 Experimental results of subjective and objective visual comfort evaluation scores for motion: (a) planar motion, (b) depth motion and (c) composite motion. Error bars indicate the standard deviation of all evaluation scores under a given condition.


As shown in Figs. 11(a) and 11(b), there is a significant difference in the visual comfort evaluation scores between fast and slow planar motion (F(1,150) = 15.046, P = 0.012) and between fast and slow depth motion (F(1,150) = 37.52, P = 0.002). Visual discomfort therefore increases with motion velocity, and both fast planar motion and fast depth motion have a negative impact on the visual comfort of stereoscopic videos. We also test eight stereoscopic videos that contain both planar motion and depth motion; as Fig. 11(c) shows, the subjective and objective evaluation scores are in good agreement. Coefficients of the three models for motion are given in Table 3.
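The F statistics quoted above come from one-way ANOVA. As a minimal illustration of where such a value comes from (not the authors' SPSS analysis), the F statistic for a set of groups is the ratio of between-group to within-group mean squares:

```python
import numpy as np

def one_way_anova_F(*groups):
    """One-way ANOVA F statistic: between-group MS / within-group MS."""
    gs = [np.asarray(g, dtype=float) for g in groups]
    all_x = np.concatenate(gs)
    grand = all_x.mean()
    k, n = len(gs), all_x.size
    ss_between = sum(g.size * (g.mean() - grand) ** 2 for g in gs)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in gs)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

With the degrees of freedom (k − 1, n − k), the F value is then compared against the F distribution to obtain the P value.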


Table 3. Coefficients of three models for motion.

3.5 Results for the whole evaluation measure

The proposed method is tested on the publicly available IEEE-SA stereo video database, using 25 stereoscopic videos. The vertical disparities of all test videos are set to four conditions (0.25°, 0.5°, 0.75°, and 1°), while the horizontal disparities remain essentially unchanged. This yields a total of 100 data sets (25 stereoscopic videos × 4 vertical disparities) for visual comfort subjective evaluation. Each data set contains the subjective evaluation scores of all subjects under a specific viewing distance and display size, and the average of these scores is taken as the final visual comfort score. Of the 100 final visual comfort scores, 70 are randomly selected as training data to estimate the coefficients in Eq. (33), and the remaining 30 are used as test data to verify the performance of the evaluation measure.
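The random 70/30 split can be sketched as follows; the function name and the fixed seed are assumptions added for reproducibility, not part of the original protocol:

```python
import random

def split_train_test(data, n_train=70, seed=0):
    """Randomly partition the data sets into training and test subsets."""
    idx = list(range(len(data)))
    random.Random(seed).shuffle(idx)  # seeded shuffle for repeatability
    train = [data[i] for i in idx[:n_train]]
    test = [data[i] for i in idx[n_train:]]
    return train, test
```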

Table 4 gives the experimental results of verification; we carry out five tests over the 100 experimental data sets. The Pearson linear correlation coefficient (PLCC) and Spearman's rank-order correlation coefficient (SROCC) are used to evaluate the metric: PLCC measures prediction accuracy, and SROCC measures prediction monotonicity. The larger the values of PLCC and SROCC, the better the correlation between the predicted and subjective visual comfort scores. As shown in Table 4, PLCC is 0.931 and SROCC is 0.917 for the training videos, while PLCC is 0.904 and SROCC is 0.881 for the test videos.
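Both correlation measures are standard; as a self-contained sketch (in practice a statistics library would be used), PLCC is the centered cosine of the two score vectors, and SROCC is PLCC computed on the ranks:

```python
import numpy as np

def plcc(x, y):
    """Pearson linear correlation coefficient of two score vectors."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    y = np.asarray(y, dtype=float) - np.mean(y)
    return float((x @ y) / np.sqrt((x @ x) * (y @ y)))

def srocc(x, y):
    """Spearman rank-order correlation: Pearson correlation of the ranks
    (no tie handling in this sketch)."""
    rank = lambda v: np.argsort(np.argsort(np.asarray(v))).astype(float)
    return plcc(rank(x), rank(y))
```

A perfectly linear prediction gives PLCC = 1, while any strictly decreasing prediction gives SROCC = −1 regardless of the exact values.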


Table 4. Results of verification.

The coefficients of the evaluation measure are calculated through training and regression fitting so as to obtain the optimal correlation between the objective and subjective evaluation scores. The regression coefficients are estimated with the SPSS (Statistical Product and Service Solutions) software [37]. Regression coefficients for the five tests are given in Table 5; as can be seen, each coefficient remains stable within a narrow range across the tests. The best regression coefficients are α = 3.804, β = 1.785, γ = 2.407, and δ = 2.657. Owing to the scene complexity of the test videos, the motion characteristic is fitted with the mathematical evaluation model for composite motion, giving the control parameter σ3 = 2.357.
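The regression step amounts to a multiple linear least-squares fit of the Eq. (33) coefficients from per-video mean weights and subjective scores. A minimal sketch in place of SPSS, with hypothetical names and toy data:

```python
import numpy as np

def fit_vc_coefficients(w_dh, w_dv, w_m, mos):
    """Least-squares estimate of (alpha, beta, gamma, delta) in Eq. (33)
    from per-video mean weights and subjective scores."""
    # Design matrix: one column per weight, plus an intercept column.
    X = np.column_stack([w_dh, w_dv, w_m, np.ones(len(mos))])
    coef, *_ = np.linalg.lstsq(X, np.asarray(mos, dtype=float), rcond=None)
    return tuple(coef)  # (alpha, beta, gamma, delta)
```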


Table 5. Regression coefficients for five tests

Figure 12 shows a scatter plot between subjective and objective evaluation visual comfort scores. As can be observed, our proposed evaluation method exhibits good consistency with subjective assessment results.


Fig. 12 Experimental results of the whole evaluation method.


In addition, we compare the proposed method with prior objective evaluation methods. The 100 experimental data sets from the 25 test videos are used to test the evaluation methods proposed by Choi et al. [26], Kim and Sohn [16], and Chen et al. [38]. As before, 70 data sets are randomly selected for training and 30 for testing. After five tests, the best coefficients of each evaluation measure are determined and the optimal correlation coefficients obtained. The performance comparisons with the three visual comfort objective evaluation methods are given in Table 6, which shows that the proposed method achieves better objective evaluation performance than the other three methods.


Table 6. Performance comparisons of different visual comfort objective evaluation methods.

4. Conclusion

In this paper, we first establish visual comfort evaluation models for stereoscopic video with respect to disparity information and motion. Horizontal disparity, vertical disparity, and motion are all measured in angular units, and the influence of each on visual comfort is quantified in the proposed method. In addition, we conduct single-factor subjective visual comfort evaluation tests for horizontal disparity, vertical disparity, and motion, and give the objective evaluation value of each factor. Analysis of the experimental results shows that: (1) for horizontal and vertical disparities, visual discomfort increases with the disparity magnitude, and introducing appropriate comfortable ranges into the objective evaluation model yields a good correlation between subjective and objective results; (2) for motion, visual discomfort increases with motion velocity, and both fast planar motion and fast depth motion have a negative impact on the visual comfort of stereoscopic videos. For the whole visual comfort evaluation method, the experimental results show that the proposed method reflects well the impact of disparity information and motion on stereoscopic video visual comfort, and its objective evaluation values are consistent with human subjective perception. The proposed method therefore provides a basis for evaluating stereoscopic video comfort. Although the method performs well, there is still room for improvement, and the quantitative study of other factors will be considered in future work.

Funding

National Key R&D Program of China (2017YFB1002900); National Natural Science Foundation of China (61271315, 61631009).

References and links

1. H. Hwang, H. S. Chang, and I. S. Kweon, “Local deformation calibration for autostereoscopic 3D display,” Opt. Express 25(10), 10801–10814 (2017). [CrossRef]   [PubMed]  

2. J. S. Chen, Q. Y. J. Smithwick, and D. P. Chu, “Coarse integral holography approach for real 3D color video displays,” Opt. Express 24(6), 6705–6718 (2016). [CrossRef]   [PubMed]  

3. J. Park, H. Oh, S. Lee, and A. C. Bovik, “3D visual discomfort predictor: analysis of horizontal disparity and neural activity statistics,” IEEE Trans. Image Process. 24(3), 1101–1114 (2015). [CrossRef]   [PubMed]  

4. M. Urvoy, M. Barkowsky, and P. L. Callet, “How visual fatigue and discomfort impact 3D-TV quality of experience: a comprehensive review of technological, psychophysical, and psychological factors,” Ann. Telecommun. 68(11–12), 641–655 (2013). [CrossRef]  

5. M. Emoto, T. Niida, and F. Okano, “Repeated Vergence Adaptation Causes the Decline of Visual Functions in Watching Stereoscopic Television,” J. Disp. Technol. 1(2), 328–340 (2005). [CrossRef]  

6. D. Kim, S. K. Kim, and K. Sohn, “Effect of parallax distribution and crosstalk on visual comfort in parallax barrier autostereoscopic display,” Opt. Eng. 54(5), 053107 (2015). [CrossRef]  

7. C. W. Tyler, L. T. Likova, V. Ramachandra, and S. Goma, “3D discomfort from vertical and torsional disparities in natural images,” Proc. SPIE 8291, 82910Q (2012). [CrossRef]  

8. F. Liu, Y. Niu, and H. Jin, “Keystone correction for stereoscopic cinematography,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2012), pp. 1–7.

9. W. A. Ijsselsteijn, H. D. Ridder, and R. Hamberg, “Perceptual factors in stereoscopic displays. The effect of stereoscopic filming parameters on perceived quality and reported eyestrain,” Proc. SPIE 3299, 282–291 (1998). [CrossRef]  

10. R. E. Patterson, “Human factors of 3-D displays,” J. Soc. Inf. Disp. 15(11), 861–871 (2007). [CrossRef]  

11. W. J. Tam, F. Speranza, S. Yano, K. Shimono, and H. Ono, “Stereoscopic 3D-TV: Visual Comfort,” IEEE Trans. Broadcast 57(2), 335–346 (2011). [CrossRef]  

12. T. Shibata, J. Kim, D. M. Hoffman, and M. S. Banks, “The zone of comfort: Predicting visual discomfort with stereo displays,” J. Vis. 11(8), 11 (2011). [CrossRef]   [PubMed]  

13. F. Kooi and A. Toet, “Visual comfort of binocular and 3D displays,” Displays 25(2–3), 99–108 (2004). [CrossRef]  

14. F. Speranza, W. J. Tam, and N. Hur, “Effect of disparity and motion on visual comfort of stereoscopic images,” Proc. SPIE 6055, 60550B (2006). [CrossRef]  

15. C. Shigeru, “3D consortium safety guidelines for popularization of human-friendly 3D,” Eizo Joho Media Gakkai Gijutsu Hokoku 30(34), 21–24 (2006).

16. D. Kim and K. Sohn, “Visual fatigue prediction for stereoscopic image,” IEEE Trans. Circ. Syst. Video Tech. 21(2), 231–236 (2011). [CrossRef]  

17. C. Jung, H. Liu, and Y. Cui, “Visual comfort assessment for stereoscopic 3D images based on salient discomfort regions,” in Proceedings of IEEE Conference on Image Processing (IEEE, 2015), pp.4047–4051. [CrossRef]  

18. K. Sasaki, M. Yoshizawa, N. Sugita, and M. Abe, “Evaluation of visual fatigue while watching artificial three-dimensional image with vertical parallax,” in Proceedings of IEEE Conference on Consumer Electronics (IEEE, 2015), pp. 666–667. [CrossRef]  

19. Y. J. Jung and S. Lee, “Visual comfort assessment metric based on salient object motion information in stereoscopic video,” J. Electron. Imaging 21(1), 59–79 (2012).

20. J. Li, M. Barkowsky, and P. L. Callet, “Visual discomfort of stereoscopic 3D videos: Influence of 3D motion,” Displays 35(1), 49–57 (2014). [CrossRef]  

21. H. Ren, Z. Su, C. Lv, and F. Zou, “Effect of region contrast on visual comfort of stereoscopic images,” Electron. Lett. 51(13), 983–985 (2015). [CrossRef]  

22. H. G. Kim, S. I. Lee, and Y. Man Ro, “Experimental investigation of the effect of binocular disparity on the visibility threshold of asymmetric noise in stereoscopic viewing,” Opt. Express 24(17), 19607–19615 (2016). [CrossRef]   [PubMed]  

23. F. Xue, C. Jung, and J. Kim, “Disparity-based just-noticeable-difference model for perceptual stereoscopic video coding using depth of focus blur effect,” Displays 42, 43–50 (2016). [CrossRef]  

24. J. Shi, L. Yun, X. Huang, Y. Tai, and Z. Chen, “Visual Comfort Modeling for Disparity in 3D Contents Based on Weber–Fechner’s Law,” J. Disp. Technol. 10(12), 1001–1009 (2014). [CrossRef]  

25. J. Yong, H. Sohn, and S. Lee, “Subjective and objective measurements of visual fatigue induced by excessive disparities in stereoscopic images,” Proc. SPIE 8648, 86480M (2013).

26. J. Choi, D. Kim, S. Choi, and C. Sohn, “Visual fatigue modeling and analysis for stereoscopic video,” Opt. Eng. 51(1), 017206 (2012). [CrossRef]  

27. S. H. Zhong, Y. Liu, and Q. C. Chen, “Visual orientation inhomogeneity based scale-invariant feature transform,” Expert Syst. Appl. 42(13), 5658–5667 (2015). [CrossRef]  

28. Z. H. Wang, H. Liu, and Z. Huo, “Scale-invariant feature matching based on pairs of feature points,” Computer Vision, IET. 9(6), 789–796 (2015). [CrossRef]  

29. J. Li, “Methods for assessment and prediction of QoE, preference and visual discomfort in multimedia application with focus on S-3DTV,” Nantes University (2014).

30. C. Oh, B. Ham, S. Choi, and K. Sohn, “Visual Fatigue Relaxation for Stereoscopic Video via Nonlinear Disparity Remapping,” IEEE Trans. Broadcast 61(2), 142–153 (2015). [CrossRef]  

31. ITU-R BT.500–13, “Methodology for the subjective assessment of the quality of television pictures,” (2012).

32. IEEE-SA stereo video database [Online]. Available: http://grouper.ieee.org/groups/3dhf/

33. IVY LAB Stereoscopic video data set [Online]. Available: http://ivylab.kaist.ac.kr/demo/ivy3D-LocalMotion/index.htm

34. After Effects software [Online]. Available: http://supportdownloads.adobe.com/product.jsp?platform=Windows&product=13

35. ITU-R BT.500–11, “Methodology for the subjective assessment of the quality of television pictures,” (2002).

36. ITU-R BT.1438, “Subjective assessment of stereoscopic television pictures,” (2000).

37. SPSS software [Online]. Available: http://www.ibm.com/analytics/us/en/technology/spss/

38. J. Chen, J. Zhou, J. Sun, and A. C. Bovik, “3D visual discomfort prediction using low complexity disparity algorithms,” EURASIP J. Image Video Process. 2016(1), 23 (2016). [CrossRef]  


Figures (12)

Fig. 1 Comfortable ranges for horizontal disparity.
Fig. 2 Geometric description of stereoscopic display.
Fig. 3 Geometric description of calculating the motion (Ref [29], Fig. 10.3).
Fig. 4 The zone of comfort in diopters (Ref [12], Fig. 23).
Fig. 5 Shapes of the model1 and the model2 for horizontal disparity.
Fig. 6 Disparity distribution of 25 stereoscopic videos in the IEEE-SA stereo video database: (a) horizontal disparity and (b) vertical disparity.
Fig. 7 Examples of two frames of experimental sequences with horizontal disparity of 2°: (a) Car1 and (b) Restaurant1.
Fig. 8 Examples of two frames of experimental sequences with vertical disparity of 1°: (a) Car1 and (b) Restaurant1.
Fig. 9 Experimental results of subjective and objective visual comfort evaluation scores for horizontal disparity: (a) the model1, (b) the model2.
Fig. 10 Experimental results of subjective and objective visual comfort evaluation scores for vertical disparity: (a) the model1, (b) the model2.
Fig. 11 Experimental results of subjective and objective visual comfort evaluation scores for motion: (a) planar motion, (b) depth motion and (c) composite motion.
Fig. 12 Experimental results of the whole evaluation method.


Equations (34)

$$d_h = X_r - X_l,$$
$$d_v = Y_r - Y_l.$$
$$X_s = \left( X_i - \frac{R_h}{2} \right)\frac{width}{R_h},\quad Y_s = \left( Y_i - \frac{R_v}{2} \right)\frac{height}{R_v}.$$
$$X_d = \frac{I X_s}{I - d_{sh}(x,y)},\quad Y_d = \frac{I Y_s}{I - d_{sh}(x,y)},\quad Z_d = \frac{d_{sh}(x,y)\,D}{d_{sh}(x,y) - I},$$
$$d_{sh}(x,y) = d_h(x,y)\,\frac{width}{R_h}.$$
$$d_{ah}(x,y) = 2\left[\arctan\!\left(\frac{I}{2(D - Z_d)}\right) - \arctan\!\left(\frac{I}{2D}\right)\right].$$
$$d_{av}(x,y) = \cos^{-1}\!\left(\frac{\vec a \cdot \vec b}{|\vec a||\vec b|}\right) - \cos^{-1}\!\left(\frac{\vec c \cdot \vec d}{|\vec c||\vec d|}\right) = \cos^{-1}\!\left(\frac{Y_{dl} Y_{dr} + D^2}{\sqrt{Y_{dl}^2 + D^2}\,\sqrt{Y_{dr}^2 + D^2}}\right),$$
$$\vec a = P_{el} - P_{dl} = \left( X_e - I/2 - X_{dl},\; Y_e - Y_{dl},\; Z_e - Z_{dl} \right),$$
$$\vec b = P_{er} - P_{dr} = \left( X_e + I/2 - X_{dr},\; Y_e - Y_{dr},\; Z_e - Z_{dr} \right),$$
$$\vec c = P_{el} - (P_{dl} + P_{dr})/2,\quad \vec d = P_{er} - (P_{dl} + P_{dr})/2.$$
$$M_{2D}^x(x_r^k, y_r^k, k) = x_r^{k+1} - x_r^k.$$
$$M_{3D}^x(x_r^k, y_r^k, k) = M_{2D}^x(x_r^k, y_r^k, k) - \tfrac{1}{2} M_{3D}^d(x_r^k, y_r^k, k),$$
$$M_{3D}^y(x_r^k, y_r^k, k) = M_{2D}^y(x_r^k, y_r^k, k) = y_r^{k+1} - y_r^k,$$
$$M_{3D}^d(x_r^k, y_r^k, k) = d\!\left( x_r^k + M_{2D}^x(x_r^k, y_r^k, k),\; y_r^k + M_{2D}^y(x_r^k, y_r^k, k),\; k+1 \right) - d(x_r^k, y_r^k, k) = d(x_r^{k+1}, y_r^{k+1}, k+1) - d(x_r^k, y_r^k, k).$$
$$m_{k,x} = f\left( \cos^{-1}\!\left(\frac{e_{k+1}^{x,z} \cdot e_k^{x,z}}{|e_{k+1}^{x,z}||e_k^{x,z}|}\right) - \frac{1}{2f}\, m_{k,z} \right),$$
$$m_{k,y} = f \cos^{-1}\!\left(\frac{e_{k+1}^{y,z} \cdot e_k^{y,z}}{|e_{k+1}^{y,z}||e_k^{y,z}|}\right),$$
$$m_{k,z} = 2f\left[ \arctan\!\left(\frac{I}{2(D - Z_d^{k+1})}\right) - \arctan\!\left(\frac{I}{2(D - Z_d^k)}\right) \right],$$
$$e_k = P_r^k - P_{er} = \left( x_r^k - (x_e - I/2),\; y_r^k - y_e,\; z_r^k - z_e \right),$$
$$e_k^{x,z} = \left( x_r^k - (x_e - I/2),\; z_r^k - z_e \right),$$
$$e_k^{y,z} = \left( y_r^k - y_e,\; z_r^k - z_e \right),$$
$$Z_d^k = \frac{d_{sh}^k(x,y)\,D}{d_{sh}^k(x,y) - I}.$$
$$m_{planar}(x,y) = \sqrt{ (m_{k,x})^2 + (m_{k,y})^2 }.$$
$$m_{depth}(x,y) = | m_{k,z} |.$$
$$w_{dh1}(x,y) = \begin{cases} \exp\!\left( th - |d_{ah}(x,y)| \right) & |d_{ah}(x,y)| \ge th \\ 1 & 0 < |d_{ah}(x,y)| < th \end{cases},$$
$$w_{dh2}(x,y) = \begin{cases} \exp\!\left( d_{ah}(x,y) - th_n \right) & d_{ah}(x,y) \le th_n \\ \exp\!\left( th_p - d_{ah}(x,y) \right) & d_{ah}(x,y) \ge th_p \\ 1 & th_n < d_{ah}(x,y) < th_p \end{cases},$$
$$th_n = 2\left[ \arctan\!\left(\frac{I(1 - T_n D)}{2 \times m_n D}\right) - \arctan\!\left(\frac{I}{2D}\right) \right],$$
$$th_p = 2\left[ \arctan\!\left(\frac{I(1 + T_p D)}{2 \times m_p D}\right) - \arctan\!\left(\frac{I}{2D}\right) \right].$$
$$w_{dv1}(x,y) = \exp\!\left( -d_{av}(x,y) \right),$$
$$w_{dv2}(x,y) = \begin{cases} \exp\!\left( th_v - |d_{av}(x,y)| \right) & |d_{av}(x,y)| \ge th_v \\ 1 & 0 < |d_{av}(x,y)| < th_v \end{cases}.$$
$$w_m(x,y) = \begin{cases} \exp\!\left( -\dfrac{m_{planar}(x,y)}{\sigma_1} \right) & \text{planar motion} \\ \exp\!\left( -\dfrac{m_{depth}(x,y)}{\sigma_2} \right) & \text{depth motion} \\ \exp\!\left( -\dfrac{m_{planar}(x,y) + m_{depth}(x,y)}{\sigma_3} \right) & \text{composite motion} \end{cases}.$$
$$vc_k = \frac{1}{N} \sum_{i=1}^{N} \left[ a\, w_{dh}^i(x,y) + b\, w_{dv}^i(x,y) + c \right].$$
$$VC = \frac{1}{M} \sum_{k=1}^{M} vc_k,$$
$$vc_k = \frac{1}{N} \sum_{i=1}^{N} \left[ \alpha\, w_{dh}^i(x,y) + \beta\, w_{dv}^i(x,y) + \gamma\, w_m^i(x,y) + \delta \right].$$
$$\begin{cases} (P_i + Q_i)/T_i > 5\% \\ |P_i - Q_i|/(P_i + Q_i) < 30\% \end{cases}.$$