Abstract
Stereo cameras are widely used in wearable visually impaired assistance devices (VIADs). However, inevitable vibration, shock, and mechanical stress can misalign the camera pair, sharply degrading the quality of the acquired depth map and thus the performance of the VIAD. In this paper, we propose an epipolar-constraint-based, unconstrained self-calibration method that requires neither user involvement nor a specific environment, while achieving a rotation accuracy of 0.83 mrad and a translation accuracy of 0.42 mm. Several approaches are proposed to address image-matching issues, including the removal of blurred images and of mismatched key points. Based on the correctly matched key point pairs, a planar quadric-distribution approach is proposed to ensure the quality and consistency of the final key point group. These collection approaches ensure the reliability of the key point pairs, which is the most important factor in achieving high accuracy with minimal constraints. A comprehensive set of experiments demonstrates the high robustness of the proposed method, which is suitable for VIADs. We also present a field test with blindfolded users to validate the flexibility and applicability of the approach.
© 2019 Optical Society of America
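The abstract mentions removing mismatched key points via the epipolar constraint. The paper's actual pipeline is not given here, but as a minimal sketch under standard assumptions: for an ideally rectified stereo pair, corresponding points satisfy x'ᵀFx = 0 with a canonical fundamental matrix, and pairs with large residuals can be rejected. All names and the threshold below are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

# Canonical fundamental matrix for an ideally rectified stereo pair,
# for which corresponding points lie on the same image row.
F = np.array([[0.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])

def epipolar_residuals(pts_left, pts_right, F):
    """Return |x'^T F x| for each matched pair (homogeneous coordinates)."""
    ones = np.ones((len(pts_left), 1))
    xl = np.hstack([pts_left, ones])   # N x 3, left points
    xr = np.hstack([pts_right, ones])  # N x 3, right points
    return np.abs(np.einsum('ni,ij,nj->n', xr, F, xl))

# Three correct matches (same row) and one mismatch (5 px row offset).
left  = np.array([[100.0, 50.0], [200.0, 80.0], [150.0, 120.0], [90.0, 60.0]])
right = np.array([[ 90.0, 50.0], [185.0, 80.0], [140.0, 120.0], [80.0, 65.0]])

residuals = epipolar_residuals(left, right, F)
inliers = residuals < 1.0  # hypothetical pixel threshold
# residuals -> [0, 0, 0, 5]; the mismatched fourth pair is rejected.
```

With this F the residual reduces to the row difference |y − y'|, so the check is a cheap first-pass filter; a misaligned (uncalibrated) rig requires estimating a general F, e.g. robustly via RANSAC.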