Abstract
Motivated by the goal of enhancing the accuracy and robustness of visual inertial navigation systems (VINSs) across a wide spectrum of dynamic scenarios, protracted missions and expansive navigation ranges, we designed a monocular visual inertial odometry (VIO) augmented by planar environmental constraints. To attain efficient feature extraction and precise feature tracking, we extracted and tracked uniformly distributed features from accelerated segment test (FAST) feature points in the video images, and removed outliers through a symmetric optical flow check. Additionally, we outlined the process of identifying coplanar feature points within the sparse feature set, enabling efficient plane detection and fitting. This approach imposed spatial geometric constraints on the three-dimensional coordinates of visual feature points without resorting to computationally expensive dense depth mapping. The heart of the method lay in the formulation of a comprehensive cost function that integrated the reprojection error of visual feature points, the coordinate constraints derived from coplanar feature points, and the inertial measurement unit (IMU) pre-integration error. These integrated measurements were then used to estimate the system states through nonlinear optimization. To validate the accuracy and effectiveness of the proposed approach, extensive experiments were conducted on publicly available datasets and in large-scale outdoor scenes. The experimental results demonstrate that, compared with VINS-Mono and ORB-SLAM3, the proposed method achieves higher positioning accuracy. It delivers precise and stable navigation results even in challenging conditions, giving it significant practical value in robotics and unmanned driving.
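The sparse plane-detection step described above — finding coplanar points among the tracked features and fitting a plane whose point-to-plane distances serve as geometric constraints — can be sketched as follows. This is a minimal illustration using RANSAC over 3D feature coordinates; the function names, iteration count, and inlier threshold are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_plane_ransac(points, iters=200, tol=0.02, seed=0):
    """RANSAC plane fit over Nx3 feature coordinates.

    Returns (n, d) with unit normal n and offset d such that
    n @ p + d ~ 0 for inlier points p, plus the inlier mask.
    """
    rng = np.random.default_rng(seed)
    best_inliers, best_plane = None, None
    for _ in range(iters):
        # Hypothesize a plane from 3 random points.
        sample = points[rng.choice(len(points), size=3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue  # degenerate (near-collinear) sample
        n = n / norm
        d = -n @ sample[0]
        # Score by counting points within the distance threshold.
        inliers = np.abs(points @ n + d) < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (n, d)
    return best_plane, best_inliers

def coplanarity_residuals(points, n, d):
    """Signed point-to-plane distances: the kind of coordinate
    constraint a coplanar-point term would add to the cost function."""
    return points @ n + d
```

In a full VIO pipeline these residuals would enter the nonlinear optimization alongside the reprojection and IMU pre-integration errors; here they simply quantify how far each candidate point lies from the fitted plane.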
Keywords
visual inertial odometry (VIO) / planar environmental constraint / state estimation / nonlinear optimization
Cite this article
Jingyun DUO, Yilin ZHAO, Long ZHAO, Juntao LI.
Planar environmental constraints aided monocular visual inertial odometry.
Journal of Measurement Science and Instrumentation, 2024, 15(1): 83-94. DOI: 10.62756/jmsi.1674-8042.2024009