High-precision urban rail map construction based on multi-sensor fusion

Zhihong Huang, Ruipeng Gao, Zejing Xu, Yiqing Liu, Zongru Ma, Dan Tao

High-speed Railway, 2024, Vol. 2, Issue 4: 265-273. DOI: 10.1016/j.hspr.2024.11.006

Research article
Abstract

The construction of high-precision urban rail maps is crucial for the safe and efficient operation of railway transportation systems. However, the repetitive features and sparse textures of urban rail environments make high-precision map construction challenging. Motivated by this, this paper proposes a high-precision urban rail map construction algorithm based on multi-sensor fusion. The algorithm integrates LiDAR and Inertial Measurement Unit (IMU) data to construct the geometric structure map of the urban rail line, and uses image point-line features together with color information to improve map accuracy by minimizing photometric errors, thereby generating high-precision maps. Experimental results on a real urban rail dataset demonstrate that the proposed algorithm achieves root mean square errors of 0.345 m and 1.033 m for ground and tunnel scenes, respectively, a 19.31% and 56.80% improvement over state-of-the-art methods.
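As a rough illustration of the photometric-error term mentioned in the abstract, the Python sketch below (with hypothetical names; the paper's actual formulation, color-channel handling, and optimization scheme are not reproduced here) compares the intensity stored with a colored map point against the image intensity observed at its projected pixel. Residuals of this form, accumulated over many map points, would be minimized with respect to the camera pose.

import numpy as np

def photometric_residual(p_world, point_intensity, image, K, R_cw, t_cw):
    """Photometric error between a colored map point and the current image.

    p_world         : (3,) map point in world coordinates
    point_intensity : scalar intensity stored with the map point
    image           : (H, W) grayscale image as floats
    K               : (3, 3) pinhole camera intrinsics
    R_cw, t_cw      : world-to-camera rotation (3, 3) and translation (3,)
    Returns the residual, or None if the point is not visible.
    """
    # Transform the map point into the camera frame.
    p_cam = R_cw @ p_world + t_cw
    if p_cam[2] <= 0:  # behind the camera
        return None

    # Pinhole projection onto the image plane.
    uv = K @ (p_cam / p_cam[2])
    u, v = uv[0], uv[1]
    h, w = image.shape
    if not (0 <= u < w - 1 and 0 <= v < h - 1):
        return None

    # Bilinear interpolation of the observed intensity at (u, v).
    u0, v0 = int(u), int(v)
    du, dv = u - u0, v - v0
    observed = ((1 - du) * (1 - dv) * image[v0, u0]
                + du * (1 - dv) * image[v0, u0 + 1]
                + (1 - du) * dv * image[v0 + 1, u0]
                + du * dv * image[v0 + 1, u0 + 1])

    # Photometric residual: stored map intensity vs. observed intensity.
    return point_intensity - observed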

Keywords

Urban rail / Multi-sensor fusion / Point-line features / Photometric error

Cite this article

Zhihong Huang, Ruipeng Gao, Zejing Xu, Yiqing Liu, Zongru Ma, Dan Tao. High-precision urban rail map construction based on multi-sensor fusion. High-speed Railway, 2024, 2(4): 265-273. DOI: 10.1016/j.hspr.2024.11.006


Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgements

This research was partially supported by the Beijing Natural Science Foundation (No. L221003).

