A dynamic detection method to improve SLAM performance

Yu Gan, Jianhua Zhang, Kaiqi Chen, Jialing Liu

Optoelectronics Letters, 2021, Vol. 17, Issue 11: 693-698. DOI: 10.1007/s11801-021-1022-5

Abstract

Simultaneous localization and mapping (SLAM) is a research hotspot in the field of intelligent mobile robots, and many classic systems have been developed over the past few decades. However, most existing SLAM methods assume that the robot operates in a static environment, so their performance degrades greatly in dynamic scenes. To solve this problem, a new dynamic object detection method based on point cloud motion analysis is proposed and incorporated into ORB-SLAM2. The method serves as a preprocessing stage: moving objects in the scene are first detected and then removed, so that only static content is passed to the SLAM system. Experiments on a public RGB-D dataset show that the proposed motion removal method effectively improves the performance of ORB-SLAM2 in highly dynamic environments.
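The abstract describes the detector only at a high level. The sketch below illustrates one plausible form of point-cloud motion analysis used as a preprocessing step: back-project two consecutive depth frames, warp the previous frame's points with the estimated camera motion, flag pixels whose 3D residual remains large as dynamic, and discard features in those regions before ORB-SLAM2 tracking. All function names, thresholds, and intrinsics here are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only (not the authors' implementation): flag dynamic
# pixels by comparing the geometry of two consecutive RGB-D frames.
import numpy as np


def backproject(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into an H x W x 3 point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)


def dynamic_mask(depth_prev, depth_curr, T_curr_prev, K, thresh=0.05):
    """Return a boolean H x W mask that is True where a point appears to move.

    depth_prev, depth_curr : depth images of two consecutive frames
    T_curr_prev            : 4x4 estimated camera motion (previous -> current)
    K                      : 3x3 camera intrinsic matrix
    thresh                 : depth residual in meters (assumed value) above
                             which a point is treated as a moving object
    """
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    h, w = depth_prev.shape

    # Warp the previous frame's point cloud into the current camera frame
    # using the estimated (static-scene) camera motion.
    pts = backproject(depth_prev, fx, fy, cx, cy).reshape(-1, 3)
    pts_warped = pts @ T_curr_prev[:3, :3].T + T_curr_prev[:3, 3]

    # Project the warped points into the current image and read the observed depth.
    z = np.maximum(pts_warped[:, 2], 1e-6)
    u = np.clip(np.round(pts_warped[:, 0] * fx / z + cx).astype(int), 0, w - 1)
    v = np.clip(np.round(pts_warped[:, 1] * fy / z + cy).astype(int), 0, h - 1)
    residual = np.abs(depth_curr[v, u] - pts_warped[:, 2])

    # Static points re-project to roughly the depth actually observed; a large
    # residual indicates independent object motion. Invalid depths are ignored.
    valid = (depth_prev.reshape(-1) > 0) & (pts_warped[:, 2] > 1e-3)
    return ((residual > thresh) & valid).reshape(h, w)
```

In an ORB-SLAM2-style front end, keypoints extracted in the current frame whose pixel coordinates fall inside the returned mask would then be dropped before pose estimation, which is the kind of motion removal the abstract refers to.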

Cite this article

Yu Gan, Jianhua Zhang, Kaiqi Chen, Jialing Liu. A dynamic detection method to improve SLAM performance. Optoelectronics Letters, 2021, 17(11): 693-698. https://doi.org/10.1007/s11801-021-1022-5

References

[1]
Kerl C, Sturm J, Cremers D. Dense visual SLAM for RGB-D cameras[C], 2013, New York, IEEE: 2100-2106
[2]
Mur-Artal R, Tardos J D. ORB-SLAM2: an open-source SLAM system for monocular, stereo and RGB-D cameras[J]. IEEE Transactions on Robotics, 2017, 33(5): 1255-1262
[3]
Dewan A, Caselitz T, Tipaldi G D, et al. Motion-based detection and tracking in 3D LiDAR scans[C], 2016, New York, IEEE: 4508-4513
[4]
Lu Z, Hu Z, Uchimura K. SLAM estimation in dynamic outdoor environments[J]. International Journal of Humanoid Robotics, 2010, 7(2): 315-330
[5]
Sun Y, Liu M, Meng M Q H. Improving RGB-D SLAM in dynamic environments: a motion removal approach[J]. Robotics and Autonomous Systems, 2017, 89: 110-122
[6]
Bescós B, Fácil J M, Civera J, et al. DynaSLAM: tracking, mapping and inpainting in dynamic scenes[J]. IEEE Robotics and Automation Letters, 2018, 3(4): 4076-4083
[7]
Jiang C, Paudel D P, Fougerolle Y, et al. Static and dynamic objects analysis as a 3D vector field[C], 2017, New York, IEEE: 234-243
[8]
Jaimez M, Kerl C, Gonzalez-Jimenez J, et al. Fast odometry and scene flow from RGB-D cameras based on geometric clustering[C], 2017, New York, IEEE: 3992-3999
[9]
Scona R, Jaimez M, Petillot Y R, et al. StaticFusion: background reconstruction for dense RGB-D SLAM in dynamic environments[C], 2018, New York, IEEE: 3849-3856
[10]
Yang S, Scherer S. CubeSLAM: monocular 3D object SLAM[J]. IEEE Transactions on Robotics, 2019, 35(4): 925-938
[11]
Huang J, Yang S, Mu T J, et al. ClusterVO: clustering moving instances and estimating visual odometry for self and surroundings[C], 2020, New York, IEEE: 2168-2177
[12]
Zhang J, Henein M, Mahony R, et al. VDO-SLAM: a visual dynamic object-aware SLAM system[EB/OL]. (2020-05-25) [2021-09-13]. https://arxiv.org/abs/2005.11052.
[13]
Bescos B, Campos C, Tardós J D, et al. DynaSLAM II: tightly-coupled multi-object tracking and SLAM[J]. IEEE Robotics and Automation Letters, 2021, 6(3): 5191-5198
[14]
Dubé R, Dugas D, Stumm E, et al. SegMatch: segment based loop-closure for 3D point clouds[EB/OL]. (2019-01-15) [2021-09-13]. https://arxiv.org/abs/1609.07720v2.
[15]
Sturm J, Engelhard N, Endres F, et al. A benchmark for the evaluation of RGB-D SLAM systems[C], 2012, New York, IEEE: 573-580
