Intelligent Fusion Autonomous Navigation Method for Mars Precise Landing
GAO Xizhen 1,2, HUANG Xiangyu 1,2, XU Chao 1,2
Author information
1. Beijing Institute of Control Engineering, Beijing 100194, China;
2. National Key Laboratory of Space Intelligent Control Technology, Beijing 100194, China
History
Received: 06 Apr 2023
Revised: 19 Jun 2023
Published: 26 Mar 2024
Issue Date: 26 Mar 2024
Abstract
To overcome the difficulty of absolute optical navigation in unknown environments, an intelligent fusion autonomous navigation method for Mars precise landing was proposed. To address the failure of feature detection and the low recognition efficiency caused by the high texture similarity of the extraterrestrial environment and by perspective scaling between images, an unsupervised homography network was constructed to estimate the inter-frame motion of the lander. Based on the inertial measurement information, a recursive model of the lander state was established. Using the constructed measurement model and the state recursion model, real-time estimation of the lander's position, velocity, and attitude was achieved through an unscented Kalman filter (UKF). The simulation results verify the effectiveness of the proposed method, which requires no feature detection or matching.
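The fusion scheme described in the abstract (inertial state recursion plus a homography-derived image measurement, combined in a UKF) can be illustrated with a minimal sketch. The Python fragment below is not the authors' implementation: it assumes the filterpy library, reduces the state to position and velocity (the paper also estimates attitude), and treats the network's output as a ready-made inter-frame translation measurement; all variable names, noise levels, and the measurement mapping are hypothetical.

```python
# Minimal sketch (assumptions stated above): IMU-driven prediction plus an
# image-derived inter-frame translation update, fused in a UKF.
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

DT = 0.1          # filter step [s], assumed
G_MARS = 3.71     # Mars surface gravity [m/s^2]

def fx(x, dt, accel_meas):
    """State recursion driven by the accelerometer: x = [position, velocity]."""
    p, v = x[:3], x[3:]
    a = accel_meas + np.array([0.0, 0.0, -G_MARS])   # simplistic gravity model
    return np.concatenate([p + v * dt + 0.5 * a * dt**2, v + a * dt])

def hx(x):
    """Measurement model: predicted horizontal displacement over one image
    interval, compared against the homography-derived inter-frame motion."""
    return x[3:5] * DT

points = MerweScaledSigmaPoints(n=6, alpha=1e-3, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=6, dim_z=2, dt=DT, fx=fx, hx=hx, points=points)
ukf.x = np.array([0.0, 0.0, 2000.0, 0.0, 0.0, -60.0])  # illustrative initial state
ukf.P *= 100.0
ukf.R = np.diag([0.5, 0.5])      # measurement noise of the image update (assumed)
ukf.Q = np.eye(6) * 1e-3         # process noise (assumed)

def step(accel_meas, frame_translation):
    """One fusion cycle: IMU-based prediction, then image-based update.
    In practice, frame_translation would come from the unsupervised
    homography network applied to consecutive descent images."""
    ukf.predict(accel_meas=accel_meas)
    ukf.update(frame_translation)
    return ukf.x
```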
Keywords
Mars landing /
intelligent navigation /
multi-source fusion /
unsupervised learning /
deep neural network
Cite this article
GAO Xizhen, HUANG Xiangyu, XU Chao. Intelligent Fusion Autonomous Navigation Method for Mars Precise Landing. Journal of Deep Space Exploration, 2024, 11(1): 24‒30. https://doi.org/10.15982/j.issn.2096-9287.2024.20230041