Prior Pose Correction for Asteroid Landing Based on Feature Tracking Assistance

WANG Yaqiong1,2, XIE Huan1,2, YAN Xiongfeng1,2, WANG Yifan1,2, CHEN Jie1,2, TONG Xiaohua1,2

Journal of Deep Space Exploration, 2024, Vol. 11, Issue 3: 274-285. DOI: 10.15982/j.issn.2096-9287.2024.20230151
Special Issue: Intelligent Landing on Small Celestial Bodies


Abstract

To address possible inaccuracy of the prior pose during visual navigation for asteroid landing, a feature-tracking-aided pose estimation method is proposed. First, navigation features are generated from the prior pose and a navigation feature database. Next, a multi-feature discriminative correlation filter (DCF) that combines handcrafted and deep features tracks the positions of the navigation features in the navigation camera images. The average peak-to-correlation energy (APCE) is then used to screen reliable tracking results, which provide an initial pose estimate. Finally, the navigation features are regenerated from the initial pose estimate and matched against the navigation camera image using the normalized correlation coefficient (NCC); this matching is embedded in a differentiable Levenberg-Marquardt (LM) framework that optimizes the pose under NCC-based constraints. Experiments on images, terrain, and ephemeris data from the OSIRIS-REx mission show that the method's pose estimates have reprojection errors in the sub-pixel range. At 1 km from the asteroid surface, the position estimation error is within 2 m and the attitude estimation error is within 1°.
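To illustrate the screening and matching steps, the sketch below computes APCE for a DCF response map and the NCC score used when matching regenerated features to the camera image. It is a minimal numpy sketch using the standard definitions of APCE and NCC; the threshold value and the screen_tracks helper are illustrative assumptions, not taken from the paper.

    import numpy as np

    def apce(response: np.ndarray) -> float:
        """Average peak-to-correlation energy of a DCF response map:
        |F_max - F_min|^2 / mean((F - F_min)^2). A sharp single peak
        yields a high score; a diffuse or multimodal response (occlusion,
        drift) yields a low one."""
        f_max, f_min = float(response.max()), float(response.min())
        denom = float(np.mean((response - f_min) ** 2))
        return (f_max - f_min) ** 2 / denom if denom > 0 else 0.0

    def ncc(patch: np.ndarray, template: np.ndarray) -> float:
        """Normalized correlation coefficient of two equally sized patches."""
        p = patch - patch.mean()
        t = template - template.mean()
        denom = np.sqrt((p * p).sum() * (t * t).sum())
        return float((p * t).sum() / denom) if denom > 0 else 0.0

    def screen_tracks(responses, threshold=20.0):
        """Keep only tracks whose response map is confident enough to feed
        the initial pose estimate (threshold is illustrative)."""
        return [i for i, r in enumerate(responses) if apce(r) >= threshold]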
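The pose optimization can be sketched in the same spirit. The code below refines a prior pose by Levenberg-Marquardt over feature reprojection residuals. It is a plain numerical stand-in for the paper's differentiable LM framework: the NCC-based constraints are replaced here by point reprojection errors, and the pinhole intrinsics K, the axis-angle pose parameterization, and all damping parameters are assumptions for the sketch.

    import numpy as np

    def rodrigues(w):
        """Axis-angle vector -> rotation matrix (Rodrigues' formula)."""
        theta = np.linalg.norm(w)
        if theta < 1e-12:
            return np.eye(3)
        k = w / theta
        K = np.array([[0, -k[2], k[1]],
                      [k[2], 0, -k[0]],
                      [-k[1], k[0], 0]])
        return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

    def project(pose, pts3d, K):
        """Project 3D feature points with pose = (rvec, tvec) and intrinsics K."""
        R = rodrigues(pose[:3])
        cam = pts3d @ R.T + pose[3:]
        uv = cam @ K.T
        return uv[:, :2] / uv[:, 2:3]

    def lm_refine(pose0, pts3d, pts2d, K, iters=50, lam=1e-3):
        """Refine the prior pose by minimizing reprojection residuals
        of matched navigation features with damped Gauss-Newton (LM)."""
        pose = pose0.copy()
        residual = lambda p: (project(p, pts3d, K) - pts2d).ravel()
        for _ in range(iters):
            r = residual(pose)
            # Numerical Jacobian: a stand-in for the differentiable LM layer.
            J = np.empty((r.size, 6))
            eps = 1e-6
            for j in range(6):
                dp = np.zeros(6); dp[j] = eps
                J[:, j] = (residual(pose + dp) - r) / eps
            step = np.linalg.solve(J.T @ J + lam * np.eye(6), -J.T @ r)
            if np.linalg.norm(residual(pose + step)) < np.linalg.norm(r):
                pose, lam = pose + step, max(lam * 0.5, 1e-7)  # accept, relax damping
            else:
                lam = min(lam * 5.0, 1e7)                      # reject, damp harder
        return pose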

Keywords

asteroid pinpoint landing / visual navigation / pose estimation / feature tracking / pose optimization

Cite this article

WANG Yaqiong, XIE Huan, YAN Xiongfeng, WANG Yifan, CHEN Jie, TONG Xiaohua. Prior Pose Correction for Asteroid Landing Based on Feature Tracking Assistance. Journal of Deep Space Exploration, 2024, 11(3): 274-285. https://doi.org/10.15982/j.issn.2096-9287.2024.20230151
