6-DOF motion estimation using optical flow based on dual cameras

Meng-yao Liu, Yan Wang, Lei Guo

Journal of Central South University, 2017, 24(2): 459-466. DOI: 10.1007/s11771-017-3448-2

Abstract

Owing to its simple algorithms and hardware, optical flow-based motion estimation has become an active research field, especially in GPS-denied environments. Optical flow can be used to recover aircraft motion information, but the six-degree-of-freedom (6-DOF) motion still cannot be accurately estimated by existing methods. The purpose of this work is to provide a motion estimation method based on optical flow from forward- and down-looking cameras that does not rely on the assumption of level flight. First, the distribution and decoupling of the optical flow from the forward camera are used to obtain the attitude. Then, the resulting angular velocities are used to compute the translational optical flow of the down-looking camera, which eliminates the influence of rotational motion on velocity estimation. In addition, the translational motion estimation equations are simplified by establishing the relation between the depths of the feature points and the aircraft altitude. Finally, simulation results show that the presented method is accurate and robust.
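The derotation step described in the abstract (removing the rotation-induced component of the down-looking camera's flow using the angular velocities recovered from the forward camera) can be illustrated with the standard pinhole optical-flow model. Below is a minimal sketch, assuming a pinhole camera with focal length f in pixels and a common sign convention; the paper's exact formulation, axis conventions, and camera geometry may differ.

```python
import numpy as np

def rotational_flow(x, y, omega, f):
    """Image-plane flow induced purely by body rotation (pinhole model).
    x, y: pixel coordinates relative to the principal point,
    f: focal length in pixels, omega = (wx, wy, wz) in rad/s.
    Sign convention is illustrative and may differ from the paper's."""
    wx, wy, wz = omega
    u_rot = (x * y / f) * wx - (f + x**2 / f) * wy + y * wz
    v_rot = (f + y**2 / f) * wx - (x * y / f) * wy - x * wz
    return u_rot, v_rot

def derotate_flow(points, flow, omega, f):
    """Subtract the rotational component so only translational flow remains."""
    trans = []
    for (x, y), (u, v) in zip(points, flow):
        u_rot, v_rot = rotational_flow(x, y, omega, f)
        trans.append((u - u_rot, v - v_rot))
    return np.array(trans)

# Example with one tracked feature of the down-looking camera
# (all numbers are hypothetical):
omega = (0.01, -0.02, 0.005)   # rad/s, e.g. estimated from the forward camera
points = [(120.0, -80.0)]      # pixel coordinates relative to the image centre
flow = [(3.2, -1.1)]           # measured optical flow (pixels/frame)
print(derotate_flow(points, flow, omega, f=800.0))
```

The remaining translational flow can then be related to the aircraft velocity once the feature depths are expressed through the altitude, as the abstract outlines.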

Keywords

optical flow / motion estimation / dual cameras / six degree-of-freedom (6-DOF)

Cite this article

Meng-yao Liu, Yan Wang, Lei Guo. 6-DOF motion estimation using optical flow based on dual cameras. Journal of Central South University, 2017, 24(2): 459-466. DOI: 10.1007/s11771-017-3448-2


