Drone Operation with Human Natural Movement

Hironori Hiraishi

Drones Auton. Veh. 2025, Vol. 2, Issue (3): 10011. DOI: 10.70322/dav.2025.10011


Abstract

This study proposes a method for operating drones through natural human movements. The operator simply wears virtual reality (VR) goggles, on which the image from the drone's camera is displayed. When the operator turns their face, the drone turns to match; when the operator moves their head up or down, the drone rises or falls accordingly; and when the operator walks in place, the drone moves forward. This allows the operator to control the drone as if walking in the air. Each movement is detected from the acceleration and magnetic field sensors of a smartphone mounted on the VR goggles, and a machine learning method is used to distinguish walking from non-walking movements. In a comparison with a conventional remote controller, the remote controller performed better in the early stages; however, once participants had familiarized themselves with the natural operation, the differences became relatively small. This study combines drones, VR, and machine learning: VR provides drone pilots with a sense of realism and immersion, whereas machine learning enables the use of natural movements.
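The abstract's mapping from head orientation to drone motion (turn to match the operator's facing direction, rise or descend when the head tilts up or down) could be sketched as below. This is an illustrative assumption, not the paper's implementation: the yaw/pitch inputs, deadband, and thresholds (`YAW_DEADBAND`, `PITCH_THRESHOLD`) are hypothetical values, since the actual control parameters are in the full text.

```python
import math

# Illustrative thresholds (not from the paper).
YAW_DEADBAND = 10.0     # degrees of head-vs-drone mismatch before turning
PITCH_THRESHOLD = 20.0  # degrees of head tilt before changing altitude

def head_to_command(operator_yaw, drone_yaw, pitch):
    """Return a (rotate, vertical) command pair for one control tick.

    operator_yaw, drone_yaw: headings in degrees (e.g., fused from the
    magnetic field sensor); pitch: head tilt in degrees, positive = up.
    """
    # Wrap the heading error into [-180, 180) so the drone turns the short way.
    error = (operator_yaw - drone_yaw + 180.0) % 360.0 - 180.0
    rotate = 0
    if abs(error) > YAW_DEADBAND:
        # Turn toward the operator's facing direction, capped at 45 deg/tick.
        rotate = int(math.copysign(min(abs(error), 45), error))
    vertical = 0
    if pitch > PITCH_THRESHOLD:
        vertical = 1    # head up -> rise
    elif pitch < -PITCH_THRESHOLD:
        vertical = -1   # head down -> descend
    return rotate, vertical
```

Wrapping the heading error avoids the drone spinning 350° left when a 10° right turn would do.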
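The walking-in-place detection described in the abstract can be sketched with a support vector machine over short windows of accelerometer samples. The sketch below uses scikit-learn's `SVC` (which wraps LIBSVM internally); the feature choice (per-axis mean and standard deviation) and the synthetic training data are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np
from sklearn import svm

def extract_features(accel_window):
    """accel_window: (N, 3) array of accelerometer samples (m/s^2).
    Standing still has low variance around gravity; walking in place
    produces large vertical oscillation, so per-axis std separates them."""
    return np.concatenate([accel_window.std(axis=0),
                           accel_window.mean(axis=0)])

rng = np.random.default_rng(0)

def standing():
    # Sensor noise around gravity (z ~ 9.8 m/s^2), no motion.
    return rng.normal([0.0, 0.0, 9.8], 0.05, size=(50, 3))

def walking():
    # Vertical bobbing at a step-like frequency on top of gravity.
    t = np.linspace(0.0, 4.0 * np.pi, 50)
    w = rng.normal([0.0, 0.0, 9.8], 0.05, size=(50, 3))
    w[:, 2] += 2.0 * np.sin(t)
    return w

# Synthetic training set: 20 windows of each class.
X = np.array([extract_features(standing()) for _ in range(20)] +
             [extract_features(walking()) for _ in range(20)])
y = np.array([0] * 20 + [1] * 20)  # 0 = not walking, 1 = walking

clf = svm.SVC(kernel="rbf")
clf.fit(X, y)

def drone_command(accel_window):
    """Map the classified movement to a forward/hover command."""
    walking_detected = clf.predict([extract_features(accel_window)])[0] == 1
    return "forward" if walking_detected else "hover"
```

In practice the classifier would be trained on real sensor windows recorded while standing and walking in place, and the predicted label would gate the drone's forward-motion command each control tick.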

Keywords

Drone / Virtual reality / Human computer interaction / Natural user interface / Machine learning / Support vector machine

Cite this article

Hironori Hiraishi. Drone Operation with Human Natural Movement. Drones Auton. Veh. 2025, 2(3): 10011. DOI: 10.70322/dav.2025.10011


Acknowledgments

We would like to thank Editage (www.editage.jp) for English language editing.

Ethics Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

There are no specific research data related to this paper. A video showing the results of this research can be accessed at https://youtu.be/SNwuBOT4yog?si=W-EoSh3KaT_RMjQ7 (accessed on 10 May 2025).

Funding

This research received no external funding.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

