Design and control algorithm of a motion sensing-based fruit harvesting robot

Ziwen CHEN, Yuhang CHEN, Hui LI, Pei WANG

Front. Agr. Sci. Eng., DOI: 10.15302/J-FASE-2024588
RESEARCH ARTICLE


Highlights

● An optimized four-step inverse kinematic solution method ensures smooth and precise motion with minimal mechanical interference.

● The robot achieves a fast response time of 74.4 ms, and the average target-picking duration fell to 6.5 s after operator training.

● The system simplifies the picking process using gesture recognition.

Abstract

In response to the demand for automatic fruit identification and harvesting, this paper presents a human-robot collaborative picking robot based on somatosensory interactive servo control. The robot system consists of four parts: a picking execution mechanism, a hand-information acquisition system, a human-machine interaction interface, and a human-robot collaborative picking strategy. A six-degree-of-freedom robotic arm was designed as the picking execution mechanism, and the D-H method was employed for both forward and inverse kinematic modeling of the arm. A four-step method for selecting the optimal inverse kinematic solution was proposed, which successively checks mechanical interference, correctness, rationality, and smoothness of motion. The working principle and use of the Leap Motion controller for hand-information acquisition were introduced, and data from three types of hand movements were collected and analyzed. A spatial mapping method between the Leap Motion interaction space and the operating range of the robotic arm was proposed to establish a direct correspondence between the cubic interaction box and the fan-ring-shaped cylindrical workspace of the robotic arm. The test results demonstrated that the average response time to the double-click picking command was 332 ms, the average duration of somatosensory-controlled targeting was 6.5 s, and the accuracy of the picking-gesture judgment was 96.7%.
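Two of the computational steps outlined above can be sketched in code: composing standard D-H link transforms into the arm's forward kinematics, and mapping a point in the Leap Motion's cubic interaction box onto a fan-ring (cylindrical-sector) workspace. The sketch below is illustrative only, assuming the standard D-H convention; the link parameters, workspace radii, and sector angle are hypothetical placeholders, not the values used by the authors.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform for one link under the standard D-H convention."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, link_params):
    """Chain per-joint D-H transforms; returns the end-effector pose matrix."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, link_params):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

def box_to_fan_ring(p, r_min=0.2, r_max=0.8, span=np.pi / 2, h_max=0.6):
    """Map normalized interaction-box coordinates p in [-1, 1]^3 onto a
    fan-ring workspace: x -> azimuth, z -> radial reach, y -> height.
    All workspace dimensions here are hypothetical placeholders."""
    x, y, z = p
    phi = 0.5 * (x + 1.0) * span - span / 2.0      # azimuth within the sector
    r = r_min + 0.5 * (z + 1.0) * (r_max - r_min)  # radial reach
    h = 0.5 * (y + 1.0) * h_max                    # vertical height
    return np.array([r * np.cos(phi), r * np.sin(phi), h])
```

For instance, with all joint angles zero and two unit-length links (a = 1, d = 0, α = 0), the end-effector sits at (2, 0, 0), and the center of the interaction box maps to the mid-radius, mid-height point of the sector.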

Keywords

Harvesting robots / human-machine interaction / human-robot collaboration / somatosensory control / Leap Motion controller

Cite this article

Ziwen CHEN, Yuhang CHEN, Hui LI, Pei WANG. Design and control algorithm of a motion sensing-based fruit harvesting robot. Front. Agr. Sci. Eng., https://doi.org/10.15302/J-FASE-2024588


Acknowledgements

This research was supported by the Key R&D Projects in the Artificial Intelligence Pilot Area of Chongqing, China (cstc2021jscx-gksbX0067), the Fundamental Research Funds for the Central Universities (SWU-KT22024), and the Local Financial Funds of the National Agricultural Science & Technology Center, Chengdu (NASC2021KR02).

Compliance with ethics guidelines

Ziwen Chen, Yuhang Chen, Hui Li, and Pei Wang declare that they have no conflicts of interest or financial conflicts to disclose. This article does not contain any studies with human or animal subjects performed by any of the authors.

RIGHTS & PERMISSIONS

© The Author(s) 2024. Published by Higher Education Press. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0).