Soft objects grasping evaluation using a novel VCFN-YOLOv8 framework

Guoshun Cui, Shiwei Su, Hanyu Gao, Kai Zhuo, Kun Yang, Hang Wu

Biomimetic Intelligence and Robotics ›› 2025, Vol. 5 ›› Issue (3): 100232. DOI: 10.1016/j.birob.2025.100232
Research article

Abstract

Humans can quickly perform adaptive grasping of soft objects by visually perceiving them and judging the grasping angle, which helps prevent the objects from sliding or deforming excessively. This seemingly easy task, however, remains a challenge for robots. The grasping states of soft objects can be categorized into four types: sliding, appropriate, excessive, and extreme. Effective recognition of these states is crucial for achieving adaptive grasping of soft objects. To address this problem, a novel visual-curvature fusion network based on YOLOv8 (VCFN-YOLOv8) is proposed to evaluate the grasping state of various soft objects. In this framework, a robotic arm equipped with a wrist camera and a curvature sensor performs generalized grasping and lifting experiments on 11 different objects, and a dataset is built for training and testing the proposed method. The results show a classification accuracy of 99.51% over the four grasping states. A series of grasping evaluation experiments is conducted based on the proposed framework, along with tests of the model's generality. The experimental results demonstrate that VCFN-YOLOv8 evaluates the grasping state of soft objects accurately and efficiently and generalizes to some degree to non-soft objects. It can be widely applied in fields such as automatic control, adaptive grasping, and surgical robotics.
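The abstract does not detail the internal architecture of VCFN-YOLOv8. As an illustration only, the core idea it describes (fusing visual features with curvature-sensor readings to classify the four grasping states) could be sketched as a simple late-fusion head; all dimensions, the concatenation scheme, and the single linear layer below are assumptions for this sketch, not the authors' design:

```python
import numpy as np

# The four grasping states defined in the paper's abstract.
GRASP_STATES = ["sliding", "appropriate", "excessive", "extreme"]

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class FusionHead:
    """Toy late-fusion classifier (illustrative only): concatenates a visual
    feature vector (e.g., pooled from a detector backbone such as YOLOv8's)
    with curvature-sensor features, then applies one linear layer + softmax
    over the four grasping states."""

    def __init__(self, vis_dim=256, curv_dim=8, seed=0):
        rng = np.random.default_rng(seed)
        # Untrained random weights; in practice these would be learned.
        self.W = rng.standard_normal((vis_dim + curv_dim, len(GRASP_STATES))) * 0.01
        self.b = np.zeros(len(GRASP_STATES))

    def predict(self, vis_feat, curv_feat):
        # Simple feature-level concatenation ("late fusion").
        fused = np.concatenate([vis_feat, curv_feat], axis=-1)
        probs = softmax(fused @ self.W + self.b)
        return GRASP_STATES[int(probs.argmax())], probs
```

In an actual system the visual branch would be a trained detector backbone and the curvature branch a calibrated sensor reading; this sketch only shows how the two modalities can be combined into one four-way decision.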

Keywords

Grasping evaluation / Multimodal fusion / Intelligent perception / YOLOv8

Cite this article

Guoshun Cui, Shiwei Su, Hanyu Gao, Kai Zhuo, Kun Yang, Hang Wu. Soft objects grasping evaluation using a novel VCFN-YOLOv8 framework. Biomimetic Intelligence and Robotics, 2025, 5(3): 100232. DOI: 10.1016/j.birob.2025.100232


CRediT authorship contribution statement

Guoshun Cui: Writing - original draft, Validation, Resources, Methodology, Formal analysis. Shiwei Su: Writing - review & editing, Visualization, Software, Formal analysis. Hanyu Gao: Writing - original draft, Visualization, Validation, Investigation, Data curation. Kai Zhuo: Writing - review & editing, Validation. Kun Yang: Writing - review & editing, Supervision, Funding acquisition, Formal analysis. Hang Wu: Writing - review & editing, Supervision, Conceptualization.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

This work was supported by the Fundamental Research Project of Shanxi Province (202403021211229).

Appendix A. Supplementary data

Supplementary material related to this article can be found online at https://doi.org/10.1016/j.birob.2025.100232.

