Development of a novel hand–eye calibration for intuitive control of minimally invasive surgical robot

Frontiers of Mechanical Engineering, 2022, Vol. 17, Issue 3: 42. DOI: 10.1007/s11465-022-0698-y
Mechanisms and Robotics - RESEARCH ARTICLE


Abstract

Robot-assisted surgical systems provide a powerful platform that combines dexterous instruments with intuitive, hand–eye coordinated control. Knowledge of the laparoscopic view is crucial for robot-assisted minimally invasive surgery aimed at improving surgical outcomes. Obtaining the transformation between the laparoscope frame and the robot slave arm frame through hand–eye calibration is therefore essential, and it is a key component of any intuitive control algorithm. In this study, we propose a novel two-step modified dual-quaternion method for hand–eye calibration. Dual quaternions are exploited to solve the hand–eye equation simultaneously, refined by an iterative separate solution. The resulting hand–eye calibration is then applied to intuitive control through a hand–eye coordination criterion. Simulations and experiments on our surgical robot system were conducted to evaluate the proposed method, which we compared extensively with state-of-the-art methods. The results demonstrate that the method improves calibration accuracy. The effectiveness of the intuitive control algorithm was quantitatively evaluated, and with the improved hand–eye calibration method, the relationship between the laparoscope and the robot kinematics can be established for intuitive control.

Keywords

minimally invasive surgery / hand–eye calibration / intuitive control / surgical robot / dual quaternion
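The calibration described in the abstract addresses the classical hand–eye equation AX = XB, where A is the laparoscope motion, B the robot motion, and X the sought hand–eye transformation. As an illustration only, here is a minimal NumPy sketch of a standard rotation-then-translation least-squares solution in the style of Park and Martin — not the paper's two-step modified dual-quaternion method. The function names `solve_ax_xb` and `rot_log` are my own:

```python
import numpy as np

def rot_log(R):
    """Logarithm map of a rotation matrix -> axis-angle vector."""
    c = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-10:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta / (2.0 * np.sin(theta)) * w

def solve_ax_xb(As, Bs):
    """Solve A_i X = X B_i for the 4x4 hand-eye transform X.

    Rotation: orthogonal Procrustes fit of the motion axes,
    since alpha_i = R_X beta_i for the log vectors of R_A, R_B.
    Translation: linear least squares on (R_Ai - I) t_X = R_X t_Bi - t_Ai.
    """
    # Accumulate the cross-covariance of the rotation axes.
    M = sum(np.outer(rot_log(A[:3, :3]), rot_log(B[:3, :3]))
            for A, B in zip(As, Bs))
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])  # enforce det = +1
    Rx = U @ D @ Vt
    # Stack the translation constraints from every motion pair.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tx, *_ = np.linalg.lstsq(C, d, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tx
    return X
```

With noise-free motion pairs generated from a known X, the solver recovers X; at least two motions with non-parallel rotation axes are required for the solution to be unique.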



Nomenclature

A  Matrix of laparoscope motion
B  Matrix of robot motion
K  Matrix for solving the hand–eye equation
L  Modified matrix for the hand–eye equation solution
M  Matrix vector
N  Number of experimental test data groups
P_BL  Robot laparoscope slave arm base frame
P_E  Robot laparoscope slave arm end-effector frame
P_L  Laparoscope frame
P_O  Calibration object frame
^base P_ins  Translation vector from the surgical robot slave instrument arm base frame to the instrument frame
^eye P_ins(i)  Position of the surgical instrument in the surgeon eye frame at time step i
^img P_ins(i)  Instrument position with respect to the laparoscope camera image frame
^eye P_hand(i)  Position of the surgeon hand in the surgeon eye frame at time step i
^mas P_hand  Translation vector from the master system base frame to the surgeon hand frame
q_A, q_X, q_B  Dual quaternion representations of Eq. (2)
q_AR, q_XR, q_BR  Rotation components
q_AR0, q_BR0  Components of q_AR and q_BR, respectively
q_At, q_Xt, q_Bt  Translation components
^lap R_base  Rotation matrix component of ^lap T_base
^mas R_hand  Rotation matrix component of the homogeneous transformation matrix from the master system base frame to the surgeon hand frame
^base R_ins  Rotation matrix component of the homogeneous transformation matrix from the surgical robot slave instrument arm base frame to the instrument frame
^img R_lap  Rotation matrix component of the homogeneous transformation matrix from the laparoscope camera image frame to the robot laparoscope slave arm end tool frame
^eye R_mas  Rotation matrix component of ^eye T_mas
t_A, t_B, t_X  Translation components of the homogeneous transformation matrices in AX = XB
t_GT  Ground-truth translation value
^lap T_base  Transformation matrix from the robot laparoscope end tool frame to the robot instrument slave arm base frame
^L T_E  Transformation between P_L and P_E
^eye T_hand  Homogeneous transformation from the surgeon eye frame to the surgeon hand frame
^mas T_hand  Homogeneous transformation matrix from the master system base frame to the surgeon hand frame
^eye T_img  Transformation matrix from the surgeon eye frame to the laparoscope camera image frame
^base T_ins  Homogeneous transformation matrix from the surgical robot slave instrument arm base frame to the instrument frame
^eye T_ins  Homogeneous transformation from the surgeon eye frame to the instrument frame
^img T_ins  Homogeneous transformation from the laparoscope camera image frame to the instrument frame
^img T_lap  Transformation matrix from the laparoscope camera image frame to the robot laparoscope slave arm end tool frame
^eye T_mas  Homogeneous transformation from the surgeon eye frame to the master system base frame
^E T_BL(i)  Motion i of the transformation between the robot laparoscope slave arm base frame P_BL and the end-effector frame P_E
^L T_O(i)  Motion i of the transformation between the calibration object frame P_O and the laparoscope frame P_L
X  Target homogeneous transformation matrix
ω_A, ω_B, ω_X  Rotation components of the homogeneous transformation matrices in AX = XB
ω_GT  Ground-truth rotation value
γ_1, γ_2  Parameters for the dual quaternion solution
τ, υ  Parameters for the dual quaternion solution of Eq. (9)
κ  Defined ratio variable for solving the equation
α_A, β_A, α_B, β_B  Lie group members of the transformations
λ  Master–slave position mapping scale factor
Δr  Rotation matrix error
Δt  Translation vector error
θ_t  Translation error between the calculated value and the ground-truth value
θ_ω  Rotation error between the calculated value and the ground-truth value
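The error metrics θ_ω and θ_t in the nomenclature measure the rotation and translation discrepancies between a calculated transformation and ground truth. A minimal sketch of the conventional definitions — the geodesic angle between rotations and the Euclidean distance between translations; the paper's exact formulas may differ:

```python
import numpy as np

def rotation_error_deg(R_est, R_gt):
    """Geodesic angle (degrees) between estimated and ground-truth rotations."""
    c = np.clip((np.trace(R_gt.T @ R_est) - 1.0) / 2.0, -1.0, 1.0)
    return np.degrees(np.arccos(c))

def translation_error(t_est, t_gt):
    """Euclidean distance between estimated and ground-truth translations."""
    return np.linalg.norm(np.asarray(t_est) - np.asarray(t_gt))
```

For example, a rotation of 10° about any axis relative to ground truth yields `rotation_error_deg` of 10.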

Acknowledgements

The authors declare that they have no competing financial interests or personal relationships that could have appeared to influence the work reported in this paper. This study was supported by the State Key Laboratory of Robotics and Systems, China (Grant No. SKLRS202009B).

Copyright

2022 Higher Education Press