Development of a novel hand−eye calibration for intuitive control of minimally invasive surgical robot

Yanwen SUN, Bo PAN, Yili FU

Front. Mech. Eng., 2022, 17(3): 42. DOI: 10.1007/s11465-022-0698-y
RESEARCH ARTICLE


Abstract

Robotic-assisted surgical systems provide a powerful platform that combines dexterous instruments with intuitive hand–eye coordinated control. Laparoscopic vision is a crucial source of information for robot-assisted minimally invasive surgery aimed at improving surgical outcomes. Obtaining the transformation between the laparoscope frame and the robot slave arm frame through hand–eye calibration is therefore essential, and it is a key component in developing an intuitive control algorithm. In this study, we propose a novel two-step modified dual-quaternion method for hand–eye calibration. Dual quaternions are exploited to solve the hand–eye equation simultaneously, complemented by an iterative separate solution. The resulting hand–eye calibration is then applied to intuitive control through a hand–eye coordination criterion. Simulations and experimental studies were conducted on our surgical robot system to evaluate the proposed method, which was compared extensively with state-of-the-art methods. The results demonstrate that the method improves calibration accuracy. The effectiveness of the intuitive control algorithm was also evaluated quantitatively. With the improved hand–eye calibration method, the relationship between the laparoscope and the robot kinematics can be established for intuitive control.
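The calibration problem described in the abstract is the classic AX = XB formulation (A: laparoscope motion, B: robot slave arm motion, X: the unknown hand–eye transformation). The paper's two-step modified dual-quaternion solver is not reproduced here; as a minimal illustrative baseline only, the equation can be solved in closed form with a rotation fit on log-map (rotation-vector) pairs followed by a linear least-squares translation step, in the spirit of Park and Martin's classical method. The function name `solve_ax_xb` and the synthetic data are illustrative assumptions:

```python
import numpy as np
from scipy.spatial.transform import Rotation


def solve_ax_xb(A_list, B_list):
    """Closed-form AX = XB solver (classical baseline, not the paper's
    two-step dual-quaternion method). A_list: laparoscope motions,
    B_list: robot motions, both lists of 4x4 homogeneous matrices."""
    # Rotation step: the log maps satisfy log(R_A) = R_X @ log(R_B),
    # so R_X is the orthogonal Procrustes fit between the two axis sets.
    alphas = np.array([Rotation.from_matrix(A[:3, :3]).as_rotvec() for A in A_list])
    betas = np.array([Rotation.from_matrix(B[:3, :3]).as_rotvec() for B in B_list])
    U, _, Vt = np.linalg.svd(alphas.T @ betas)      # sum_i alpha_i beta_i^T
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])  # enforce det(R_X) = +1
    R_X = U @ D @ Vt
    # Translation step: (R_A - I) t_X = R_X t_B - t_A, stacked over all
    # motion pairs and solved by linear least squares.
    lhs = np.vstack([A[:3, :3] - np.eye(3) for A in A_list])
    rhs = np.concatenate([R_X @ B[:3, 3] - A[:3, 3] for A, B in zip(A_list, B_list)])
    t_X, *_ = np.linalg.lstsq(lhs, rhs, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_X, t_X
    return X


# Synthetic check: build a ground-truth X, generate robot motions B_i,
# derive the matching camera motions A_i = X B_i X^-1, and recover X.
rng = np.random.default_rng(7)
X_true = np.eye(4)
X_true[:3, :3] = Rotation.from_rotvec([0.3, -0.2, 0.5]).as_matrix()
X_true[:3, 3] = [0.10, 0.05, -0.20]
B_list = []
for _ in range(6):
    B = np.eye(4)
    B[:3, :3] = Rotation.from_rotvec(rng.uniform(-1.0, 1.0, size=3)).as_matrix()
    B[:3, 3] = rng.uniform(-0.1, 0.1, size=3)
    B_list.append(B)
A_list = [X_true @ B @ np.linalg.inv(X_true) for B in B_list]
X_est = solve_ax_xb(A_list, B_list)
```

With noise-free synthetic motions the estimate matches the ground truth to machine precision; with real measurements, the least-squares structure of both steps averages noise across motion pairs, which is why methods of this family need several non-parallel rotation axes in the data.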

Keywords

minimally invasive surgery / hand−eye calibration / intuitive control / surgical robot / dual quaternion

Cite this article

Yanwen SUN, Bo PAN, Yili FU. Development of a novel hand−eye calibration for intuitive control of minimally invasive surgical robot. Front. Mech. Eng., 2022, 17(3): 42 https://doi.org/10.1007/s11465-022-0698-y

Nomenclature

A  Matrix of laparoscope motion
B  Matrix of robot motion
K  Matrix for solving the hand–eye equation
L  Modified matrix for the hand–eye equation solution
M  Matrix vector
N  Number of experimental test data groups
P_BL  Robot laparoscope slave arm base frame
P_E  Robot laparoscope slave arm end-effector frame
P_L  Laparoscope frame
P_O  Calibration object frame
^ins_base P  Translation vector from the surgical robot slave instrument arm base frame to the instrument frame
^ins_eye P_i  Position of the surgical instrument in the surgeon eye frame at time step i
^ins_img P_i  Instrument position with respect to the laparoscope camera image frame
^hand_eye P_i  Position of the surgeon hand in the surgeon eye frame at time step i
^hand_mas P  Translation vector from the master system base frame to the surgeon hand frame
q_A, q_X, q_B  Dual quaternion representations of Eq. (2)
q_AR, q_XR, q_BR  Rotation components
q_AR0, q_BR0  Components of q_AR and q_BR, respectively
q_At, q_Xt, q_Bt  Translation components
^base_lap R  Rotation matrix component of ^base_lap T
^hand_mas R  Rotation matrix component of the homogeneous transformation matrix from the master system base frame to the surgeon hand frame
^ins_base R  Rotation matrix component of the homogeneous transformation matrix from the surgical robot slave instrument arm base frame to the instrument frame
^lap_img R  Rotation matrix component of the homogeneous transformation matrix from the laparoscope camera image frame to the robot laparoscope slave arm end tool frame
^mas_eye R  Rotation matrix component of ^mas_eye T
t_A, t_B, t_X  Translation components of the homogeneous transformation matrices in AX = XB
t_GT  Ground-truth translation value
^base_lap T  Transformation matrix from the robot laparoscope end tool frame to the robot instrument slave arm base frame
^E_L T  Transformation between P_L and P_E
^hand_eye T  Homogeneous transformation from the surgeon eye frame to the surgeon hand frame
^hand_mas T  Homogeneous transformation matrix from the master system base frame to the surgeon hand frame
^img_eye T  Transformation matrix from the surgeon eye frame to the laparoscope camera image frame
^ins_base T  Homogeneous transformation matrix from the surgical robot slave instrument arm base frame to the instrument frame
^ins_eye T  Homogeneous transformation from the surgeon eye frame to the instrument frame
^ins_img T  Homogeneous transformation from the laparoscope camera image frame to the instrument frame
^lap_img T  Transformation matrix from the laparoscope camera image frame to the robot laparoscope slave arm end tool frame
^mas_eye T  Homogeneous transformation from the surgeon eye frame to the master system base frame
^BL_E T(i)  Motion i of the transformation between the robot laparoscope slave arm base frame P_BL and the robot laparoscope slave arm end-effector frame P_E
^O_L T(i)  Motion i of the transformation between the calibration object frame P_O and the laparoscope frame P_L
X  Target homogeneous transformation matrix
ω_A, ω_B, ω_X  Rotation components of the homogeneous transformation matrices in AX = XB
ω_GT  Ground-truth rotation value
γ_1, γ_2  Parameters for the dual quaternion solution
τ, υ  Parameters for the dual quaternion solution of Eq. (9)
κ  Defined ratio variable for solving the equation
α_A, β_A, α_B, β_B  Lie group members of the transformations
λ  Master–slave position mapping scale factor
Δr  Rotation matrix error
Δt  Translation vector error
θ_t  Translation error between the calculated value and the ground-truth value
θ_ω  Rotation error between the calculated value and the ground-truth value
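The hand–eye coordination criterion behind several of these symbols (the scale factor λ and the eye, master, and image frames) can be illustrated with a minimal sketch: a surgeon-hand displacement on the master side is expressed in the surgeon eye frame, the same apparent direction is commanded in the laparoscope image frame on the slave side, and the result is scaled by λ. The function and frame-rotation names below are hypothetical illustrations, not the paper's implementation:

```python
import numpy as np


def master_to_slave_increment(d_hand_mas, R_mas_to_eye, R_eye_to_img, lam):
    """Illustrative master-to-slave position mapping (hypothetical names).
    d_hand_mas: hand displacement in the master base frame,
    R_mas_to_eye, R_eye_to_img: frame-alignment rotations,
    lam: master-slave position mapping scale factor."""
    d_eye = R_mas_to_eye @ d_hand_mas   # hand motion as the surgeon sees it
    d_img = R_eye_to_img @ d_eye        # same apparent direction in the image frame
    return lam * d_img                  # scaled instrument increment command


# With aligned frames and lam = 0.5, a 2 mm hand motion commands a 1 mm
# instrument motion while preserving the on-screen direction of travel.
d_ins = master_to_slave_increment(np.array([2.0, 0.0, 0.0]), np.eye(3), np.eye(3), 0.5)
```

In this formulation, the quality of the hand–eye calibration determines how accurately the frame-alignment rotations are known, and hence how intuitive the resulting motion mapping feels.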

Acknowledgements

This study was supported by the State Key Laboratory of Robotics and Systems, China (Grant No. SKLRS202009B). The authors declare that they have no competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

RIGHTS & PERMISSIONS

© 2022 Higher Education Press