Automatic assessment of robotic suturing utilizing computer vision in a dry-lab simulation

Sarah Choksi, Sanjeev Narasimhan, Mattia Ballo, Mehmet Turkcan, Yiran Hu, Chengbo Zang, Alex Farrell, Brianna King, Jeffrey Nussbaum, Adin Reisner, Zoran Kostic, Giovanni Taffurelli, Filippo Filicori

Artificial Intelligence Surgery ›› 2025, Vol. 5 ›› Issue (2): 160-9. DOI: 10.20517/ais.2024.84
Original Article

Abstract

Aim: Automated surgical skill assessment is poised to become an invaluable asset in surgical residency training. In this study, we aimed to create deep learning (DL)-based computer vision models capable of automatically assessing trainee performance and determining proficiency on robotic suturing tasks.

Methods: Participants performed two robotic suturing tasks on a bench-top model created by our lab. Each surgeon was recorded performing a backhand suturing task and a railroad suturing task at 30 frames per second (FPS), and the videos were downsampled to 15 FPS for the study. Each video was segmented into four sub-stitch phases: needle positioning, targeting, driving, and withdrawal. Each sub-stitch was annotated with a binary technical score (ideal or non-ideal) reflecting the operator's skill while performing the suturing action. For DL analysis, overlapping 16-frame clips were sampled from the videos with a stride of 1. To extract features useful for classification, two pretrained Video Swin Transformer models were fine-tuned on these clips: one to classify the sub-stitch phase and another to predict the technical score. The model outputs were then combined and used to train a Random Forest Classifier to predict the surgeon's proficiency level.
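
As a minimal sketch of how such a pipeline could be assembled from off-the-shelf components (not the authors' implementation), the Python snippet below fine-tunes two torchvision Video Swin Transformer backbones on 16-frame clips and feeds their pooled outputs to a scikit-learn Random Forest; the head sizes, tensor shapes, and the way clip-level outputs are aggregated into per-video features are illustrative assumptions.

# Minimal sketch of the two-stage pipeline described above; not the authors' code.
# Assumes torchvision >= 0.15 (swin3d_b) and scikit-learn. The feature pooling
# used for the Random Forest is an illustrative assumption.
import torch
import torch.nn as nn
from torchvision.models.video import swin3d_b, Swin3D_B_Weights
from sklearn.ensemble import RandomForestClassifier

NUM_PHASES = 4  # needle positioning, targeting, driving, withdrawal
NUM_SCORES = 2  # ideal vs. non-ideal technical score


def make_swin_classifier(num_classes: int) -> nn.Module:
    """Pretrained Video Swin Transformer with a task-specific linear head for fine-tuning."""
    model = swin3d_b(weights=Swin3D_B_Weights.KINETICS400_V1)
    model.head = nn.Linear(model.head.in_features, num_classes)
    return model


phase_model = make_swin_classifier(NUM_PHASES)  # sub-stitch phase classifier
score_model = make_swin_classifier(NUM_SCORES)  # technical score classifier

# Overlapping 16-frame clips (stride 1) from 15-FPS video,
# shaped (batch, channels, frames, height, width).
clips = torch.randn(4, 3, 16, 224, 224)  # dummy batch standing in for real clips
with torch.no_grad():
    phase_probs = phase_model(clips).softmax(dim=1)  # (4, NUM_PHASES)
    score_probs = score_model(clips).softmax(dim=1)  # (4, NUM_SCORES)

# One possible per-video feature: clip-averaged outputs of both models.
video_feature = torch.cat([phase_probs, score_probs], dim=1).mean(dim=0).numpy()

# A Random Forest is then trained on such per-video features to predict
# surgeon proficiency (fit() left as a placeholder here).
rf = RandomForestClassifier(n_estimators=100, random_state=0)
# rf.fit(all_video_features, proficiency_labels)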

Results: A total of 102 videos from 27 surgeons were evaluated using 3-fold cross-validation: 51 videos for the backhand suturing task and 51 for the railroad suturing task. Performance was assessed on sub-stitch classification accuracy, technical score accuracy, and surgeon proficiency prediction. On the test folds, the clip-based Video Swin Transformer models achieved average accuracies of 70.23% for sub-stitch classification and 68.4% for technical score prediction. Combining the model outputs, the Random Forest Classifier achieved an average accuracy of 66.7% in predicting surgeon proficiency.
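
In outline, the reported surgeon-level evaluation could be reproduced with a standard 3-fold cross-validation loop like the sketch below; the feature dimensionality, the binary proficiency labels, and the grouping of folds by surgeon identity are assumptions added for illustration, as the abstract reports only the fold-averaged accuracies.

# Hedged sketch of a 3-fold evaluation of the proficiency classifier; the data
# arrays and the surgeon-wise grouping are placeholder assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import GroupKFold

rng = np.random.default_rng(0)
X = rng.random((102, 6))                     # per-video features (placeholder)
y = rng.integers(0, 2, size=102)             # proficiency labels (placeholder)
surgeon_ids = rng.integers(0, 27, size=102)  # one surgeon ID per video (placeholder)

fold_accuracies = []
for train_idx, test_idx in GroupKFold(n_splits=3).split(X, y, groups=surgeon_ids):
    rf = RandomForestClassifier(n_estimators=100, random_state=0)
    rf.fit(X[train_idx], y[train_idx])
    fold_accuracies.append(accuracy_score(y[test_idx], rf.predict(X[test_idx])))

print(f"mean 3-fold accuracy: {np.mean(fold_accuracies):.3f}")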

Conclusion: This study demonstrates the feasibility of a DL-based automatic assessment tool for robotic-assisted surgery. Using machine learning models, we predicted a surgeon's proficiency level with 66.7% accuracy. Our dry-lab model provides a standardized training and assessment tool for suturing tasks using computer vision.

Keywords

Automatic surgical skill assessment / computer vision / surgical education / simulation

Cite this article

Sarah Choksi, Sanjeev Narasimhan, Mattia Ballo, Mehmet Turkcan, Yiran Hu, Chengbo Zang, Alex Farrell, Brianna King, Jeffrey Nussbaum, Adin Reisner, Zoran Kostic, Giovanni Taffurelli, Filippo Filicori. Automatic assessment of robotic suturing utilizing computer vision in a dry-lab simulation. Artificial Intelligence Surgery, 2025, 5(2): 160-9. DOI: 10.20517/ais.2024.84
