AR-enhanced digital twin for human–robot interaction in manufacturing systems

Zhongyuan Liao , Yi Cai

Energy, Ecology and Environment: 1–19. DOI: 10.1007/s40974-024-00327-7
Original Article

Abstract

The integration of advanced technologies into manufacturing processes is critical for addressing the complexities of modern industrial environments. In particular, the realm of human–robot interaction (HRI) faces the challenge of ensuring that human operators can effectively collaborate with increasingly sophisticated robotic systems. Traditional interfaces often fall short of providing the intuitive, real-time interaction necessary for optimal performance and safety. To address this issue, we introduce a novel system that combines digital twin (DT) technology with augmented reality (AR) to enhance HRI in manufacturing settings. The proposed AR-based DT system creates a dynamic virtual model of robot operations, offering an immersive interface that overlays crucial information onto the user's field of vision. This approach aims to bridge the gap between human operators and robotic systems, improving spatial awareness, task guidance, and decision-making processes. Our system is designed to operate at three distinct levels of DT functionality: the virtual twin for in-situ monitoring, the hybrid twin for intuitive interaction, and the cognitive twin for optimized operation. By leveraging these levels, the system provides a comprehensive solution that ranges from basic visualization to advanced predictive analytics. The effectiveness of the AR-based DT system is demonstrated through a human-centric user study conducted in manufacturing scenarios. The results show a significant reduction in operational time and errors, alongside an enhancement of the overall user experience. These findings confirm the potential of our system to transform HRI by providing a safer, more efficient, and more adaptable manufacturing environment. Our research contributes to the advancement of smart manufacturing by demonstrating the synergistic benefits of integrating DT and AR into HRI.
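The three-level DT architecture described above (virtual, hybrid, cognitive) can be thought of as a cumulative capability hierarchy, where each level subsumes the ones below it. The sketch below is purely illustrative of that idea; the names `TwinLevel` and `capabilities` are our own and are not taken from the paper's implementation.

```python
from enum import Enum

class TwinLevel(Enum):
    """Three levels of DT functionality named in the abstract (illustrative model)."""
    VIRTUAL = "in-situ monitoring"       # mirrors live robot state for visualization
    HYBRID = "intuitive interaction"     # adds bidirectional, AR-mediated control
    COGNITIVE = "optimized operation"    # adds predictive analytics and optimization

def capabilities(level: TwinLevel) -> list:
    """Return the cumulative capabilities up to and including `level`,
    reflecting the assumption that each level builds on the previous one."""
    order = [TwinLevel.VIRTUAL, TwinLevel.HYBRID, TwinLevel.COGNITIVE]
    return [lvl.value for lvl in order[: order.index(level) + 1]]

print(capabilities(TwinLevel.COGNITIVE))
# ['in-situ monitoring', 'intuitive interaction', 'optimized operation']
```

Whether the levels are strictly cumulative in the authors' system is an assumption of this sketch; the abstract only states that the system "ranges from basic visualization to advanced predictive analytics."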

Keywords

Digital twin / Intuitive interface / Augmented reality / Human–robot interaction / Human-centricity

Cite this article

Zhongyuan Liao, Yi Cai. AR-enhanced digital twin for human–robot interaction in manufacturing systems. Energy, Ecology and Environment: 1–19. DOI: 10.1007/s40974-024-00327-7



Funding

Guangdong Provincial Department of Science and Technology (2021QN02Z112)
