AR-enhanced digital twin for human–robot interaction in manufacturing systems
Zhongyuan Liao, Yi Cai
Energy, Ecology and Environment: 1–19.
The integration of advanced technologies into manufacturing processes is critical for addressing the complexities of modern industrial environments. In particular, the realm of human–robot interaction (HRI) faces the challenge of ensuring that human operators can effectively collaborate with increasingly sophisticated robotic systems. Traditional interfaces often fall short of providing the intuitive, real-time interaction necessary for optimal performance and safety. To address this issue, we introduce a novel system that combines digital twin (DT) technology with augmented reality (AR) to enhance HRI in manufacturing settings. The proposed AR-based DT system creates a dynamic virtual model of robot operations, offering an immersive interface that overlays crucial information onto the user’s field of vision. This approach aims to bridge the gap between human operators and robotic systems, improving spatial awareness, task guidance, and decision-making processes. Our system is designed to operate at three distinct levels of DT functionality: the virtual twin for in-situ monitoring, the hybrid twin for intuitive interaction, and the cognitive twin for optimized operation. By leveraging these levels, the system provides a comprehensive solution that ranges from basic visualization to advanced predictive analytics. The effectiveness of the AR-based DT system is demonstrated through a human-centric user study conducted in manufacturing scenarios. The results show a significant reduction in operational time and errors, alongside an enhancement of the overall user experience. These findings confirm the potential of our system to transform HRI by providing a safer, more efficient, and more adaptable manufacturing environment. Our research contributes to the advancement of smart manufacturing by evidencing the synergistic benefits of integrating DT and AR into HRI.
Digital twin / Intuitive interface / Augmented reality / Human–robot interaction / Human-centricity
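To make the three DT functionality levels named in the abstract concrete, the following is a minimal illustrative sketch, not the authors' implementation: it models the virtual, hybrid, and cognitive twin levels as a plain Python hierarchy and shows how each level could shape the information overlaid in the AR view. All class names, fields, and the example prediction value are hypothetical assumptions.

```python
# Minimal illustrative sketch (hypothetical, not from the paper): the three DT
# functionality levels as a small Python model that emits AR overlay payloads.

from dataclasses import dataclass, field
from enum import Enum, auto
from typing import Dict, List


class TwinLevel(Enum):
    VIRTUAL = auto()    # in-situ monitoring: mirror live robot state
    HYBRID = auto()     # intuitive interaction: expose an operator command channel
    COGNITIVE = auto()  # optimized operation: add predictive analytics


@dataclass
class RobotState:
    joint_angles: List[float]
    tcp_position: List[float]   # tool-centre-point position [x, y, z] in metres
    status: str = "idle"


@dataclass
class ARDigitalTwin:
    """Toy digital twin that produces AR overlay content per functionality level."""
    level: TwinLevel
    state: RobotState = field(
        default_factory=lambda: RobotState([0.0] * 6, [0.0, 0.0, 0.0])
    )

    def update_from_robot(self, joint_angles: List[float], tcp_position: List[float]) -> None:
        # Virtual-twin behaviour: keep the model synchronised with the physical robot.
        self.state.joint_angles = joint_angles
        self.state.tcp_position = tcp_position

    def overlay_payload(self) -> Dict[str, object]:
        # Build the information overlaid onto the operator's field of view.
        payload: Dict[str, object] = {
            "joints": self.state.joint_angles,
            "tcp": self.state.tcp_position,
            "status": self.state.status,
        }
        if self.level in (TwinLevel.HYBRID, TwinLevel.COGNITIVE):
            payload["interaction"] = "operator command channel enabled"
        if self.level is TwinLevel.COGNITIVE:
            # Placeholder for predictive analytics, e.g. estimated time to completion.
            payload["prediction"] = {"eta_s": 12.5}
        return payload


if __name__ == "__main__":
    twin = ARDigitalTwin(level=TwinLevel.COGNITIVE)
    twin.update_from_robot([0.1, -0.4, 1.2, 0.0, 0.7, 0.0], [0.45, 0.10, 0.30])
    print(twin.overlay_payload())
```

In such a design, the same twin object serves all three levels; raising the level only enriches the overlay, which mirrors the abstract's progression from basic visualization to interaction to predictive support.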