Digital twin- and extended reality-based telepresence for collaborative robot programming in the 6G perspective

Davide Calandra , F. Gabriele Pratticò , Alberto Cannavò , Claudio Casetti , Fabrizio Lamberti

2024, Vol. 10, Issue 2: 315-327. DOI: 10.1016/j.dcan.2022.10.007

Research article


Abstract

In the context of Industry 4.0, a paradigm shift from traditional industrial manipulators to Collaborative Robots (CRs) is ongoing, with the latter increasingly serving humans as auxiliary tools in many production processes. In this scenario, continuous technological advancements offer new opportunities for further innovation in robotics and other areas of next-generation industry. For example, 6G could play a prominent role thanks to its human-centric view of the industrial domain. In particular, its expected dependability features will pave the way for new applications exploiting highly effective Digital Twin (DT)- and eXtended Reality (XR)-based telepresence. In this work, a novel application of the above technologies is proposed that allows two distant users to collaborate in the programming of a CR. The approach encompasses demanding data flows (e.g., point cloud-based streaming of the collaborating users and of the robotic environment) subject to network latency and bandwidth constraints. Results obtained by analyzing the approach from the viewpoint of network requirements, in a setup designed to emulate 6G connectivity, indicate that the expected performance of forthcoming mobile networks will, in principle, make it fully feasible.
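To give a sense of why point cloud-based streaming is a demanding data flow, the raw bitrate can be estimated as points per frame × bytes per point × frame rate. The sketch below uses illustrative figures (a 640×480 depth frame and a 15-byte point layout are assumptions, not the paper's measured setup):

```python
# Back-of-the-envelope bitrate estimate for an uncompressed point-cloud
# stream; the figures below are illustrative assumptions, not measurements.

def stream_bandwidth_mbps(points_per_frame: int, bytes_per_point: int, fps: int) -> float:
    """Raw bitrate in Mbit/s: points x bytes x frames per second x 8 bits."""
    return points_per_frame * bytes_per_point * fps * 8 / 1e6

# Assumed example: a 640x480 depth-camera frame (~307k points), 15 bytes
# per point (3 x 4-byte float coordinates + 3 bytes RGB), at 30 fps.
bw = stream_bandwidth_mbps(640 * 480, 15, 30)
print(f"{bw:.0f} Mbit/s")  # on the order of 1 Gbit/s uncompressed
```

Even this single uncompressed stream exceeds typical sustained 5G uplink rates, which is why the abstract frames such telepresence flows as a target workload for 6G-class links.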

Keywords

6G / Digital twin / Telepresence / Collaborative robots / Industry 4.0 / Augmented reality / Virtual reality / Point cloud streaming

Cite this article

Davide Calandra, F. Gabriele Pratticò, Alberto Cannavò, Claudio Casetti, Fabrizio Lamberti. Digital twin- and extended reality-based telepresence for collaborative robot programming in the 6G perspective, 2024, 10(2): 315-327. DOI: 10.1016/j.dcan.2022.10.007


