Abstract
Ubiquitous augmented reality (UAR) implementation can benefit smart shop floor operations significantly. UAR from a user’s first-person view can provide the user with suitable and comprehensive information without distracting him/her from ongoing tasks. A natural hand-based interaction interface, namely, a mobile bare-hand interface (MBHI), is proposed to assist a user in exploring and navigating the large amount of information for a task in the user’s first-person view. The integration of a smart shop floor and UAR-based MBHI is particularly challenging. A real shop floor environment poses challenging conditions for the implementation of UAR, e.g., cluttered backgrounds and significant changes in illumination. Meanwhile, the MBHI is required to provide precise and quick responses to minimize the difficulty of a user’s task. In this study, a wearable UAR system integrated with an MBHI is proposed to augment the shop floor environment with smart information. A case study is implemented to demonstrate the practicality and effectiveness of the proposed UAR and MBHI system.
Keywords
Augmented reality (AR) / Mobile bare-hand interaction (MBHI) / Smart objects (SOs) / Manufacturing shop-floor
Cite this article
S. K. Ong, X. Wang, A. Y. C. Nee.
3D bare-hand interactions enabling ubiquitous interactions with smart objects.
Advances in Manufacturing, 2020, 8(2): 133-143. DOI: 10.1007/s40436-020-00295-1
Funding
Singapore A*STAR Agency for Science, Technology and Research (Project No. 1521200081)