Performance analysis of visual markers for indoor navigation systems

Gaetano C. LA DELFA, Salvatore MONTELEONE, Vincenzo CATANIA, Juan F. DE PAZ, Javier BAJO

Front. Inform. Technol. Electron. Eng., 2016, 17(8): 730-740. DOI: 10.1631/FITEE.1500324
Article


Abstract

The massive diffusion of smartphones, the growing interest in wearable devices and the Internet of Things, and the exponential rise of location-based services (LBSs) have made localization and navigation inside buildings one of the most important technological challenges of recent years. Indoor positioning systems have a huge market in retail and contextual advertising; in addition, they can be fundamental to improving citizens' quality of life when deployed inside public buildings such as hospitals, airports, and museums. In emergency situations, they can even make the difference between life and death. Various approaches have been proposed in the literature. Recently, thanks to the high performance of smartphone cameras, marker-less and marker-based computer vision approaches have been investigated. In a previous paper, we proposed a technique for indoor localization and navigation using both Bluetooth low energy (BLE) and a 2D visual marker system deployed on the floor. In this paper, we present a qualitative performance evaluation of three 2D visual marker systems, Vuforia, ArUco, and AprilTag, which are suitable for real-time applications. Our analysis focuses on the specific case study of visual markers placed on floor tiles, with the aim of improving the efficiency of our indoor localization and navigation approach by choosing the best visual marker system.
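To give a concrete sense of the per-frame measurement such an evaluation involves, the sketch below times marker detection for ArUco- and AprilTag-family tags using OpenCV's aruco module (Vuforia is a proprietary SDK and is omitted). This is purely illustrative and not the pipeline used in the paper; it assumes OpenCV >= 4.7 with the contrib aruco module installed, and the test image path and dictionary choices (DICT_4X4_50, DICT_APRILTAG_36h11) are hypothetical.

    # Illustrative sketch (not the authors' pipeline): time 2D marker detection
    # with OpenCV's aruco module, which also ships AprilTag dictionaries.
    # Assumes OpenCV >= 4.7 (opencv-contrib-python); the image path is hypothetical.
    import time
    import cv2

    def detect_and_time(gray, dictionary_id):
        # Build a detector for the given predefined dictionary and measure
        # how long a single detection pass over the grayscale frame takes.
        dictionary = cv2.aruco.getPredefinedDictionary(dictionary_id)
        detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
        start = time.perf_counter()
        corners, ids, _rejected = detector.detectMarkers(gray)
        return ids, time.perf_counter() - start

    frame = cv2.imread("floor_tile.jpg")  # hypothetical photo of a tagged floor tile
    if frame is None:
        raise SystemExit("test image not found")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    for name, dict_id in [("ArUco 4x4_50", cv2.aruco.DICT_4X4_50),
                          ("AprilTag 36h11", cv2.aruco.DICT_APRILTAG_36h11)]:
        ids, elapsed = detect_and_time(gray, dict_id)
        found = 0 if ids is None else len(ids)
        print(f"{name}: {found} marker(s) detected in {elapsed * 1000:.1f} ms")

Running such a loop over many frames captured at different distances, angles, and lighting conditions would yield the kind of detection-rate and detection-time comparison the paper reports for floor-mounted markers.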

Keywords

Indoor localization / Visual markers / Computer vision

Cite this article

Gaetano C. LA DELFA, Salvatore MONTELEONE, Vincenzo CATANIA, Juan F. DE PAZ, Javier BAJO. Performance analysis of visual markers for indoor navigation systems. Front. Inform. Technol. Electron. Eng., 2016, 17(8): 730-740. https://doi.org/10.1631/FITEE.1500324


RIGHTS & PERMISSIONS

© 2016 Zhejiang University and Springer-Verlag Berlin Heidelberg