A high precision visual localization sensor and its working methodology for an indoor mobile robot
(Project supported by the National High-Tech R&D Program (863) of China (No. 2009AA04Z220), the National Natural Science Foundation of China (No. 61375084), and the Key Program of Shandong Provincial Natural Science Foundation, China (No. ZR2015QZ08))

Feng-yu ZHOU, Xian-feng YUAN, Yang YANG, Zhi-fei JIANG, Chen-lei ZHOU

Front. Inform. Technol. Electron. Eng., 2016, 17(4): 365-374. DOI: 10.1631/FITEE.1500272


Abstract

To overcome the shortcomings of existing robot localization sensors, such as low accuracy and poor robustness, a high precision visual localization system based on infrared-reflective artificial markers is designed and described in detail in this paper. First, the hardware system of the localization sensor is developed. Second, we design a novel kind of infrared-reflective artificial marker whose characteristics can be extracted through the acquisition and processing of infrared images. In addition, a confidence calculation method for marker identification is proposed to obtain probabilistic localization results. Finally, autonomous localization of the robot is achieved by computing the relative pose between the robot and the artificial marker using the perspective-3-point (P3P) visual localization algorithm. Numerous experiments and practical applications show that the designed localization sensor system is immune to interference from changes in illumination and observation angle. The precision of the sensor is ±1.94 cm for position localization and ±1.64° for angle localization, fully satisfying the localization precision requirements of an indoor mobile robot.
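To illustrate the P3P-style pose recovery step mentioned above, the Python sketch below shows how the four detected corner points of a single square marker could be converted into a camera pose with OpenCV's P3P solver. This is an illustrative sketch only, not the authors' implementation: the marker size, camera intrinsics, and pixel coordinates are hypothetical placeholder values.

```python
# Hypothetical sketch: marker-relative pose from a P3P-style solver.
# OpenCV's SOLVEPNP_P3P flag requires exactly four point correspondences.
import numpy as np
import cv2

# Physical corner coordinates of an assumed square marker, in metres,
# expressed in the marker's own coordinate frame (placeholder geometry).
MARKER_SIZE = 0.10  # assumed 10 cm marker edge
object_points = np.array([
    [0.0,          0.0,         0.0],
    [MARKER_SIZE,  0.0,         0.0],
    [MARKER_SIZE,  MARKER_SIZE, 0.0],
    [0.0,          MARKER_SIZE, 0.0],
], dtype=np.float64)

# Camera intrinsics, e.g. obtained offline by camera calibration
# (placeholder values, not the sensor described in the paper).
camera_matrix = np.array([
    [800.0,   0.0, 320.0],
    [  0.0, 800.0, 240.0],
    [  0.0,   0.0,   1.0],
], dtype=np.float64)
dist_coeffs = np.zeros(5)  # assume negligible lens distortion

def marker_pose(image_points):
    """Return (R, t) of the marker in the camera frame from 4 pixel corners."""
    ok, rvec, tvec = cv2.solvePnP(
        object_points, image_points, camera_matrix, dist_coeffs,
        flags=cv2.SOLVEPNP_P3P)   # P3P solver; needs exactly 4 points
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)    # rotation vector -> 3x3 rotation matrix
    return R, tvec

# Example: corner pixels of a detected marker (placeholder values).
image_points = np.array([
    [310.0, 228.0],
    [372.0, 231.0],
    [369.0, 293.0],
    [308.0, 290.0],
], dtype=np.float64)

R, t = marker_pose(image_points)
# The robot's position relative to the marker follows by inverting the
# transform; its heading can be read off the rotation matrix.
print("camera position in marker frame:", (-R.T @ t).ravel())
```

In a complete system, the recovered marker-relative pose would then be composed with the marker's known position in the map to obtain the robot's global pose.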

Keywords

Mobile robot / Localization sensor / Visual localization / Infrared-reflective marker / Embedded system

Cite this article

Feng-yu ZHOU, Xian-feng YUAN, Yang YANG, Zhi-fei JIANG, Chen-lei ZHOU. A high precision visual localization sensor and its working methodology for an indoor mobile robot. Front. Inform. Technol. Electron. Eng., 2016, 17(4): 365-374. https://doi.org/10.1631/FITEE.1500272
