Active shape reconstruction using a novel visuotactile palm sensor

Jingyi Hu , Shaowei Cui , Shuo Wang , Rui Wang , Yu Wang

Biomimetic Intelligence and Robotics ›› 2024, Vol. 4 ›› Issue (3): 100167. DOI: 10.1016/j.birob.2024.100167
Research Article

Abstract

Tactile sensing enables high-precision 3D shape perception when vision is limited; however, tactile-based shape reconstruction remains a challenging problem. In this paper, a novel visuotactile sensor, GelStereo Palm 2.0, is proposed to better capture 3D contact geometry. Leveraging the dense tactile point clouds captured by GelStereo Palm 2.0, an active shape reconstruction pipeline is presented that achieves accurate and efficient 3D shape reconstruction of irregular surfaces. GelStereo Palm 2.0 achieves a spatial resolution of 1.5 mm and a reconstruction accuracy of 0.3 mm, and the proposed active shape reconstruction pipeline reaches an accuracy of 2.3 mm within 18 explorations. The proposed method has potential applications in the shape reconstruction of transparent or underwater objects.
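The abstract does not detail the reconstruction pipeline itself, but the general idea of active tactile shape reconstruction can be illustrated with a short sketch: a probabilistic surface model is refit after every touch, and the next contact location is chosen where the model is most uncertain. The NumPy sketch below is an illustrative assumption, not the authors' implementation; the synthetic surface, the touch() patch generator, and all parameters (grid size, kernel length scale, 18 explorations) are hypothetical stand-ins for the sensor's dense tactile point clouds and the real exploration policy.

```python
import numpy as np

# --- Gaussian-process regression in plain NumPy (squared-exponential kernel) ---
def rbf_kernel(A, B, length=0.15, variance=1.0):
    """Squared-exponential covariance between two sets of 2D contact locations."""
    sq_dist = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sq_dist / length**2)

def gp_posterior(X_obs, y_obs, X_query, noise=1e-4):
    """Posterior mean and std of the surface height at X_query, given touched points."""
    K = rbf_kernel(X_obs, X_obs) + noise * np.eye(len(X_obs))
    Ks = rbf_kernel(X_obs, X_query)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_obs))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = rbf_kernel(X_query, X_query).diagonal() - (v**2).sum(0)
    return mean, np.sqrt(np.maximum(var, 0.0))

# --- Hypothetical stand-ins for the unknown object and one tactile "press" ---
def true_surface(xy):
    # an irregular height field playing the role of the unknown object surface
    return 0.05 * np.sin(8 * xy[:, 0]) * np.cos(6 * xy[:, 1]) + 0.02 * xy[:, 0] ** 2

def touch(center, half_width=0.04, n=5):
    # one touch returns a small dense patch of points around the contact centre,
    # mimicking the dense tactile point cloud delivered by a visuotactile palm sensor
    g = np.linspace(-half_width, half_width, n)
    dx, dy = np.meshgrid(g, g)
    pts = center + np.c_[dx.ravel(), dy.ravel()]
    return pts, true_surface(pts)

# --- Active exploration loop: always touch where the model is least certain ---
grid = np.stack(np.meshgrid(np.linspace(-1, 1, 30),
                            np.linspace(-1, 1, 30)), axis=-1).reshape(-1, 2)

X_obs, y_obs = touch(np.zeros(2))          # initial touch at the centre
for _ in range(18):                        # 18 explorations, matching the abstract
    _, std = gp_posterior(X_obs, y_obs, grid)
    next_contact = grid[np.argmax(std)]    # highest predictive uncertainty
    P, z = touch(next_contact)
    X_obs = np.vstack([X_obs, P])
    y_obs = np.concatenate([y_obs, z])

mean, _ = gp_posterior(X_obs, y_obs, grid)
rmse = np.sqrt(np.mean((mean - true_surface(grid)) ** 2))
print(f"surface RMSE after the initial touch and 18 explorations: {rmse:.4f}")
```

Selecting the next contact at the point of maximal posterior variance is one common heuristic for active touch; the pipeline reported in the paper may use a different surface model, acquisition rule, or error metric.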

Keywords

Tactile sensing / Visuotactile sensor / Active shape reconstruction

Cite this article

Jingyi Hu, Shaowei Cui, Shuo Wang, Rui Wang, Yu Wang. Active shape reconstruction using a novel visuotactile palm sensor. Biomimetic Intelligence and Robotics, 2024, 4(3): 100167. DOI: 10.1016/j.birob.2024.100167


CRediT authorship contribution statement

Jingyi Hu: Writing - review & editing, Writing - original draft, Visualization, Validation, Supervision, Software, Resources, Project administration, Methodology, Investigation, Funding acquisition, Formal analysis, Data curation, Conceptualization. Shaowei Cui: Writing - review & editing, Supervision, Funding acquisition, Formal analysis, Conceptualization. Shuo Wang: Writing - review & editing, Supervision, Funding acquisition, Conceptualization. Rui Wang: Writing - review & editing, Supervision, Funding acquisition, Conceptualization. Yu Wang: Funding acquisition.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

This work was supported in part by the National Key Research and Development Program of China (2023YFB4705000), in part by the National Natural Science Foundation of China (62303455, 62273342, and 62122087), and in part by the Beijing Natural Science Foundation (L233006).

