Underwater object detection by fusing features from different representations of sonar data

Fei WANG, Wanyu LI, Miao LIU, Jingchun ZHOU, Weishi ZHANG

Front. Inform. Technol. Electron. Eng., 2023, 24(6): 828-843. DOI: 10.1631/FITEE.2200429
Original Article

Abstract

Modern underwater object detection methods recognize objects from sonar data based on their geometric shapes. However, the distortion of objects during data acquisition and representation is seldom considered. In this paper, we present a detailed summary of representations for sonar data and a concrete analysis of the geometric characteristics of different data representations. On this basis, we propose a feature fusion framework that makes full use of the intensity features extracted from the polar image representation and the geometric features learned from the point cloud representation of sonar data. Three feature fusion strategies are presented to investigate the impact of fusion on different components of the detection pipeline. Moreover, these strategies can be easily integrated into other detectors, such as the You Only Look Once (YOLO) series. The effectiveness of the proposed framework and fusion strategies is demonstrated on a public sonar dataset captured in real-world underwater environments. Experimental results show that our method benefits both the region proposal and the object classification modules in detectors.
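
The fusion idea in the abstract can be made concrete with a short sketch. The following is a minimal, hypothetical PyTorch illustration, not the paper's implementation: a polar intensity image feeds a small CNN (intensity cues), the same sonar frame converted to a point cloud feeds a PointNet-style shared MLP (geometric cues), and the two global feature vectors are concatenated before classification. The helper polar_to_point_cloud, the module FusedSonarClassifier, and all dimensions, thresholds, and sensor parameters are illustrative assumptions.

# A minimal, hypothetical sketch (PyTorch); names and sizes are
# illustrative and are NOT the paper's implementation.
import torch
import torch.nn as nn

def polar_to_point_cloud(polar_img, r_max=30.0, fov_deg=120.0, thresh=0.1):
    """Convert a polar sonar image (range bins x bearing bins of intensities)
    into an (N, 3) set of (x, y, intensity) points, keeping strong returns."""
    n_r, n_b = polar_img.shape
    r = torch.linspace(0.0, r_max, n_r).unsqueeze(1).expand(n_r, n_b)
    theta = torch.deg2rad(
        torch.linspace(-fov_deg / 2, fov_deg / 2, n_b)).unsqueeze(0).expand(n_r, n_b)
    mask = polar_img > thresh  # drop weak returns / background
    x = r[mask] * torch.cos(theta[mask])
    y = r[mask] * torch.sin(theta[mask])
    return torch.stack([x, y, polar_img[mask]], dim=-1)

class FusedSonarClassifier(nn.Module):
    """Late fusion: concatenate a global image feature (intensity cues) with a
    global point-cloud feature (geometric cues) before classification."""
    def __init__(self, img_dim=256, pc_dim=128, n_classes=4):
        super().__init__()
        self.img_backbone = nn.Sequential(   # CNN on the polar intensity image
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, img_dim))
        self.pc_backbone = nn.Sequential(    # PointNet-style shared MLP
            nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, pc_dim))
        self.classifier = nn.Linear(img_dim + pc_dim, n_classes)

    def forward(self, polar_img, points):
        f_img = self.img_backbone(polar_img[None, None])                 # (1, img_dim)
        f_pc = self.pc_backbone(points).max(dim=0, keepdim=True).values  # (1, pc_dim)
        return self.classifier(torch.cat([f_img, f_pc], dim=-1))        # (1, n_classes)

# Usage on a single synthetic frame:
frame = torch.rand(128, 96)  # 128 range bins x 96 bearing bins
logits = FusedSonarClassifier()(frame, polar_to_point_cloud(frame))

Moving the concatenation point earlier (e.g., into a region proposal network) or keeping it in the classification head alone gives different fusion strategies of the kind the abstract describes; this sketch shows only the late-fusion variant.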

Keywords

Underwater object detection / Sonar data representation / Feature fusion

Cite this article

Fei WANG, Wanyu LI, Miao LIU, Jingchun ZHOU, Weishi ZHANG. Underwater object detection by fusing features from different representations of sonar data. Front. Inform. Technol. Electron. Eng., 2023, 24(6): 828-843. https://doi.org/10.1631/FITEE.2200429

RIGHTS & PERMISSIONS

© 2023 Zhejiang University Press