Paper ID: 2203.16988

Acoustic-Net: A Novel Neural Network for Sound Localization and Quantification

Guanxing Zhou, Hao Liang, Xinghao Ding, Yue Huang, Xiaotong Tu, Saqlain Abbas

Acoustic source localization has been applied in different fields, such as aeronautics and ocean science, and generally relies on multi-microphone array data to reconstruct the source location. However, conventional model-based beamforming methods struggle to produce high-resolution source maps. Deep neural networks are also suitable for locating sound sources, but such methods typically rely on complex network structures that are difficult to deploy on hardware. In this paper, a novel neural network, termed the Acoustic-Net, is proposed to locate and quantify a sound source directly from the raw microphone signals. Experiments demonstrate that the proposed method significantly improves both the accuracy of sound source prediction and the computing speed, and may generalize well to real data. The code and trained models are available at https://github.com/JoaquinChou/Acoustic-Net.
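As a rough illustration of the input/output setup described above (raw multi-channel array signals in, source location and strength out), the following is a minimal PyTorch-style sketch. The microphone count, layer sizes, and output heads are placeholders for demonstration and do not reproduce the authors' Acoustic-Net architecture.

```python
# Illustrative sketch only (NOT the authors' Acoustic-Net): a small CNN that
# maps raw multi-channel microphone signals to a 2-D source position and a
# scalar source strength. All sizes below are assumptions.
import torch
import torch.nn as nn

class ToySoundLocalizer(nn.Module):
    def __init__(self, n_mics: int = 56, signal_len: int = 8192):
        super().__init__()
        # 1-D convolutions over the time axis; microphones act as input channels
        self.encoder = nn.Sequential(
            nn.Conv1d(n_mics, 64, kernel_size=7, stride=2, padding=3),
            nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global average pooling over time
        )
        self.loc_head = nn.Linear(128, 2)    # predicted (x, y) source position
        self.level_head = nn.Linear(128, 1)  # predicted source strength (quantification)

    def forward(self, signals: torch.Tensor):
        # signals: (batch, n_mics, signal_len) raw time-domain samples
        feat = self.encoder(signals).squeeze(-1)
        return self.loc_head(feat), self.level_head(feat)

if __name__ == "__main__":
    model = ToySoundLocalizer()
    x = torch.randn(4, 56, 8192)   # dummy batch of raw array recordings
    loc, level = model(x)
    print(loc.shape, level.shape)  # torch.Size([4, 2]) torch.Size([4, 1])
```

The point of the sketch is the end-to-end mapping from unprocessed signals to location and strength, without an intermediate beamforming map; the actual network design, loss functions, and training data are described in the paper and the linked repository.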

Submitted: Mar 31, 2022