Acoustic Self-Awareness of Autonomous Systems in a World of Sounds

2020 
Autonomous systems (ASs) operating in real-world environments are exposed to a plurality and diversity of sounds that carry a wealth of information for perception in cognitive dynamic systems. While the importance of the acoustic modality for humans as "ASs" is obvious, it is investigated to what extent current technical ASs operating in scenarios filled with airborne sound exploit this potential for supporting self-awareness. As a first step, the state of the art of relevant generic techniques for acoustic scene analysis (ASA) is reviewed, i.e., source localization and the various facets of signal enhancement, including spatial filtering, source separation, noise suppression, dereverberation, and echo cancellation. Then, a comprehensive overview of current techniques for ego-noise suppression, as a specific additional challenge for ASs, is presented. Not only generic methods for robust source localization and signal extraction but also specific models and estimation methods for ego-noise based on various learning techniques are discussed. Finally, active sensing is considered with its unique potential for ASA and, thus, for supporting self-awareness of ASs. To this end, recent techniques for binaural listening exploiting head motion, for active localization and exploration, and for active signal enhancement are presented, with humanoid robots as typical platforms. Underlining the multimodal nature of self-awareness, links to other modalities and nonacoustic reference information are pointed out where appropriate.
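As a minimal illustration of the source localization theme mentioned above, the sketch below estimates the time difference of arrival (TDOA) between two microphone signals with GCC-PHAT, a standard generic ASA building block; it is not taken from the paper, and the function and signal names (`gcc_phat`, `mic1`, `mic2`) are illustrative assumptions.

```python
import numpy as np

def gcc_phat(sig, ref, fs, max_tau=None):
    """Estimate the delay of `sig` relative to `ref` (in seconds) via GCC-PHAT."""
    n = sig.shape[0] + ref.shape[0]
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    # Cross-power spectrum with phase-transform (PHAT) weighting.
    R = SIG * np.conj(REF)
    R /= np.abs(R) + 1e-12
    cc = np.fft.irfft(R, n=n)
    max_shift = n // 2
    if max_tau is not None:
        max_shift = min(int(fs * max_tau), max_shift)
    # Rearrange so that zero lag sits in the middle of the search window.
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / float(fs)

# Two-microphone toy example: the source arrives 8 samples (0.5 ms) later at mic 2.
fs = 16000
src = np.random.randn(int(0.1 * fs))
delay_samples = 8
mic1 = src
mic2 = np.concatenate((np.zeros(delay_samples), src[:-delay_samples]))
tau = gcc_phat(mic2, mic1, fs)
print(f"Estimated TDOA: {tau * 1e3:.2f} ms")  # approximately 0.5 ms
```

With a known microphone spacing, such a TDOA estimate can be converted into a direction-of-arrival estimate, which is one entry point to the localization and spatial filtering techniques surveyed in the paper.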