Underwater Collision Avoidance

Project Title

Neuromorphic Audio-Visual Scene Analysis for Underwater Collision Avoidance

Project Timeline: October 2021 - June 2023
Researchers: Dr Saeed Afshar, Dr Ying Xu
Partners/Collaborators:

Project Synopsis

To develop a proof-of-concept for a neuromorphic underwater audio-visual collision avoidance system for the autonomous or uncrewed underwater vehicles (UUVs) currently being developed by the Royal Australian Navy.

Project Details

In this project, we propose to design and build a neuromorphic audio-visual sensor-processor system that can be fitted to a UUV such as the DSTG Remus to provide the detection and ranging needed for ship-strike avoidance. The neuromorphic system will integrate data from current state-of-the-art event-based vision sensors and our in-house event-based cochlea systems to perform real-time, power-efficient audio and visual scene analysis.

In the audio domain, with input from partners in the DSTG maritime division, we will demonstrate the potential for neuromorphic signal processing to determine the range and bearing of sound sources using a multi-hydrophone array more efficiently and robustly than standard algorithms.

In the visual domain, we will demonstrate the potential for vision, and particularly the high speed and dynamic range of event-based Dynamic Vision Sensors (DVS), above and below the water surface to augment auditory scene analysis, especially over shorter distances. For our exploration of multi-modal underwater vision and audition, we will design and build underwater housings for the event-based sensors and temporally synchronise them with the auditory sensing system. The DVS used will be the highly sensitive fourth-generation Prophesee cameras, with which our lab already has significant experience. The vision sensor will be mounted with a fisheye lens able to image up through the water column.
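To give a concrete sense of the bearing-estimation task the hydrophone array addresses, the sketch below shows a conventional baseline: time-difference-of-arrival (TDOA) estimation for a single hydrophone pair via cross-correlation, converted to a bearing angle. This is an illustrative standard algorithm of the kind the neuromorphic approach is intended to improve on, not the project's own method; the function and parameter names are our own, and the signals are synthetic.

```python
import numpy as np

def bearing_from_pair(sig_a, sig_b, fs, spacing, c=1500.0):
    """Estimate the bearing of a far-field source from one hydrophone pair.

    Cross-correlates the two signals to find the time difference of
    arrival (TDOA), then converts it to a bearing angle (degrees)
    relative to the array broadside. `c` is a nominal speed of sound
    in seawater (m/s); `spacing` is the hydrophone separation (m).
    """
    n = len(sig_a)
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (n - 1)     # delay of sig_a vs sig_b, in samples
    tau = lag / fs                      # TDOA in seconds
    # Clip to the physically valid range before taking arcsin.
    s = np.clip(c * tau / spacing, -1.0, 1.0)
    return np.degrees(np.arcsin(s))

# Synthetic check: a chirp arriving at two hydrophones 1 m apart,
# sampled at 96 kHz, from a source at a known bearing.
fs, d, c = 96_000, 1.0, 1500.0
t = np.arange(0, 0.05, 1 / fs)
src = np.sin(2 * np.pi * (500 + 4000 * t) * t)
true_bearing = 20.0                     # degrees
delay = int(round(d * np.sin(np.radians(true_bearing)) / c * fs))
sig_a = np.pad(src, (delay, 0))[: len(src)]   # delayed copy at hydrophone A
sig_b = src
est = bearing_from_pair(sig_a, sig_b, fs, d, c)
```

With sample-quantised delays the recovered bearing lands within about a degree of the true 20 degrees; a real system would need interpolation around the correlation peak, multiple pairs for an unambiguous 3D bearing, and robustness to multipath, which is where event-driven processing is expected to help.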