Environmental Situational Awareness using Neuromorphic Vision Sensors and IMU-based SLAM
Supervisors:
Primary supervisor: Dr Saeed Afshar
Description:
Human eyes perform several rapid, jerk-like movements of the eyeball, called saccades, every second. Each saccadic motion generates a large shift in the visual field, yet humans and many other animals can reduce their visual perception’s sensitivity to the sudden displacement of visual stimuli through a process called Saccadic Suppression of Displacement (SSD). Biological visual perception can thus remain stable even with the ego-motion of the body and head.

The working principle of biological SSD is still an open question, but many modern signal processing approaches have been applied to perform Simultaneous Localisation and Mapping (SLAM) using conventional vision sensors. Visual SLAM methods can efficiently compute the ego-motion of the vision sensor and localise it in a given environment, which allows a stable projection of the visual stimuli onto a 3D perception of the surrounding environment. Neuromorphic sensors can capture visual stimuli with higher temporal precision and dynamic range than conventional cameras, and visual SLAM with neuromorphic vision sensors is an active area of research that has been successfully applied to the localisation and mapping of robots in an environment.

In this project, we plan to combine visual SLAM methods with active saccadic motions tracked by an IMU inside the sensor to generate a stable representation of the sensor’s surroundings, and to actively monitor that environment for changes. Building on the space situational awareness methods previously developed at ICNS for star mapping, we will extend them to challenging environments with complex backgrounds.
Figure 2: Indoor visual SLAM using conventional image sensors.
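As a concrete illustration of the IMU-stabilisation idea described above, the following is a minimal sketch of gyroscope-based rotational motion compensation for an event stream: assuming pure rotation at a constant angular velocity over a short event window and known camera intrinsics, each event is warped back to a common reference time. The function and variable names (`so3_exp`, `compensate_rotation`, `K_cam`, `gyro_w`) are hypothetical, and a full event-based SLAM pipeline would additionally estimate translation and scene depth.

```python
import numpy as np

def so3_exp(w):
    """Rodrigues' formula: rotation vector (rad) -> 3x3 rotation matrix."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def compensate_rotation(events, gyro_w, K_cam):
    """Warp events back to the reference time t = 0.

    events: (N, 3) array of (x, y, t), with t in seconds since window start.
    gyro_w: (3,) angular velocity in rad/s (assumed constant over the window;
            the sign depends on the gyro/camera frame convention).
    K_cam:  (3, 3) camera intrinsic matrix.
    """
    K_inv = np.linalg.inv(K_cam)
    warped_xy = np.empty((len(events), 2))
    for i, (x, y, t) in enumerate(events):
        R = so3_exp(gyro_w * t)              # rotation accrued since t = 0
        ray = K_inv @ np.array([x, y, 1.0])  # back-project pixel to a ray
        p = K_cam @ (R @ ray)                # rotate ray back, re-project
        warped_xy[i] = p[:2] / p[2]
    return warped_xy
```

Accumulating the warped coordinates into an image should produce sharper edges than accumulating the raw events, which is the usual sanity check for this kind of compensation in the event-camera literature.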
Outcomes:
This project will investigate neuromorphic vision sensors for Simultaneous Localisation and Mapping of the surrounding environment. It will involve developing algorithms that process data from an actively moving event-based camera, together with its motion sensors, to generate a map of the surroundings while simultaneously detecting changes in the environment.
- Develop an algorithm that maps event-based data to the location of the sensor using IMU data. This involves tracking the trajectory and ego-motion of the sensor and estimating the resulting shifts in the visual receptive field.
- Investigate event-based representations of object-generated stimuli that are invariant to the motion of the sensor, and detect changes in the environment by matching these representations (see the sketch after this list).
- A working prototype of the setup to map real-world scenarios, including an investigation of the effects of occlusions and changing lighting conditions in the environment.
- Future work could involve using the generated map for tasks that depend on changes in the surrounding environment, and investigating the generation of event data at the level of the 3D environment.
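Returning to the second outcome above, one simple way to build a representation for change detection is an exponentially decaying time surface computed from motion-compensated event coordinates, compared across visits with a normalised cross-correlation; a low score flags a change. This is only a hedged sketch of one possible approach, not the project’s specification: `time_surface`, `change_score`, and the decay constant `tau` are illustrative choices.

```python
import numpy as np

def time_surface(xy, t, shape, tau=0.05):
    """Exponentially decaying time surface over (motion-compensated) events.

    xy:    (N, 2) pixel coordinates, events assumed sorted by time.
    t:     (N,) timestamps in seconds.
    shape: (height, width) of the output surface.
    tau:   decay constant in seconds (illustrative value).
    """
    last = np.full(shape, -np.inf)
    for (x, y), ti in zip(xy.astype(int), t):
        if 0 <= y < shape[0] and 0 <= x < shape[1]:
            last[y, x] = ti              # keep the most recent event per pixel
    surface = np.zeros(shape)
    valid = np.isfinite(last)
    surface[valid] = np.exp(-(t[-1] - last[valid]) / tau)
    return surface

def change_score(surf_a, surf_b):
    """Normalised cross-correlation of two surfaces; low values suggest change."""
    a = (surf_a - surf_a.mean()) / (surf_a.std() + 1e-9)
    b = (surf_b - surf_b.mean()) / (surf_b.std() + 1e-9)
    return float((a * b).mean())
```

In combination with the earlier sketch, `time_surface(compensate_rotation(events, gyro_w, K_cam), events[:, 2], (480, 640))` would give a motion-stabilised surface for each visit to a scene.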
Eligibility criteria:
Experience with C++, Python, MATLAB, or equivalent languages for developing and testing algorithms; experience with computer vision algorithms; and a strong mathematical background.