Environmental Situational Awareness using Neuromorphic Vision Sensors and IMU-based SLAM

Supervisors:

Primary supervisor Dr Saeed Afshar

Description:

Human eyes perform several rapid, jerk-like movements of the eyeball, called saccades, every second. Each saccade produces a large shift in the visual field, yet humans and many other animals reduce their perceptual sensitivity to these sudden displacements of visual stimuli through a process called Saccadic Suppression of Displacement (SSD). Biological visual perception therefore remains stable despite the ego-motion of the body and head.

The working principle of biological SSD is still an open question, but many modern signal-processing approaches have been applied to perform Simultaneous Localisation and Mapping (SLAM) using conventional vision sensors. Visual SLAM methods can efficiently estimate the ego-motion of the vision sensor and localise it in a given environment, which allows a stable projection of the visual stimuli onto a 3D representation of the surroundings.

Neuromorphic vision sensors capture visual stimuli with higher temporal precision and dynamic range than conventional cameras. Visual SLAM using neuromorphic vision sensors is an active area of research and has been successfully applied to localise and map robots in an environment. In this project, we plan to combine visual SLAM methods with active saccadic motions, tracked by the IMU built into the sensor, to generate a stable representation of the sensor's surroundings and to actively monitor that environment for changes. Building on the space situational awareness methods previously developed at ICNS for star mapping, we will extend them to challenging environments with complex backgrounds.

Figure 2: Indoor visual SLAM using conventional image sensors.
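To illustrate the kind of IMU-aided stabilisation described above, the sketch below warps event pixel coordinates back to a common reference time using the angular velocity reported by the sensor's gyroscope, assuming pure camera rotation and a small-angle approximation. This is a minimal illustrative sketch, not the project's method; the function name `compensate_rotation` and the event layout `(x, y, t)` are assumptions for the example.

```python
import numpy as np

def compensate_rotation(events, omega, K, K_inv, t_ref):
    """Warp event pixel coordinates to a common reference time t_ref,
    assuming the camera undergoes pure rotation measured by the IMU gyro.

    events : (N, 3) float array of (x, y, t), t in seconds
    omega  : (3,) angular velocity in rad/s (camera frame)
    K, K_inv : 3x3 camera intrinsic matrix and its inverse
    """
    xy = events[:, :2]
    dt = events[:, 2] - t_ref
    # Back-project each pixel to a normalized viewing ray (3 x N).
    rays = K_inv @ np.column_stack([xy, np.ones(len(xy))]).T
    warped = np.empty_like(xy)
    for i in range(len(xy)):
        # Small-angle rotation over dt: R(t) ~ I + [omega*dt]_x,
        # so undoing it applies I - [omega*dt]_x to the ray.
        theta = omega * dt[i]
        Wx = np.array([[0.0, -theta[2], theta[1]],
                       [theta[2], 0.0, -theta[0]],
                       [-theta[1], theta[0], 0.0]])
        r = (np.eye(3) - Wx) @ rays[:, i]
        # Re-project the rotated ray back to pixel coordinates.
        p = K @ r
        warped[i] = p[:2] / p[2]
    return warped
```

With zero angular velocity the warp is the identity, which gives a quick sanity check; a full system would also estimate translation, which rotation-only compensation cannot remove.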

Outcomes:

This project will investigate neuromorphic vision sensors for Simultaneous Localisation and Mapping of the surrounding environment. The project will involve developing algorithms to process data from an actively moving event-based camera with motion sensors to generate a map of the surroundings and simultaneously detect changes in the environment.
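One simple baseline for the change-detection part of such a pipeline is to accumulate stabilised events into per-pixel count images over two time windows and flag pixels whose activity differs markedly. The sketch below assumes this thresholded-difference approach; the function names and the threshold value are illustrative, not part of the project specification.

```python
import numpy as np

def event_count_image(events, shape):
    """Accumulate events given as integer (x, y) pairs into a
    per-pixel count image of the given (height, width) shape."""
    img = np.zeros(shape, dtype=np.float64)
    # np.add.at handles repeated pixel indices correctly.
    np.add.at(img, (events[:, 1], events[:, 0]), 1)
    return img

def changed_pixels(events_a, events_b, shape, thresh=5):
    """Flag pixels whose event count differs by more than `thresh`
    between two motion-stabilised time windows."""
    diff = np.abs(event_count_image(events_a, shape)
                  - event_count_image(events_b, shape))
    return diff > thresh
```

In practice the windows would first be motion-compensated so that only genuine scene changes, not ego-motion, drive the per-pixel differences.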

Eligibility criteria:

Experience with C++, Python, MATLAB, or an equivalent language for developing and testing algorithms; experience with computer vision algorithms; and a strong mathematical background.
