Astrometry with Event-based Vision Sensors
Supervisors:
Primary supervisor: Prof André van Schaik
Description:
ICNS is a world leader in the use of event-based vision sensors for space applications, from our sensors on the International Space Station to our Astrosite observatories that track satellites and space junk. For many such applications it is useful to determine where the sensor is pointing from the pattern of stars it observes. While this problem has been well studied for conventional cameras, it is not yet known how to solve it optimally when observing with event-based vision sensors. This project will adapt existing methods for use with event-based data, as well as develop novel methods designed specifically for event-based data.
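As a toy illustration of the star-identification ("lost in space") step described above, the sketch below matches a simulated observation against a small synthetic 2D star catalog using triangle side lengths as rotation- and translation-invariant features. All names, coordinates, and tolerances here are illustrative assumptions, not the project's actual method; real astrometry works with celestial coordinates, noisy detections, and large catalogs.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Hypothetical 2D "catalog": projected positions of a handful of bright stars.
catalog = rng.uniform(0, 100, size=(12, 2))

# Simulate an observation: a rotated, translated subset of the catalog stars,
# as if the (unknown) sensor pointing selected and transformed this patch of sky.
idx = rng.choice(12, size=6, replace=False)
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
observed = catalog[idx] @ R.T + np.array([5.0, -3.0])

def triangle_signature(pts, tri):
    """Side lengths opposite each vertex, with vertices reordered to match."""
    i, j, k = tri
    opp = np.array([np.linalg.norm(pts[j] - pts[k]),   # side opposite vertex i
                    np.linalg.norm(pts[i] - pts[k]),   # side opposite vertex j
                    np.linalg.norm(pts[i] - pts[j])])  # side opposite vertex k
    order = np.argsort(opp)
    return opp[order], np.array(tri)[order]

def match_stars(catalog, observed, tol=1e-6):
    """Vote for star correspondences via matching triangle signatures."""
    votes = np.zeros((len(observed), len(catalog)), dtype=int)
    cat_tris = [triangle_signature(catalog, t)
                for t in combinations(range(len(catalog)), 3)]
    for obs_tri in combinations(range(len(observed)), 3):
        sig_o, verts_o = triangle_signature(observed, obs_tri)
        for sig_c, verts_c in cat_tris:
            if np.allclose(sig_o, sig_c, atol=tol):
                # Sorted side lengths pair up corresponding vertices.
                votes[verts_o, verts_c] += 1
    return votes.argmax(axis=1)  # best catalog index for each observed star
```

Because rotation plus translation is an isometry, corresponding triangles have identical side lengths, so the correct correspondences accumulate the most votes. With event-based data, an extra front-end step would be needed to turn asynchronous events into star detections before any such matching.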
Eligibility criteria:
Expertise in Python or C++ for software simulation, and strong skills in algorithm development.