Self Organising Maps on Flying Drone
Development of event-based Self Organising Maps from data recorded on a flying drone.
This video was recorded with a drone flying in Telluride, CO. The raw events from the sensor are shown on an exponentially decaying time surface (bottom left), with ON events in yellow and OFF events in blue. An event-based Self Organising Map of feature detectors was trained on the data using our FEAST algorithm. The top left shows the feature detectors after training. When a new event is received from the camera, it is added to the time surface, and a local region (11x11 pixels) of the time surface around the event is sent to the feature detectors. The activation of these features is shown on a colour-coded surface (right), illustrating the segregation of the event stream into a smoothly transitioning feature space. This process is illustrated in the figure below.
Resources
- Introduction to the International Centre for Neuromorphic Systems (ICNS)
- Full Lunar Eclipse through a Neuromorphic Event-based Camera
- Rocket launch from a Neuromorphic Camera - Events vs. Frames
- Research Week 2020 Astrosite
- Research Week 2020 Neuromorphic Engineering
- Fusion of Multiple Sensors
- Simultaneous Sky Mapping and Satellite Tracking
- Image Segregation
- Self Organising Maps on Flying Drone
- Seeing the Moon
- Modelling the Human Auditory System