Neuromorphic Perception

Neuromorphic sensors emulate the perceptual power of biological sensors via two critical principles:

  • First, neuromorphic sensors efficiently encode sensory information into time itself. By converting a highly detailed sensory environment into spikes, bio-inspired sensors use the relative timing of spikes and/or their rate to encode a wide range of information, trading precision, reliability, speed, and bandwidth against one another to suit the application's requirements.
  • The second way neuromorphic sensors emulate the efficiency of biology is by encoding change. Conventional man-made sensors seek to continuously and reliably encode the absolute value of their inputs at the maximum possible resolution, regardless of how much data is generated. Biological sensors cannot afford to waste their sensory and processing hardware on redundant information: because living things live and die by how quickly and energy-efficiently they respond to their unpredictable environment, their sensors report only change, since sensory change is the signal to initiate or modify behaviour. By likewise encoding only sensory change, bio-inspired sensors drastically reduce the information reported to higher levels of processing and keep the system focussed on responding quickly to behaviourally relevant stimuli, at the expense of sensory fidelity and resolution (see the sketch after this list).
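
As a minimal illustration of change encoding, the Python sketch below (with an invented change_encoder function and an arbitrary threshold) models a single event-camera-style pixel: it emits an ON or OFF event only when the log intensity has moved by more than a fixed threshold since the last event, so a static input generates no output at all.

    import math

    def change_encoder(samples, threshold=0.1):
        """Emit (time, polarity) events when the log intensity moves by
        more than `threshold` since the last event; static input is silent."""
        events = []
        ref = math.log(samples[0])           # reference level at last event
        for t, value in enumerate(samples[1:], start=1):
            delta = math.log(value) - ref
            while abs(delta) >= threshold:   # large steps fire several events
                polarity = 1 if delta > 0 else -1
                events.append((t, polarity))
                ref += polarity * threshold
                delta = math.log(value) - ref
        return events

    # A constant signal produces no events; a step change produces a burst.
    print(change_encoder([1.0] * 5 + [2.0] * 5))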

At ICNS, we are active in the four main sensory domains of bio-inspired perception: visual, auditory, olfactory, and tactile.

Visual Perception

Biological vision systems are remarkably efficient, robust, and capable of operating over an incredible range of light intensities. Neuromorphic event-based cameras, or silicon retinas, seek to replicate these advantages by operating in the same event-based, independent-pixel domain. At ICNS we focus on the development and use of silicon retinas for dynamic, real-world, low-power, bandwidth-constrained applications where neuromorphic sensing has an in-built advantage over more established imaging technologies. Applications such as drone sensors and mobile environmental sensors for remote locations inherently share the energy, information, and speed constraints of biological organisms, which must respond rapidly, robustly, and efficiently to a noisy, ambiguous environment in order to survive.
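
To make the event-based, independent-pixel domain concrete: each pixel reports events as (timestamp, x, y, polarity) tuples, and downstream algorithms often summarise them with an exponentially decaying "time surface" rather than frames. The Python sketch below uses arbitrary decay and array sizes and is not any particular ICNS pipeline; it simply shows the idea.

    import numpy as np

    TAU = 0.05  # decay constant in seconds (illustrative)

    def time_surface(events, width, height, t_now):
        """Map each pixel's most recent event time to a value in (0, 1]:
        fresh activity is near 1, stale or absent activity is near 0."""
        last = np.full((height, width), -np.inf)
        for t, x, y, polarity in events:     # polarity ignored in this sketch
            last[y, x] = t
        return np.exp((last - t_now) / TAU)

    # Three events; the most recently active pixels dominate the surface.
    events = [(0.00, 1, 1, 1), (0.02, 2, 1, -1), (0.04, 2, 2, 1)]
    print(time_surface(events, width=4, height=4, t_now=0.05).round(3))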

See also our work on space situational awareness with our Astrosite mobile observatory.

Auditory Perception

Humans and other mammals process their auditory environment using a remarkable neuro-mechanical sensor called the cochlea. Named after the ancient Greek word for snail shell, this compact spiral bony structure ingeniously converts pressure waves in the air into vibrations on a semi-rigid elastic membrane, and then into neural spike signals, amplifying and segregating the audio frequency components along the way while preserving the high-precision temporal information that is critical to sound localisation and auditory scene analysis. This simultaneous spectro-temporal segregation is a challenging and computationally expensive stage in conventional audio processing, yet it is performed naturally by the physical geometry of the cochlea. At ICNS, by developing efficient hardware implementations of artificial cochleas inspired by biology, we are able to replicate the remarkable performance and efficiency of biological auditory perception, while at the same time gaining key insights into how humans perceive their auditory environment.
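
A toy software cochlea helps illustrate this spectro-temporal segregation. In the Python sketch below (assuming scipy is available; the centre frequencies, bandwidths, filter order, and threshold are illustrative choices, not those of our hardware), a bank of bandpass filters splits a signal into frequency channels, and positive-going threshold crossings in each channel become spike times, preserving fine temporal structure per channel.

    import numpy as np
    from scipy.signal import butter, lfilter

    FS = 16000  # sample rate in Hz

    def toy_cochlea(signal, centres=(500, 1000, 2000, 4000), threshold=0.05):
        """Split `signal` into bandpass channels and return, per channel,
        the times of positive-going threshold crossings (spike times)."""
        spikes = {}
        for fc in centres:
            b, a = butter(2, [0.7 * fc, 1.4 * fc], btype="bandpass", fs=FS)
            y = lfilter(b, a, signal)
            up = np.where((y[:-1] < threshold) & (y[1:] >= threshold))[0]
            spikes[fc] = up / FS             # spike times in seconds
        return spikes

    # A 1 kHz tone drives mainly the 1 kHz channel.
    t = np.arange(0, 0.05, 1 / FS)
    tone = 0.5 * np.sin(2 * np.pi * 1000 * t)
    for fc, times in toy_cochlea(tone).items():
        print(fc, "Hz:", len(times), "spikes")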

The human auditory system not only locates and recognises individual sound sources but also interprets complex auditory environments composed of many dynamic sources and their reflections off the physical environment. Yet complex auditory scene analysis problems such as the 'cocktail party problem' remain extremely difficult for state-of-the-art conventional audio processing algorithms. In a cocktail party situation, a human listener can attend to one group's conversation with ease while tuning out sounds from many other conversations. A key element enabling this ease of segregation in the biological system is the preservation of precise temporal information in the spiking output of the human cochlea.
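
The value of that temporal precision is easy to demonstrate with a toy interaural time difference (ITD) estimate: cross-correlating the signals arriving at the two ears recovers the microsecond-scale delay the brain uses to infer source direction. The delay, sample rate, and numpy-based correlation below are purely illustrative.

    import numpy as np

    FS = 48000                       # sample rate in Hz
    rng = np.random.default_rng(0)

    # The left ear hears a noise burst; the right ear hears it ~292 us later.
    delay = 14                       # samples; about 292 us at 48 kHz
    source = rng.standard_normal(2048)
    left = source
    right = np.concatenate([np.zeros(delay), source])[: len(source)]

    # The cross-correlation peaks at the interaural time difference.
    corr = np.correlate(right, left, mode="full")
    lag = np.argmax(corr) - (len(left) - 1)   # positive: right lags left
    print("estimated ITD:", lag / FS * 1e6, "microseconds")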

By operating in the same spiking, event-based domain and preserving timing information, our neuromorphic auditory sensors allow us to develop the superior processing techniques used in the brain for a host of recognition, localisation, and auditory scene analysis problems that have hitherto remained in the technological too-hard basket.

The event-based output of our auditory sensors forms a central foundation for our investigation of neural processing. Due to their unique informational properties and distinct real-world applications, neuromorphic auditory sensors reveal entirely new neuro-computational problems and motivate novel neuro-computational solutions and principles.

Tactile Perception

The tactile stream of ICNS research focusses on decoding spiking signals recorded from human peripheral nerves to understand their utility as both sensors and control signals for neuroprosthetic devices. Our aim is to return motor function to people using neuroprosthetics, enabling greater function and improved quality of life. By utilising the subject's intact somatosensory system as a source of sensory and control signals, we aim to develop a closed-loop control scheme for interfacing the nervous system with neuroprosthetic devices.

We use microneurography to record electroneurographic signals generated by mechanoreceptors in control subjects. We then perform neural processing on these signals online, using spiking neural networks to map the body's signals to the desired motor action.
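
As a conceptual sketch only (not our actual decoding pipeline; the channel weights, time constant, and class name are invented for illustration), the Python snippet below shows one way an online spiking decoder can work: each incoming afferent spike updates an exponentially decaying per-channel rate estimate, and a fixed linear readout turns those rates into a continuous motor command.

    import math

    class OnlineRateDecoder:
        """Map spike events from a few afferent channels to one control
        value, updating a decaying rate estimate at each event."""

        def __init__(self, weights, tau=0.1):
            self.weights = weights           # illustrative linear readout
            self.tau = tau                   # rate decay constant, seconds
            self.rates = [0.0] * len(weights)
            self.t_last = 0.0

        def on_spike(self, t, channel):
            decay = math.exp(-(t - self.t_last) / self.tau)
            self.rates = [r * decay for r in self.rates]
            self.rates[channel] += 1.0
            self.t_last = t
            # Weighted sum of channel activities -> motor command.
            return sum(w * r for w, r in zip(self.weights, self.rates))

    decoder = OnlineRateDecoder(weights=[0.8, -0.3, 0.5])
    for t, ch in [(0.01, 0), (0.02, 0), (0.03, 2), (0.05, 1)]:
        print(f"t={t:.2f}s command={decoder.on_spike(t, ch):+.2f}")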

In addition, building on our initial experiments with human subjects, we are developing a neuromorphic spiking tactile (tactomorphic) sensor system intended to serve as a platform for further experimentation.

Through the design and fabrication of tactomorphic sensors and the development of event-based algorithms which process their output, we gain insights into how the human body makes sense of its tactile environment.