Sensors

Conventional sensors: dumb and wasteful

Conventional electronic sensors (e.g. cameras, microphones, or pressure detectors) collect data continuously and indiscriminately at high resolution. They generate huge amounts of redundant or unimportant information, placing an enormous burden on processors and consuming excessive power.

Neuromorphic sensors: naturally smart and efficient

To address these problems – and to answer fascinating questions about how our sensory organs work (and why sometimes they don’t, and what can be done about that) – ICNS researchers create electronic sensors that mimic the smart, robust, and efficient sensors of living organisms. Biological sensors (e.g. eyes, ears, and skin) achieve amazing efficiency in two main ways:

  1. They reduce data load by capturing only important information at high resolution: they focus on any change in their surroundings that requires a response. For example, background chatter in a café can be safely ignored, but the fire alarm demands attention and action.
  2. They transmit maximum information in minimum time through efficient coding. When sensory neurons (nerve cells) are stimulated, they send information to the brain or other parts of the nervous system as spikes of electrical voltage. The spikes vary on a millisecond timescale, and both their rate and their precise timing can carry information. The pattern of spikes over time creates a temporal code that carries large amounts of sensory information and an in-built time-stamp with great efficiency; a toy example of such a code is sketched below.
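
To make the temporal-coding idea concrete, here is a minimal sketch in Python. It is purely illustrative (the function names and the 50 ms window are our assumptions, not ICNS code): a stimulus intensity is encoded as the latency of a single spike, so the spike’s timing itself carries the value.

    import numpy as np

    def encode_latency(intensity, t_max=50.0):
        """Encode a stimulus intensity in [0, 1] as a spike time in ms.
        Stronger stimuli fire earlier (a latency code)."""
        return t_max * (1.0 - intensity)

    def decode_latency(spike_time, t_max=50.0):
        """Recover the stimulus intensity from the spike's timing alone."""
        return 1.0 - spike_time / t_max

    stimuli = np.array([0.9, 0.5, 0.1])   # three stimulus intensities
    times = encode_latency(stimuli)       # ms after stimulus onset
    print(times)                          # [ 5. 25. 45.]
    print(decode_latency(times))          # [0.9 0.5 0.1]

Real neurons combine rate and timing codes and must cope with noise, but the principle stands: one precisely timed spike can transmit an analogue value using almost no data.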

In similar ways, our neuromorphic sensors achieve high performance with low power consumption. ICNS researchers are working on neuromorphic systems in four critical sensory domains: visual, auditory, tactile, and olfactory perception.

Visual Perception

If you use a conventional video camera, recording at 30-60 frames per second, to film a rapid event like a car crash, you’ll miss important parts of the action that happen between frames while capturing lots of unimportant background detail: you’re simultaneously under- and over-sampling. For computer analysis, the data is at once inadequate and overwhelming. Conventional engineers solve this problem by increasing the camera’s frame rate and using power-hungry processors to analyse the huge amount of visual data captured. This approach is wasteful, slow, and especially impractical for mobile devices, robots, or vehicles. Mobile applications face similar constraints to living organisms, so neuromorphic engineers solve the problem of capturing important information by mimicking biological vision systems.
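
A back-of-the-envelope comparison makes the waste concrete. All numbers below are assumed round figures, not the specifications of any particular camera:

    # Rough data-rate comparison (illustrative, assumed round numbers).
    frame_w, frame_h, fps, bytes_per_px = 1920, 1080, 60, 1
    frame_bytes_per_s = frame_w * frame_h * fps * bytes_per_px
    # ~124 MB/s, produced whether or not anything in the scene changes

    events_per_s = 300_000   # a busy scene for an event-driven sensor (assumed)
    bytes_per_event = 8      # x, y, timestamp, polarity (assumed packing)
    event_bytes_per_s = events_per_s * bytes_per_event    # ~2.4 MB/s

    print(frame_bytes_per_s / event_bytes_per_s)          # ~52x less data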

Inspired by biology, ICNS engineers design silicon retinas (artificial eyes) with built-in intelligence: in response to changes in the light it receives, each neuron (one per pixel) adjusts its sampling, producing no data when nothing is happening. In the car crash scenario, neuromorphic sensors will capture only the essential, changing or moving parts of the scene, ignoring unchanging, unimportant background detail. The event-driven, sparse data transmission of neuromorphic vision sensors enables fast, efficient processing, making them ideal for dynamic, real-world, low-energy, bandwidth-constrained applications, such as autonomous drones or cars, or satellite tracking in space. Another application is electronic retinal implants to restore sight to people whose vision has been lost to injury or disease.
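
The per-pixel change detection can be sketched in a few lines of Python. The toy model below is our illustration, not a real sensor driver: it emits an ON or OFF event whenever a pixel’s log-intensity moves more than a threshold away from its level at that pixel’s last event, and stays silent otherwise.

    import numpy as np

    def events_from_frames(frames, threshold=0.2):
        """Toy event-camera model over a stack of frames.
        Emits (t, x, y, polarity) whenever a pixel's log-intensity
        changes by more than `threshold` since its last event."""
        log_ref = np.log(frames[0] + 1e-6)     # per-pixel reference level
        events = []
        for t, frame in enumerate(frames[1:], start=1):
            log_now = np.log(frame + 1e-6)
            diff = log_now - log_ref
            ys, xs = np.where(np.abs(diff) > threshold)
            for x, y in zip(xs, ys):
                events.append((t, x, y, 1 if diff[y, x] > 0 else -1))
                log_ref[y, x] = log_now[y, x]  # reset reference at this pixel
        return events

    # A static scene in which one pixel brightens: only it produces an event.
    frames = np.full((5, 4, 4), 0.5)
    frames[2:, 1, 2] = 0.9
    print(events_from_frames(frames))          # [(2, 2, 1, 1)]

A real silicon retina does this asynchronously, in analogue circuitry at every pixel, with microsecond timing; the frame-based loop here is only a convenient simulation.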

Auditory Perception

Unique to mammals, the cochlea (Greek for ‘snail’) is a spiral cavity in the bony labyrinth of the inner ear. Sensory hair cells lining the cochlea bear hair-like bundles that vibrate when struck by sound waves; neurons connected to these cells translate the vibrations into spiking electrical signals that are relayed to the brain. The architecture of the cochlea amplifies and separates audio frequencies while preserving their precise timing, which is essential for calculating sound location. The human hearing system can interpret complex auditory environments comprising multiple sound sources and echoes, prioritising important information. For example, at a party, you can attend to one group’s conversation while tuning out many others. Sound engineers struggle to do the same with conventional microphones and audio processing techniques.
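
The cochlea’s front end can be approximated in software: split the sound into frequency channels with a bank of band-pass filters, then turn each channel into spikes whose timing follows the waveform. The sketch below is illustrative only; it uses off-the-shelf Butterworth filters from SciPy rather than the analogue circuits of a silicon cochlea:

    import numpy as np
    from scipy.signal import butter, lfilter

    def cochlea_spikes(audio, fs, centre_freqs, threshold=0.1):
        """Toy cochlea: band-pass filterbank + threshold-crossing spikes.
        Returns, per channel, the times of upward threshold crossings,
        preserving the fine timing of each frequency band."""
        spikes = {}
        for fc in centre_freqs:
            band_edges = [0.8 * fc / (fs / 2), 1.2 * fc / (fs / 2)]
            b, a = butter(2, band_edges, btype="band")
            band = lfilter(b, a, audio)
            up = np.where((band[:-1] < threshold) & (band[1:] >= threshold))[0]
            spikes[fc] = up / fs               # spike times in seconds
        return spikes

    fs = 16000
    t = np.arange(0, 0.05, 1 / fs)
    audio = np.sin(2 * np.pi * 440 * t)        # a pure 440 Hz tone
    out = cochlea_spikes(audio, fs, centre_freqs=[440, 1000])
    print(len(out[440]), len(out[1000]))       # many spikes at 440 Hz, few at 1000 Hz

Because the spike times track the waveform’s phase, downstream circuits can compare arrival times between the two ears to locate a sound source.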

Artificial cochleas have been implanted in humans since the 1970s. They use a microphone to digitise sounds; a processor filters the digitised sounds to prioritise audible speech; and a radio transmitter sends the processor’s output to a receiver, which converts the radio signals into electrical impulses that stimulate the cochlear nerve to signal the brain. Hearing and speech comprehension vary widely across recipients of conventional cochlear implants. At ICNS, inspired by nature, we develop artificial cochleas that replicate the superior performance and efficiency of biological auditory systems in complex environments. In the process, we gain new insights into human hearing.

Tactile Perception

A mechanoreceptor is a sensory neuron that responds to mechanical pressure or distortion. There are several types of mechanoreceptor in human skin, enabling us to sense light or heavy pressure, vibration, rough or smooth texture, shapes and edges, skin stretch, grip, and slippage. There are also mechanoreceptors in our muscles and joints. We consciously or reflexively adjust our body position and movements in response to feedback from our mechanoreceptors, sent via the neural network of our somatosensory system.

To better understand how neurons transmit sensory information and motor (movement) control signals, ICNS researchers study people with fully functioning somatosensory systems. We decode the spiking voltage signals recorded from their mechanoreceptors, mapping each response to the sensation that evoked it. Based on these studies, we are developing an artificial, neuromorphic, tactile sensory system on which we can run further experiments. We aim to develop closed-loop control schemes connecting neuroprosthetic devices to the human nervous system, returning tactile perception and motor control to people who have lost them through injury or disease.
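
The decoding step can be illustrated with a toy rate decoder. The data, the window length, and the linear gain below are all hypothetical stand-ins for values that would be fitted to real recordings:

    import numpy as np

    def firing_rate(spike_times, t, window=0.1):
        """Spikes per second in the window ending at time t (seconds)."""
        s = np.asarray(spike_times)
        return np.sum((s > t - window) & (s <= t)) / window

    def decode_pressure(spike_times, t, gain=0.02):
        """Toy linear decoder: assume pressure is proportional to the
        firing rate; the gain would come from calibration data."""
        return gain * firing_rate(spike_times, t)

    # Synthetic recording: a 50 Hz burst while the skin is pressed (0.2-0.6 s).
    spikes = np.arange(0.2, 0.6, 1 / 50)
    for t in (0.1, 0.4, 0.8):
        print(t, decode_pressure(spikes, t))   # ~0 before, ~1 during, ~0 after

Real mechanoreceptor responses also carry information in precise spike timing and in how quickly they adapt, which is why whole spike trains, not just rates, are worth decoding.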

Partners

  • University of Sheffield
  • Bristol Robotics Laboratory

Olfactory Perception

This area covers the sense of smell and biologically inspired chemical sensing. We simulate the architecture of the olfactory system, whose computing principles can be applied to many data classification problems. We design spiking networks for data classification inspired by these principles, and build dedicated, power-efficient hardware to run the networks in real time.
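
For a flavour of the building blocks involved, below is a minimal leaky integrate-and-fire (LIF) neuron, the textbook unit from which spiking classifiers are assembled. It is a generic illustration, not the networks or hardware described above:

    import numpy as np

    def lif_neuron(input_spikes, weights, tau=20.0, threshold=1.0, dt=1.0):
        """Leaky integrate-and-fire neuron.
        input_spikes: (timesteps, n_inputs) 0/1 array of input spikes.
        Integrates weighted inputs, leaks charge, fires on threshold."""
        v, fired = 0.0, []
        for t, spikes in enumerate(input_spikes):
            v += dt * (-v / tau) + np.dot(weights, spikes)
            if v >= threshold:
                fired.append(t)
                v = 0.0                        # reset after each output spike
        return fired

    rng = np.random.default_rng(0)
    inputs = (rng.random((100, 4)) < 0.2).astype(float)  # 4 noisy channels
    weights = np.array([0.5, 0.4, 0.1, 0.05])
    print(lif_neuron(inputs, weights))         # timesteps at which it fires

In a classifier, many such neurons compete, and training adjusts the weights so that each class of input drives a different neuron to fire first or most often.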
