How does our brain process visual information? The neuroscience of vision is among the most intensely studied fields in science. Yet due to the vast complexity of the visual system and our technical limitations in probing the living human brain, we are still unable to definitively answer many of the most basic questions about how the visual system operates. The field of Neuromorphic Engineering turns the problem of understanding the brain on its head. Instead of attempting to study the brain's vast array of electrical and chemical pathways in minute phenomenological detail, Neuromorphic Engineers attempt to replicate the superior performance of its systems under the same power, speed, accuracy, and structural constraints. The assumption is that these constraints significantly narrow the solution space, so that any developed solution meeting the same requirements is likely to share functional properties with the mechanisms used in the brain. Such solutions are therefore likely to provide deep insights into how the brain processes information.
Neuromorphic engineers have developed a range of biologically inspired sensors that perceive the world in a manner similar to the human eye. One such sensor, the Dynamic Vision Sensor (DVS) or Silicon Retina, operates entirely differently from a conventional camera. The DVS works in an event-based manner, that is, without sequentially capturing frames via a global shutter. Instead, it aims to model the photo-receptor circuits present in the retina: each pixel operates independently and generates spikes (events) only in response to detected changes in the visual scene.
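The per-pixel change-detection principle described above can be illustrated with a minimal simulation. The sketch below is not the circuitry of any real DVS: it emulates the common software model in which each pixel tracks a reference log-intensity and emits a signed event whenever the current log intensity departs from that reference by more than a contrast threshold. The `threshold` value and the `(t, y, x, polarity)` event tuple format are illustrative assumptions, not a sensor specification.

```python
import numpy as np

def dvs_events(frames, threshold=0.15, eps=1e-6):
    """Sketch of DVS-style event generation from a frame sequence.

    Each pixel independently compares its current log intensity to a
    per-pixel reference. When the difference exceeds `threshold`
    (an illustrative contrast value), the pixel emits an event with
    positive or negative polarity and its reference is reset.
    """
    frames = np.asarray(frames, dtype=np.float64)
    ref = np.log(frames[0] + eps)   # per-pixel reference log intensity
    events = []                     # list of (t, y, x, polarity) tuples
    for t, frame in enumerate(frames[1:], start=1):
        log_i = np.log(frame + eps)
        diff = log_i - ref
        # pixels whose log-intensity change crossed the contrast threshold
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for y, x in zip(ys, xs):
            events.append((t, int(y), int(x), 1 if diff[y, x] > 0 else -1))
            ref[y, x] = log_i[y, x]  # reset only the pixels that fired
    return events
```

Note that a static scene produces no events at all: only the pixels that see a change contribute output, which is what gives event-based sensors their sparse, low-latency data stream.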
As every good engineer knows, one of the best ways to understand how a system works is to investigate its failure modes. The investigation and replication of modes of failure, known as Forensic Engineering, allows us to understand how the design of a complex system, when interacting with a complex environment, can produce unexpected outputs. By studying the edge cases where the biological visual system responds in an unexpected or incorrect manner, we can gain deep insights into its internal functioning.
Event-based sensors such as the DVS allow us to perform a wide range of experiments in controlled as well as natural environments. By capturing stimuli that cause optical illusions in the same event-based way as the human eye, we can attempt to design and investigate processing networks that reproduce the same effects. This would provide us with real-time working models of the key functional pathways that give rise to the illusions. Once developed, these working "faulty" visual processing systems can be tested in the kind of cluttered natural visual environments where their biological counterparts originally evolved, and their performance evaluated on the types of visual detection, recognition, and tracking tasks that once made the difference between life and death. In this way, real, working, neuromorphic, event-based sensor-processor systems operating in the real world may allow us not only to investigate optical illusions but also to gain insights into why they evolved in the first place and what functions they serve.