Biology-Inspired Cameras on the International Space Station

The first neuromorphic devices to reach space are a technology test case. They will record ‘space lightning’.

Western’s neuromorphic cameras will be installed on the International Space Station’s Columbus research module for at least a year from late 2021.

In late 2021, the International Space Station (ISS) will receive a pair of ‘neuromorphic’ cameras designed by Western’s Falcon Neuro project to document upper-atmosphere sprites and keep an eye out for space debris.

A bit like lightning, sprites are spectacular electric discharges. But these streak out of the top of storms towards the edge of the atmosphere. “Modelling of their electron interactions has shown they might be generating greenhouse gases and disrupting satellite communication systems,” says Associate Professor Greg Cohen from Western’s International Centre for Neuromorphic Systems (ICNS). “Yet they live such brief lives that they are extremely difficult to study, and poorly understood.” 

Due to be launched in November, Falcon Neuro’s cameras will be part of a larger pallet of experiments that will be fixed to the outside of the ISS Columbus module by the station’s robotic arm. There, a year-long observation campaign will address a fundamental lack of data on sprites and keep an eye out for space junk. But perhaps just as importantly, it will mark the first use in space of neuromorphic engineering: designs that seek to replicate the efficiencies of biology.

THE EYES HAVE IT

Western’s neuromorphic cameras mimic the way that photoreceptors in our eyes only send signals to the brain when activated by light, cutting out vast quantities of non-essential information. The hope is that they will bring much-needed cost, power and data efficiencies to space research.

For space-based applications, where power and data transmission are at a premium, “the amount of data conventional cameras produce is just way more than you could ever stand to bring down from the space station,” explains Matthew McHarg, a research director at the United States Air Force Academy (USAFA) in Colorado Springs, who has been working with Falcon Neuro to ensure the cameras can reach and survive space conditions.

Researchers have previously studied sprites using high-speed digital cameras on satellites or aeroplanes. These cameras use a semiconductor chip containing an array of light-sensitive pixels that convert light into electrical signals, which are periodically read out together to create a single image frame. “But if you’re looking at the sky and nothing’s happening, a normal camera produces whole frames of information that tell you nothing,” says Cohen. High-speed, high-resolution cameras can produce up to 100,000 frames per second, generating vast quantities of useless data.
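A rough calculation shows how quickly frame data piles up. The figures below are illustrative assumptions, not the specifications of any particular instrument:

```python
# Back-of-the-envelope data rate for a high-speed frame camera.
# Illustrative assumptions (not Falcon Neuro's actual specifications):
# a 1-megapixel sensor, 8 bits (1 byte) per pixel, 100,000 frames per second.
PIXELS_PER_FRAME = 1_000_000
BYTES_PER_PIXEL = 1
FRAMES_PER_SECOND = 100_000

bytes_per_second = PIXELS_PER_FRAME * BYTES_PER_PIXEL * FRAMES_PER_SECOND
print(f"{bytes_per_second / 1e9:.0f} GB per second")  # -> 100 GB per second
```

Even under these modest assumptions, a single camera would generate 100 gigabytes every second, the vast majority of it recording nothing at all.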

Falcon Neuro’s cameras are instead fitted with an array of pixels, each wired to operate like an individual camera that delivers data only when the light intensity falling on it changes. “If something moves or suddenly gets brighter or darker — like a lightning strike, for example — those pixels tell you that something’s changed,” Cohen explains.

Falcon Neuro’s cameras are each about the size of a conventional single-lens reflex camera and feature far fewer pixels than a smartphone camera, but they are sufficient to capture sprites. Their primary task is to map exactly where sprites occur, how often they form, and what might trigger them. This may also help us to understand more about the origins of conventional lightning, which could feed into weather and climate models.

Each pixel in Cohen’s neuromorphic camera has dedicated circuitry to process its signals, so that it can immediately report changes in light and time-stamp them with microsecond precision. 
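The output of such a sensor is best thought of as a stream of time-stamped ‘events’ rather than a sequence of frames. The sketch below illustrates the idea in Python; the field names and values are assumptions for illustration, not Falcon Neuro’s actual data format:

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One pixel's report of a brightness change (illustrative format)."""
    x: int             # pixel column
    y: int             # pixel row
    timestamp_us: int  # time of the change, in microseconds
    polarity: bool     # True if the pixel got brighter, False if darker

# A quiet scene produces no events at all; a sudden flash such as a
# sprite produces a short burst of events from only the affected pixels.
events = [
    Event(x=120, y=45, timestamp_us=1_000_002, polarity=True),
    Event(x=121, y=45, timestamp_us=1_000_005, polarity=True),
]

# Downstream processing then works only on what actually changed:
brightening = [e for e in events if e.polarity]
```

Because an empty sky generates no events, the data volume scales with activity in the scene rather than with recording time.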

“And if you reduce the amount of data, you don’t just reduce the power draw of the camera, you reduce the amount of power in the processing and the transmission of data back to Earth,” explains Cohen. 

Elsewhere, neurons are also the inspiration behind highly efficient, low-power processors, which are attracting significant early innovation investment from NASA, Boeing, Intel, IBM, General Motors and others.

Neuromorphic sensor pixels are wired to take readings only when the light falling on them changes.

Need to know

  • In late 2021, two cameras designed by Western’s Falcon Neuro team will become the first neuromorphic technology in space. 
  • Falcon Neuro’s cameras mimic the way that photoreceptors in our eyes only send signals to the brain when activated by light.
  • Neuromorphic technology can reduce the amount of empty data captured. 

SPACE TO GROW

Associate Professor Greg Cohen from Western’s International Centre for Neuromorphic Systems (ICNS).

McHarg runs a research team of cadets at USAFA’s Space Physics and Atmospheric Research Center who build science payloads destined for space. “We didn’t really know much about neuromorphic cameras,” explains McHarg. “So, I was very excited to team up with Greg on this.” One goal, he says, is simply to find out how the cameras operate in the harsh space environment. “We hope that this is the start of a whole bunch of uses of these cameras in space,” he adds.

“It’s a ludicrously expensive business, launching and installing equipment on the ISS,” Cohen points out, recalling that he offered to travel to the ISS himself to fit the cameras. But NASA “very kindly rejected my offer,” he says, laughing. “I said to them once, ‘What if I installed a thumbprint reader, so that you needed me to activate them?’ And they said, ‘We’ll absolutely send your thumb up for you’.”

Western’s neuromorphic first has been made possible by a $5.4 million contract from Australia’s Defence Innovation Hub to develop neuromorphic cameras for use beyond Earth’s atmosphere. Awarded in 2019, it is the largest contract the funder has ever granted.

EYE OUT FOR SPACE CLUTTER

One of Western’s neuromorphic cameras will also keep watch for satellites and fragments of space junk. This is part of a wider ‘space situational awareness’ campaign to track the positions of thousands of objects that increasingly congest the edge of space. 

When satellites reach the end of their lives, some are simply abandoned. This growing cloud of space junk risks colliding with operational satellites, potentially posing a threat to communication networks, global positioning systems and weather satellites. A direct hit from a fragment of debris no bigger than a CD, travelling in orbit at 28,000 km/h, could destroy a satellite.
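To see why such a small object is so destructive, a rough kinetic-energy calculation helps; the 15-gram mass used below is an assumed, typical figure for a CD:

```python
# Rough kinetic energy of a CD-sized debris fragment at orbital speed.
# Assumption for illustration: the fragment has a mass of about 15 grams.
mass_kg = 0.015
speed_m_per_s = 28_000 * 1000 / 3600  # 28,000 km/h -> ~7,778 m/s

kinetic_energy_j = 0.5 * mass_kg * speed_m_per_s ** 2
print(f"{kinetic_energy_j / 1000:.0f} kJ")  # -> ~454 kJ
```

That is roughly the kinetic energy of a one-tonne car travelling at about 110 km/h, concentrated in an object that fits in the palm of a hand.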

“All satellites are just inherently fragile,” says Stacie Williams, Space Science Architect at the Air Force Office of Scientific Research in Arlington, Virginia. “A small piece of debris can cause a lot of problems up there. And there’s a lot. It’s kind of like trying to monitor mosquitoes in my backyard here in Virginia in summer.”

While one of Western’s cameras will look down at the Earth, the other will stare at the Earth’s horizon. “The one looking at the horizon will be able to see the upper edge of space, and we should be able to see low Earth-orbit satellites,” says Cohen. That will allow them to gather space situational data that is vital to protect more than 4,000 satellites in orbit around the Earth.

“We need new, creative sensors to effectively, and at a reasonable cost, track the debris that we’re concerned about,” explains Williams.

Most monitoring of debris relies on either expensive radar facilities or optical telescopes that function only at night, and Cohen says that neuromorphic cameras could help to provide much more affordable and accurate tracking data.

The Western team has already tested neuromorphic cameras for this in a mobile observatory called Astrosite. Developed in collaboration with the Royal Australian Air Force, the Astrosite neuromorphic telescope has been able to track objects from low Earth orbit all the way out to distant geostationary orbits.

One advantage on the ground is that while normal telescopes have to remain as still as possible to avoid blurry images, neuromorphic cameras are so fast that they don’t have that problem. And because the cameras respond to changes in light intensity, they can also function during the day, allowing them to monitor the sky continuously. “There are just so many benefits, because they are so different from traditional cameras,” Williams points out.

Meet the Academic | Associate Professor Gregory Cohen

Gregory Cohen is an Associate Professor in Neuromorphic Systems at the International Centre for Neuromorphic Systems (ICNS) at Western Sydney University and program lead for neuromorphic algorithms and space applications.  

He received a BSc(Eng) in Electrical and Computer Engineering, an MSc(Eng), and a BCom(Hons) in Finance and Portfolio Management from the University of Cape Town, South Africa, in 2007, 2008 and 2010, respectively, and a joint PhD in signal processing and neuromorphic engineering from Western Sydney University, Sydney, Australia, and the University of Pierre and Marie Curie in Paris, France.

Before returning to research, he worked in industry at several start-ups and established engineering and consulting firms, including as a consulting engineer in the field of large-scale HVAC from 2007 to 2009, as an electronic design engineer from 2009 to 2011, and as an expert consultant for Kaiser Economic Development Practice in 2012.

Credit

© DAVID DUCROS / SCIENCE PHOTO LIBRARY; © Michael Amendolia; © NASA/Unsplash
Future-Makers is published for Western Sydney University by Nature Research Custom Media, part of Springer Nature.