Simple Eyelike Sensors Could Make AI Systems More Efficient
(Inside Science) -- A sensor that mimics how the human eye detects light could lead to better vision for autonomous robots and self-driving cars, a new study argues.
Modern electronic cameras are based on sensors that generate an electrical signal whenever light falls on them. In contrast, the roughly 100 million rod and cone cells of the retina -- the light-sensitive layer in the back of the eye -- only transmit signals to the brain in response to a change in light. This makes human eyes significantly more efficient than electronics in terms of both energy and computing power, explained study senior author John Labram, a device physicist at Oregon State University in Corvallis.
Scientists have previously created sensors that imitate the retina. However, these "retinomorphic" electronics involve complex circuits ill-suited for use in mass-produced sensors.
Now Labram and his colleagues have replaced these intricate circuits with a simpler alternative -- light-sensitive materials known as perovskites, which are currently being developed for use in next-generation solar power cells. "Without this recent material breakthrough, we would not have been able to make our sensor," he said.
The core of the new sensor consists of an electrically insulating glass layer coated with the perovskite methylammonium lead iodide. When exposed to light, this perovskite switches from a strong electrical insulator to a strong electrical conductor. The researchers sandwiched these layers between electrodes and found that the sensor produced a strong electrical response to a change in light but generated no further signal until the lighting changed again.
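The difference between the two sensing schemes can be illustrated with a toy model: a conventional pixel whose output simply tracks the incident light level, and an event-driven pixel whose output spikes when the illumination changes and then relaxes toward zero while the light stays constant. This is only an illustrative sketch (the `decay` parameter is a made-up relaxation constant), not the actual physics of the perovskite device.

```python
def conventional_pixel(light_levels):
    """Conventional sensor: output is proportional to incident light."""
    return [level for level in light_levels]

def retinomorphic_pixel(light_levels, decay=0.5):
    """Event-driven sensor: responds transiently to changes in light.

    The output jumps when illumination changes, then relaxes toward
    zero under steady light ('decay' models that relaxation).
    """
    outputs = []
    previous = 0.0
    response = 0.0
    for level in light_levels:
        # The change in light, not the light itself, drives the output.
        response = decay * response + (level - previous)
        outputs.append(response)
        previous = level
    return outputs

# A step of light: darkness, then constant bright illumination.
light = [0.0, 0.0, 1.0, 1.0, 1.0, 1.0]
print(conventional_pixel(light))   # keeps reporting 1.0 under steady light
print(retinomorphic_pixel(light))  # spikes at the change, then decays away
```

Under steady illumination the conventional pixel keeps emitting the same value, while the event-driven pixel's output falls back toward zero, which is the efficiency argument in miniature: nothing changes, so nothing needs to be transmitted or processed.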
"This is the first single-pixel sensor that replicates the behavior of biological retina as part of its fundamental design," Labram said. "From an industrial point of view, this could eventually have a huge impact on speed and power consumption."
The sensors could find use in applications involving rapid processing of images, including lidar, facial recognition and autonomous vehicles, said materials scientist Thomas Anthopoulos at the King Abdullah University of Science and Technology in Saudi Arabia, who did not take part in this research. "The key element of this technology is its simplicity and ability to be integrated with a range of emerging electronics, such as wearable systems, transparent displays and micro displays."
The scientists are now aiming to develop an array of these sensors to record actual visual data, "starting with 10-by-10 resolution," Labram said. They also want to connect their retinomorphic sensors with artificial intelligence systems to "better replicate the way that biological systems process stimuli," he said.
All in all, this work "is part of a larger ongoing effort to make computers more humanlike generally," Labram noted. Traditional computers are designed to carry out computations as sequences of steps, but increasingly scientists are developing so-called neuromorphic processors that are designed to mimic the human brain by performing many computations in parallel, he explained.
Just as retinomorphic sensors might prove better than conventional optical sensors, so too might neuromorphic computers one day prove significantly more efficient than conventional computers.
"The human brain consumes around 20 watts of power, and a home PC [personal computer] runs at around 100 watts," Labram said. "This doesn't sound like much, but a single PC cannot do a similar task to the human brain -- for example, real-time learning. This sort of task would require a data center rather than a PC to achieve."
The scientists detailed their findings in a study published in the journal Applied Physics Letters.
Editor's Note: Applied Physics Letters is published by AIP Publishing, which is a wholly owned subsidiary of the American Institute of Physics. Inside Science is an editorially independent science news service run out of the American Institute of Physics.