New optical sensor imitates the human eye

The sensor imitates the retina’s response to movement, a crucial step in the development of neuromorphic computing.

Researchers have created a new type of optical sensor, designed to mimic the human eye’s ability to detect changes in its visual field. The team from Oregon State University says that this is a major breakthrough in the fields of image recognition, robotics, and artificial intelligence.

Unlike traditional sensors, the human-like artificial eye is designed to work with human-like AI, a match that could unleash the full potential of breakthroughs in both hardware and software.

Mimicking the Human Brain with Neuromorphic Computing 

The new sensor could pair well with the latest neuromorphic computing technology, which is increasingly used for AI in applications like image recognition. Instead of processing information in sequence, as traditional computers do, neuromorphic processors rely on integrated electrical circuits that mimic the circuitry of the brain.

In other words, instead of a single pathway, the neuromorphic processor uses a web of pathways.

But the kind of information current optical sensors provide lags behind the processing potential of today’s neuromorphic computing.

“Even though the algorithms and architecture designed to process information are becoming more and more like a human brain, the information these systems receive is still decidedly designed for traditional computers,” team member and engineer John Labram said in a statement.

Most sensor technology, like the chips in smartphones or digital cameras, uses sequential processing: it scans and captures images pixel by pixel. But the algorithms that process visual information are becoming more complicated, especially in AI and neuromorphic computing applications.
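To see why pixel-by-pixel capture is a poor match for change-hungry algorithms, here is a rough Python sketch of the conventional approach: the system records every pixel of every frame and only finds motion by diffing frames in software, work a change-sensitive sensor would perform in the hardware itself. The array sizes, threshold, and function name are illustrative assumptions, not details from the research.

```python
import numpy as np

def frame_difference(prev_frame: np.ndarray, frame: np.ndarray,
                     threshold: float = 0.1) -> np.ndarray:
    """Flag pixels whose brightness changed between two full frames."""
    return np.abs(frame.astype(float) - prev_frame.astype(float)) > threshold

# Two synthetic 4x4 "frames" in which a single bright spot moves one pixel.
prev_frame = np.zeros((4, 4))
prev_frame[1, 1] = 1.0
frame = np.zeros((4, 4))
frame[1, 2] = 1.0

# Conventional pipeline: capture every pixel of both frames,
# then compare them in software to locate the movement.
print(frame_difference(prev_frame, frame).astype(int))
```

Only two pixels actually changed, yet the frame-based pipeline had to read and store all of them first; a retinomorphic sensor aims to report just the change directly.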

Mimicking the Human Eye

The idea is that as computer brains “think” more like human brains, they need to be paired with a human-like eye device — a retinomorphic sensor. With this new work, published in Applied Physics Letters, the computer eye is catching up to the capabilities of the neuromorphic computing brain.

Most previous attempts at building eye-like sensors have relied on complex hardware or software. What makes this new sensor unique is its use of perovskite, a material best known for its use in solar cells. The sensor uses ultrathin layers of perovskite semiconductors that, when exposed to light, transform from electrical insulators into conductors.

In Labram’s optical sensor, the thin perovskite layers function like a capacitor that varies its ability to store electrical energy under illumination. The sensor reacts to light much as the human eye does: when it senses a change in illumination, it registers a sharp signal, then settles back to its original state.

“The way we test it is, basically, we leave it in the dark for a second, then we turn the lights on and just leave them on,” Labram said. “As soon as the light goes on, you get this big voltage spike, then the voltage quickly decays, even though the intensity of the light is constant. And that’s what we want.”
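To make that spike-and-decay behavior concrete, here is a minimal Python sketch that models the sensor as a simple first-order high-pass filter: the output jumps when the light level changes and drifts back toward zero while the light stays constant. The time constant and function name are illustrative assumptions, not measured properties of the Oregon State device.

```python
import numpy as np

def retinomorphic_response(intensity: np.ndarray, tau: float = 20.0) -> np.ndarray:
    """Model a capacitor-like sensor that responds to changes in light, not steady light."""
    alpha = tau / (tau + 1.0)            # decay factor per time step (illustrative value)
    v = np.zeros_like(intensity, dtype=float)
    for t in range(1, len(intensity)):
        # Output rises with any change in intensity, then decays back toward zero.
        v[t] = alpha * (v[t - 1] + intensity[t] - intensity[t - 1])
    return v

# Lights off for 100 steps, then on at a constant level for 200 steps.
light = np.concatenate([np.zeros(100), np.ones(200)])
voltage = retinomorphic_response(light)

print(voltage[99:105].round(3))   # sharp spike the moment the light turns on
print(voltage[-1].round(3))       # ...which has decayed back toward zero despite constant light
```

This is the behavior Labram describes in the quote above: a large voltage spike when the light switches on, followed by a rapid decay even though the illumination itself has not changed.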

The team says the new sensor could be a natural fit for applications like self-driving cars and robotics, where machines need to track moving objects.

