Jabba the Hutt inspires human-style eyes for robots

How it works

It has taken over four years of lab experiments to get the right materials for robotic eyes to function like human eyes. You can see the results in the lead image. To replicate the soft tissue of the human iris, we 3D-printed a colourized gelatine membrane using a digital map of a human eye. Unlike glass and acrylic, gelatine is natural, highly flexible, and can hold an image. In the center of each iris are two holes, which we refer to as portals: one fits a camera to see the world, while the other holds a photo-sensor to measure light.

To make the pupils expand and contract, as they do when people are happy or scared, we made an artificial muscle from a stretched silicone membrane coated on each side with graphene. When the coatings are activated, they squeeze the silicone membrane together, creating a contraction effect. Graphene is so fine that a single coating allows light to pass through it, like a human eye.

An artificial muscle is activated by creating positive and negative static charges on either side of the membrane, which compress and relax the muscle as the voltage rises and falls. Think of squeezing an object from both sides and then letting it go: its surface area increases and decreases with the amount of pressure you apply. To further help create human-like eyes, we used a flexible 3D-printed material to hold the artificial muscles and sensors in place.
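The voltage-to-squeeze relationship described above can be sketched with the standard Maxwell-stress model for dielectric elastomer actuators. This is an illustrative approximation, not the authors' implementation: the membrane thickness and permittivity below are assumed values, not measurements from the actual device.

```python
# Sketch of the electrostatic pressure that squeezes the artificial-muscle
# membrane, using the Maxwell-stress model for a dielectric elastomer
# actuator. All constants are illustrative assumptions.

EPS_0 = 8.854e-12   # vacuum permittivity, F/m
EPS_R = 2.8         # assumed relative permittivity of silicone
THICKNESS = 50e-6   # assumed membrane thickness, m

def maxwell_pressure(voltage: float) -> float:
    """Electrostatic pressure compressing the membrane, in pascals."""
    field = voltage / THICKNESS        # electric field across the membrane, V/m
    return EPS_0 * EPS_R * field ** 2

# Raising the voltage increases the squeeze; the membrane thins and its
# surface area grows, which changes the pupil aperture.
print(maxwell_pressure(3000.0) > maxwell_pressure(1000.0))
```

Because the pressure scales with the square of the field, small voltage changes near the top of the operating range produce much larger aperture changes than the same changes near the bottom.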

Another feature of the robot eye is that it can respond to both light and emotion simultaneously. This is vital for accurately emulating the functionality of the human eye, and it is something robot eyes have not been able to do before.

A microprocessor switches the robot’s eyes between emotion and light modes so the eyes can react just as human eyes do: in humans, our pupils dilate in darker conditions and when we are happy, and contract in bright light and when we are more unhappy.

When the robot is interacting with someone, a camera uses machine-learning software to predict their emotional state from their facial expressions. This assigns an emotion to the robot, such as happiness or sadness, and sends a signal to the pupils to expand or contract accordingly. Similarly, in light mode, the robot’s pupil dilates in darkness and shrinks in brighter conditions.
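The two-mode behaviour above can be summarised as a small control function. This is a minimal sketch under assumed names: the mode strings, emotion labels, and the normalised 0-to-1 pupil scale are placeholders for illustration, not the authors' actual API.

```python
# Illustrative two-mode pupil controller: a microprocessor-style routine
# that picks a target pupil size from either the light sensor or the
# predicted emotion. Labels and scales are hypothetical.

def target_pupil(mode: str, light_level: float = 0.5,
                 emotion: str = "neutral") -> float:
    """Return a target pupil size from 0.0 (contracted) to 1.0 (dilated)."""
    if mode == "light":
        # Dilate in darkness, shrink in brighter conditions.
        return 1.0 - light_level
    if mode == "emotion":
        # Expand for positive emotions, contract for negative ones.
        return {"happy": 0.9, "neutral": 0.5, "sad": 0.2}.get(emotion, 0.5)
    raise ValueError(f"unknown mode: {mode}")

# A dark room produces a wider pupil than a bright one...
print(target_pupil("light", light_level=0.1) > target_pupil("light", light_level=0.9))
# ...and a happy face produces a wider pupil than a sad one.
print(target_pupil("emotion", emotion="happy") > target_pupil("emotion", emotion="sad"))
```

In a real device this target would then be converted into the drive voltage for the artificial muscles, closing the loop between sensing and actuation.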

Why robots need human eyes

The benefit of creating more lifelike robots is that they allow people to interact with technology more naturally. This is important, as for some people a human-like interface is comforting and can improve how humans react to robots.

This should make it easier for robots to socially interact with people, which could be useful for people living alone. Over time, robots will hopefully be able to provide them with additional support and companionship. To learn more about how robots’ eyes needed to react, I carried out an experiment in which people watched different videos and then stared into an artificial light at different levels of brightness. Participants wore a pupil-tracking headset which recorded their pupil dilation and light frequencies, and this data was used to fine-tune the robot eyes.

In the final experiment, robotic eyes were installed inside a realistic humanoid robot and compared against a robot with standard acrylic eyes. They were then tested out on humans to measure emotion and attention. Participants who noticed the robots’ eye dilation showed heightened emotion and attention levels.

These results suggest that these robotic eyes help people react to robots more naturally. This is important because otherwise humanoid robots appear unemotional. By replicating subtle gestures and cues, we increase our understanding and familiarity with robots. This includes things like lip synchronization, speech tonality, and facial expressions.

One day we could have robots that are so human-like that they are virtually indistinguishable from ourselves, even when looking into their eyes.

The code, schematics, and video footage of the robotic eyes are now available on GitHub for any engineer to improve the design and use them in their own projects.

Article by Carl Strathearn, Research Fellow, Computing, Edinburgh Napier University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Story by The Conversation

An independent news and commentary website produced by academics and journalists.
