This lip-syncing robotic face could help future robots talk like us


The slight anxiety that creeps up your spine when you see something that almost, but not quite, behaves like a human is a big deal in robotics, especially for robots designed to look and talk like us.

This strange feeling is called the uncanny valley. One way roboticists are working to bridge that valley is by matching a robot's lip movements to its voice. On Wednesday, Columbia University announced research that delves into how a new wave of robot faces can speak more realistically.

Hod Lipson, an engineering professor at Columbia University who worked on the research, told CNET that the main reason robots feel so "weird" is that they don't move their lips like we do when they speak. "We aim to solve this problem, which has been neglected in robotics," Lipson said.

This research comes amid growing hype around robots designed for use at home and at work. At CES 2026 last week, for example, CNET watched a group of robots designed to interact with people. Everything from Boston Dynamics' latest Atlas robot to home robots that fold laundry, and even a turtle-shaped robot built for environmental research, was featured at the world's largest technology exhibition. If CES is any indication, 2026 could be a big year for consumer robots.


Among these robots are humanoids with bodies, faces and artificial skin that mimic our own. The CES cohort included human-looking robots from Realbotix that can staff information kiosks or provide comfort to humans, as well as a robot from Lovense designed for relationships and equipped with AI to "remember" intimate conversations.

But a split-second mismatch between lip movement and speech can mean the difference between a machine you can form an emotional attachment to and one that is little more than a disturbing animation.

So, if people are going to accept humanoid robots "living" among us in everyday life, those robots had better not make us feel uncomfortable when they speak.

Watch this: Lip-syncing robot sings a song

Lip-syncing robots

To make robots with human faces that speak like us, a robot’s lips must be carefully synchronized with the sound of its speech. A research team at Columbia University has developed technology that helps robots’ mouths move like ours do by focusing on how language sounds.

First, the team built a robot face with a mouth that could talk — and sing — in a way that minimized the uncanny valley effect. The robot’s face, made of silicone skin, contains magnetic connectors for complex lip movements. This enables the face to form lip shapes covering 24 consonants and 16 vowels.
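The mapping from speech sounds to lip shapes can be pictured as a lookup from phonemes to mouth poses. The sketch below is a hypothetical illustration of that idea only: the phoneme labels, the three-parameter mouth pose, and the target values are all invented for this example and are not taken from the Columbia team's design.

```python
# Hypothetical phoneme-to-lip-shape lookup. The real robot forms shapes
# covering 24 consonants and 16 vowels; the entries below are invented
# illustrations, not the paper's actual parameterization.

# Each lip shape is a target pose (jaw_open, lip_round, lip_spread),
# with each value in [0, 1].
LIP_SHAPES = {
    "AA": (0.9, 0.2, 0.3),  # open vowel, as in "father"
    "IY": (0.2, 0.1, 0.9),  # spread vowel, as in "see"
    "UW": (0.3, 0.9, 0.1),  # rounded vowel, as in "boot"
    "M":  (0.0, 0.4, 0.4),  # lips closed
    "F":  (0.2, 0.2, 0.5),  # lower lip to upper teeth
}

NEUTRAL = (0.3, 0.3, 0.3)  # relaxed mouth, used as a fallback

def shape_for(phoneme: str) -> tuple:
    """Return the target lip pose for a phoneme, defaulting to neutral."""
    return LIP_SHAPES.get(phoneme, NEUTRAL)

print(shape_for("M"))   # (0.0, 0.4, 0.4)
print(shape_for("ZZ"))  # unknown phoneme -> (0.3, 0.3, 0.3)
```

A table like this is only the static half of the problem; the research described below learns the motion between poses directly from audio.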

Watch this: The lip-syncing robot face makes sounds for individual words

To match lip movements to speech, the team designed a "learning pipeline" to collect visual data from lip movements. The AI model trains on this data, then generates reference points for motor commands. Next, a "facial motion transducer" converts those motor commands into mouth movements synchronized with sound.
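The core of the pipeline described above is a model that maps audio directly to actuator positions, frame by frame. Here is a minimal sketch of that idea, assuming MFCC-like audio features and six lip/jaw motors; the feature count, motor count, and the tiny linear model standing in for the trained network are all assumptions for illustration, not the paper's architecture.

```python
import numpy as np

N_FEATURES = 13   # audio features per frame (assumed, MFCC-like)
N_MOTORS = 6      # lip/jaw actuators on the robot face (assumed)

rng = np.random.default_rng(0)

# A tiny random linear model standing in for the trained network.
W = rng.normal(scale=0.1, size=(N_FEATURES, N_MOTORS))
b = np.zeros(N_MOTORS)

def audio_frame_to_motor_commands(frame: np.ndarray) -> np.ndarray:
    """Map one frame of audio features to motor positions in (0, 1)."""
    raw = frame @ W + b
    return 1.0 / (1.0 + np.exp(-raw))  # squash into the actuator range

# Usage: a 100-frame utterance becomes a 100-step motor trajectory,
# which a "facial motion transducer" stage would play back on the face.
utterance = rng.normal(size=(100, N_FEATURES))
trajectory = np.array([audio_frame_to_motor_commands(f) for f in utterance])
print(trajectory.shape)  # (100, 6)
```

Because the model consumes raw sound features rather than words, nothing in it is tied to a particular language, which is what lets the approach generalize as described next.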

Using this framework, the robot face, called Emo, was able to "speak" multiple languages, including ones that were not part of its training, such as French, Chinese and Arabic. The trick is that the framework analyzes the sounds of a language, not the meaning behind them.

“We avoided the language-specific problem by training a model that goes directly from sound to lip movement,” Lipson said. “There is no idea of language.”

Watch this: Lip-syncing robot face introduces itself

Why does a robot need a face and lips?

Humans have worked alongside robots for a long time, but they have always looked like machines, not humans — the disembodied, highly mechanical-looking arms on assembly lines or the chunky, robot-vacuum-cleaner-like disc that wanders around our kitchen floors.

However, as AI language models used in chatbots become more widespread, technology companies are working hard to teach bots how to communicate with us using language in real time.

There is a whole field of study, called human-robot interaction, that examines how robots coexist with humans physically and socially. In 2024, a study from Berlin with 157 participants found that a robot's ability to express empathy and emotion through verbal communication is crucial to interacting effectively with humans. Another 2024 study, from Italy, found that active speech is important for cooperation between humans and robots on complex tasks such as assembly.

If we are going to rely on robots at home and at work, we must be able to talk to them as we do to each other. In the future, Lipson says, research using lip-syncing robots will be useful for any type of humanoid robot that needs to interact with people.

It’s also easy to imagine a future in which humanoid robots look identical to us. Careful design can ensure people understand they’re talking to a robot, not a person, Lipson says. One example is requiring humanoid robots to have blue skin, “so they cannot be confused with humans.”
