This Lip-Syncing Robot Face Could Help Future Bots Talk Like Us
- Columbia University researchers built a humanoid face that can lip-sync speech in multiple languages by matching its lip movements to audio.
- The system uses a learning pipeline to collect lip-movement footage and an AI model that derives motor commands for speech.
- A facial action transformer converts motor commands into mouth motions that synchronize with audio.
- The robot, Emo, was able to speak in languages beyond its training set, demonstrating language-agnostic lip movement.
- CES 2026 showcased humanoid robots from companies including Realbotix and Lovense, highlighting advances in voice and interaction.
- Experts say improving lip-sync helps reduce the uncanny valley and makes robots feel more natural to humans.
- Lipson noted the model has no notion of language; it focuses purely on audio-to-lip-motion alignment.
- The research aims to support future humanoid robots used at home and work with more natural communication.
- Columbia researchers describe Emo’s lip-sync system as a step toward more believable robot speech in real-world use.
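The pipeline the bullets describe (audio in, motor commands out, with no language model in between) can be sketched roughly as follows. This is a minimal illustrative mapping, not the researchers' actual model; every name and formula here is hypothetical, and it only shows why such a system can be language-agnostic: it reacts to acoustic energy, not to words.

```python
# Hypothetical sketch: driving a mouth actuator from raw audio alone.
# The real Emo system uses a learned facial action transformer; here we
# stand in a trivial energy-based mapping to illustrate the idea that
# no notion of language is needed to align lips with sound.

def audio_to_motor_commands(frames, max_open=1.0):
    """Map each audio frame's mean amplitude to a mouth-opening
    command in [0, max_open], normalized by the loudest sample."""
    peak = max(max(abs(s) for s in frame) for frame in frames) or 1.0
    commands = []
    for frame in frames:
        energy = sum(abs(s) for s in frame) / len(frame)
        commands.append(min(max_open, energy / peak * max_open))
    return commands

# Silence keeps the mouth closed; louder frames open it wider,
# regardless of what language (if any) is being spoken.
cmds = audio_to_motor_commands([[0.0, 0.0], [0.5, 0.5], [1.0, 1.0]])
```

Because the mapping depends only on the audio signal, the same code behaves identically for any language, which mirrors the language-agnostic property the researchers report.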
