When Robots Learn to Hear
At Duke University’s General Robotics Lab, a diverse team is giving robots a new sense—teaching machines to listen, move, and even learn on their own.
Inside the lab, researchers are teaching robots to sense the world through sound. Led by Assistant Professor Boyuan Chen, the team explores how machines can perceive their environments through acoustic vibrations rather than sight alone.

“We call ourselves the full stack roboticist,” Chen said. “We work on building both the body and the mind of robots.”

Their latest innovation, SonicSense, gives robots a way to “see” through sound. The robotic hand contains contact microphones embedded in its fingertips that detect high-frequency vibrations as it touches, taps, or shakes objects. Instead of relying on cameras, SonicSense learns from the subtle acoustic cues of each interaction.

“SonicSense will sense the environment with acoustic vibrations instead of perceiving the environment from cameras,” Chen explained. “It sees the world by hearing the sound and feeling the vibrations.”
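
To make the idea concrete, here is a minimal, hypothetical sketch of how tap-based acoustic identification can work in principle. It is not the lab’s code; the object names, resonant frequencies, sampling rate, and nearest-fingerprint matching are illustrative assumptions. Each tap recording is reduced to a frequency-domain fingerprint, and an unknown tap is matched to the closest known object.

```python
# Illustrative sketch only (not SonicSense's actual pipeline): identify an
# object from the vibration signature of a single tap, using plain NumPy.
import numpy as np

SAMPLE_RATE = 48_000  # Hz; an assumed contact-microphone sampling rate
DURATION = 0.1        # seconds of vibration captured per tap


def synthetic_tap(resonant_hz: float, rng: np.random.Generator) -> np.ndarray:
    """Simulate a contact-mic recording: a decaying resonance plus sensor noise."""
    t = np.arange(int(SAMPLE_RATE * DURATION)) / SAMPLE_RATE
    ring = np.sin(2 * np.pi * resonant_hz * t) * np.exp(-60 * t)
    return ring + 0.05 * rng.standard_normal(t.size)


def spectral_fingerprint(signal: np.ndarray) -> np.ndarray:
    """Reduce a recording to a normalized magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    return spectrum / np.linalg.norm(spectrum)


rng = np.random.default_rng(0)

# Hypothetical reference objects: average the fingerprints of a few taps each.
# (A ceramic mug "rings" at a higher frequency than a wooden block.)
references = {
    name: np.mean([spectral_fingerprint(synthetic_tap(hz, rng)) for _ in range(5)], axis=0)
    for name, hz in {"ceramic mug": 2_400.0, "wooden block": 700.0}.items()
}

# A fresh, unseen tap on the mug: classify it by the nearest reference fingerprint.
unknown = spectral_fingerprint(synthetic_tap(2_400.0, rng))
guess = min(references, key=lambda name: np.linalg.norm(unknown - references[name]))
print(f"Predicted object: {guess}")  # expected: ceramic mug
```

SonicSense itself learns its models from real interaction data rather than hand-built fingerprints, but the intuition is the same: different materials and shapes vibrate differently when touched, and those differences are enough to tell objects apart.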

The project reflects Chen’s philosophy of simplicity in design, which encourages his students to build systems that are robust yet uncomplicated. 

“You start to take components out until the system breaks,” he said. “That’s when you know what really matters.”

Even in noisy environments, the robot performs reliably. During one test, the team blasted music in the lab to see if the robot’s sensors would fail. 

“We play very loud music around it, and the robots work extremely well,” Chen said. “I think it was some sound from Justin Bieber or something.”

Hearing as a new sense

For Chen, robotics represents more than mechanical advancement; it is an evolution. He often compares robots to a new species that is still developing both physically and cognitively.

“When I look at robots, I think they are almost like another species that are also evolving in both their physical form and how they think and behave,” he said.

He believes fear of automation misses the point.

“Robots are just like our partners, and they should be our collaborators,” Chen said. “Combining together, we can achieve something beyond what either humans or machines could do alone.”

Music continues to guide Chen’s approach. After more than 20 years of playing the piano, he finds parallels between composing music and designing machines.

“You’re trying to understand the fundamental principles of how music works,” he said. “In this case, we’re trying to understand how engineering principles work and you use this knowledge to compose and make novel forms of robots or a novel form of music.”

Although robotics is still in its early stages, Chen sees boundless potential ahead.

“They’re still in their infancy,” he said. “Hopefully our generation is going to be able to create this new species of robots that can last hundreds of years.”

Video by ASME’s Video Production Team. Article by Aida M. Toro.