Scientists try to teach robots to laugh

Anyone who has shared a laugh with a friend knows how deeply affecting humor can be, so it stands to reason that our future robot companions have a better chance of gaining our trust and affection if they can laugh with us. But just because a robot tells jokes doesn’t mean it can respond to them properly. Did a comment warrant a polite bot giggle or a full-out bot laugh? The correct answer could mean the difference between an approachable android and a metallic jerk.

That’s why Japanese researchers are trying to teach humorless robot nerds to laugh at the right time and in the right way. It turns out that training an AI to laugh isn’t as simple as teaching it to respond to a desperate plea to cancel a subscription on a customer-service phone tree. “Systems trying to emulate everyday conversation still struggle with knowing when to laugh,” reads a study published Thursday in the journal Frontiers in Robotics and AI.

Erica, the humanoid robot, is in the lab getting a sense of humor. (Osaka University, ATR)

The study details the team’s research to develop a conversational AI system focused on shared laughter to make conversation between humans and robots more natural. They envision it being integrated into existing conversational software for robots and agents, which are already learning how to detect emotions and deal with complexities like vague human commands.

“We believe that one of the important functions of conversational AI is empathy,” Koji Inoue, an assistant professor of computer science at Kyoto University in Japan and a co-author of the study, said in a statement. “Conversation is, of course, multimodal, not just responding correctly. So we decided that one way a robot can empathize with users is to share their laughter.”

The key is that the system not only recognizes laughter, but also decides whether to laugh in response, and then chooses the right kind of laugh for the occasion. “The most significant result of this paper is that we have shown how we can combine all three tasks in a single robot,” Inoue said. “We think this kind of combined system is necessary for proper laugh behavior, not just detecting and responding to a laugh.”
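To make the shape of that combined system concrete, here is a minimal Python sketch of a three-stage decision like the one the study describes: detect the user’s laugh, decide whether to laugh along, then select the kind of laugh. It is an illustration rather than the researchers’ code; the function names, thresholds, and the “social” versus “mirthful” laugh labels are assumptions made for this example, and in the real system each stage would be a trained model rather than a simple threshold.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class LaughDecision:
    should_laugh: bool
    laugh_type: Optional[str]  # e.g. "social" (polite chuckle) or "mirthful" (hearty laugh)


def detect_user_laugh(laugh_probability: float) -> bool:
    # Stage 1: was the user's last utterance a laugh? (Illustrative threshold.)
    return laugh_probability > 0.5


def decide_shared_laugh(context_score: float) -> bool:
    # Stage 2: does the conversational context call for laughing along?
    return context_score > 0.6


def choose_laugh_type(context_score: float) -> str:
    # Stage 3: pick a polite chuckle or a full, hearty laugh.
    return "mirthful" if context_score > 0.8 else "social"


def respond_to_turn(laugh_probability: float, context_score: float) -> LaughDecision:
    # Combine the three subsystems: detect, decide, then select a laugh.
    if not detect_user_laugh(laugh_probability):
        return LaughDecision(False, None)
    if not decide_shared_laugh(context_score):
        return LaughDecision(False, None)
    return LaughDecision(True, choose_laugh_type(context_score))


# Example: a confident laugh detection in a clearly funny moment yields a
# hearty laugh; a borderline moment yields only a polite one.
print(respond_to_turn(laugh_probability=0.9, context_score=0.85))
print(respond_to_turn(laugh_probability=0.9, context_score=0.65))
```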

To collect training data on the frequency and types of shared laughter, the team turned to Erica, an advanced humanoid robot designed by Japanese scientists Hiroshi Ishiguro and Kohei Ogawa, as a platform to study human-robot interaction. Erica can understand natural language, has a synthesized human voice, and can blink and move her eyes while she listens to humans talking about their problems.

The researchers recorded dialogue between male Kyoto University students who took turns having a face-to-face conversation with Erica while amateur actresses in another room teleoperated the bot through the microphone. The scientists chose that setting knowing that there would naturally be differences between how humans talk to each other and how they talk to robots, even those controlled by another human.

“We wanted, as far as possible, to have the laughter model trained in conditions similar to those of a real human-robot interaction,” Kyoto University researcher Divesh Lala, another co-author of the study, told me.

On the left, a human talks to Erica, the robot, who is controlled by an actress from a separate room. (Kyoto University)

Based on the interactions, the researchers created four short audio dialogues between the humans and Erica, who was programmed to respond to conversations with varying levels of laughter, from none to frequent laughter in response to her human conversation partners. Volunteers then rated those interludes on empathy, naturalness, human-likeness, and understanding. Shared-laughter scenarios performed better than those in which Erica never laughed or laughed whenever she detected human laughter, without using the other two subsystems to filter for context and response.

Researchers at Kyoto University have already programmed their shared-laughter system into robots other than Erica, though they say the humanoid howls could still sound more natural. In fact, even as robots become more and more realistic, sometimes eerily so, roboticists admit that infusing them with distinctively human traits poses challenges that go beyond coding.

“It may be 10 to 20 years before we can finally have a casual conversation with a robot like we would with a friend,” Inoue said.

Needless to say, Erica isn’t quite ready for the stand-up circuit yet. But it’s intriguing to think that there will soon come a day when she really feels like she gets your jokes.
