Vulnerable robots change the social dynamics of a group

Mar 10, 2020

Hello, HAL. Do you read me, HAL?

“Imagine a robot in a factory whose task is to distribute parts to workers on an assembly line,” says Sarah Strohkorb Sebo, a Ph.D. candidate in computer science. “If it hands all the pieces to one person, it can create an awkward social environment in which the other workers question whether the robot believes they’re inferior at the task.”

This type of scenario may have seemed like science fiction 30 years ago, but as robot technology evolves, the range of roles robots can play in the community is expanding. As the hypothetical above illustrates, simple aptitude for a task should not automatically qualify a robot for that task; the social impacts of the robot's behavior must also be taken into account.

For robots to be accepted as team members in a community, their possible effects on the social fabric of that community must be well understood. In this context, an American team has recently examined the effect of robot behavior on human-to-human interactions in a controlled environment.

“We know that robots can influence the behavior of humans they interact with directly, but how robots affect the way humans engage with each other is less well understood,” says Margaret L. Traeger, a Ph.D. candidate in sociology.

In this study, published in PNAS, participants were split into groups of three, and each group was assigned a robot. The groups played a game in which they worked together to build the most efficient railroad route.

At the end of each round, every team's robot was programmed to do one of three things. The first type of robot stayed silent. The second made a neutral statement, such as reciting the score or the number of rounds completed. The third expressed vulnerability by telling a personal story, making a joke, or admitting a mistake.

Humans grouped with a vulnerable robot spoke with each other much more than people grouped with either of the other two types of robots. Those who played the game alongside the vulnerable, joke-telling robots also reported enjoying the game more.

The scientists also looked at more subtle aspects of group dynamics. They found that contributions to conversation were distributed unevenly among people grouped with a silent robot. In the two groups with talking robots, conversation was distributed much more equally between individuals.

These results clearly show that robot programming can have a significant impact on the way that humans work together.

“We are interested in how society will change as we add forms of artificial intelligence to our midst,” says Prof. Nicholas Christakis, the project leader. “As we create hybrid social systems of humans and machines, we need to evaluate how to program the robotic agents so that they do not corrode how we treat each other.”

Quotes adapted from the original press release provided by Yale University.
