
CMU Researchers Find Cooperation Takes a Backseat in Automated Game of Chicken

News

An experiment in which people played games of chicken with partially automated toy cars suggests that social norms, such as taking turns, may collapse as people delegate more decision-making to machines.

Hirokazu Shirado, an assistant professor in the School of Computer Science's Human-Computer Interaction Institute, said players who controlled self-steering cars became more self-centered and reckless as the game progressed and were less likely to communicate or coordinate with the other drivers. This effect continued even after the self-steering feature was deactivated.

"Once you lose such a social norm, it takes time to recover it," Shirado said.

Shirado notes that the social repercussions of automated systems are often overlooked. But the experiments he conducted with colleagues Shunichi Kasahara of Sony Computer Science Laboratories and Nicholas Christakis of Yale University suggest system designers would be well advised to consider these consequences.

The online experiment, discussed in "Emergence and Collapse of Reciprocity in Semi-Automatic Driving Coordination Experiments With Humans" and published this week in the Proceedings of the National Academy of Sciences, involved 300 people who remotely controlled toy cars in a game of chicken. Each player operated one of two cars headed in opposite directions on a single-lane road with the goal of reaching the end of the track as quickly as possible without colliding.

In some games, the cars had automated braking. Other games featured self-steering cars. In each case, the automated systems were designed to prevent head-on collisions. The auto-braking cars would stop before they collided, at which point each player had to steer their car around the other to complete the course. The auto-steering cars, by contrast, would automatically veer around each other.

The nature of the artificial intelligence assistance proved significant. Players with auto-braking cars tended to reduce speed, take turns and communicate with each other to avoid a wreck — normal social behavior. Those with auto-steering cars, by contrast, increased the speed of their cars and didn't communicate with their counterparts, relying on the automated steering to veer at the last second.

Shirado explained that auto-braking is a supportive technology, so the players retained more control over their vehicles. Auto-steering, however, serves as a substitute for human cognition. That ceding of control appears to contribute to the antisocial behaviors seen in the experiment.

The implications of the experiment extend well beyond self-driving cars. AI is being applied to a wide range of problems and tasks, from financial fraud detection and personalized medicine to pollution reduction and personnel decisions. As such, it would make sense for designers to consider whether their systems should let a machine make critical decisions or simply help humans make those decisions themselves.

Shirado said he would also like to explore whether social norms, such as reciprocity, can somehow be programmed into AI. But that could prove tricky. Humans are social animals and learn social norms by doing simple social things, such as going to church or playing on a baseball team.

"Machines don't work like that," Shirado said.

This research was sponsored by the Robert Wood Johnson Foundation and the NOMIS Foundation.


By Byron Spice
For More Information
Aaron Aupperlee | 412-268-9068 | aaupperlee@cmu.edu

Related People
Hirokazu Shirado