Study: Robots Capable Of Developing Prejudice On Their Own

by Ben Renner, March 16, 2019

CARDIFF, Wales — Embracing stereotypes or even forming a simple opinion about others may seem like a trait exclusive to humans, but a recent study shows that robots can develop prejudice and even discriminate in much the same ways people do.
You might think that’s because they’re programmed that way, but the research by computer science and psychology experts at Cardiff University shows that robots and machines using artificial intelligence are capable of generating prejudice on their own.
Joined by researchers from MIT, the Cardiff team explained this discriminatory behavior by suggesting that robots could identify, copy, and learn it from one another. Previous research has shown that computer algorithms can exhibit prejudiced behaviors and attitudes, such as racism and sexism, but in those cases researchers believe the algorithms learned them from public records and other data created by humans. The Cardiff and MIT researchers wanted to see whether AI could develop prejudicial groups on its own.
For the study, the researchers set up computer simulations of how prejudiced individuals form groups and interact with one another. They created a game of give and take in which each virtual agent decides whether to donate to an individual inside its own working group or to one in an outside group. The decisions were based on each individual's reputation and its donating strategy, including its level of prejudice toward individuals in outside groups.
As the game progresses, a supercomputer runs thousands of simulations, and each individual learns new strategies by copying others, either within its own group or across the entire population.
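To make that setup concrete, here is a minimal sketch of how a donation game of this kind might be simulated. It is only an illustration under assumed parameters: the group split, payoff values, reputation update, and copying rule below are placeholders, not the model the Cardiff and MIT team actually published.

# Illustrative donation-game simulation in the spirit of the study described above.
# Agent attributes, payoffs, and the copying rule are assumptions for illustration.
import random

N_AGENTS = 100              # virtual agents split into two groups
N_GENERATIONS = 500         # rounds of play followed by strategy copying
BENEFIT, COST = 1.0, 0.3    # gain to the recipient vs. cost to the donor


class Agent:
    def __init__(self, group):
        self.group = group
        self.reputation = 0.5                  # rises when the agent donates
        self.min_reputation = random.random()  # donate only to partners at least this reputable
        self.out_group_rate = random.random()  # low value = strong prejudice against outsiders
        self.payoff = 0.0


def play_round(agents):
    """Each agent meets a random partner and decides whether to donate."""
    for donor in agents:
        recipient = random.choice(agents)
        if recipient is donor:
            continue
        reputable = recipient.reputation >= donor.min_reputation
        same_group = donor.group == recipient.group
        tolerant = random.random() < donor.out_group_rate
        if reputable and (same_group or tolerant):
            donor.payoff -= COST
            recipient.payoff += BENEFIT
            donor.reputation = min(1.0, donor.reputation + 0.1)
        else:
            donor.reputation = max(0.0, donor.reputation - 0.1)


def copy_strategies(agents, mutation=0.05):
    """Agents imitate a better-scoring agent's strategy; this is the step
    through which prejudiced strategies can spread."""
    for agent in agents:
        model = random.choice(agents)
        if model.payoff > agent.payoff:
            agent.min_reputation = model.min_reputation
            agent.out_group_rate = model.out_group_rate
        if random.random() < mutation:        # occasional random exploration
            agent.out_group_rate = random.random()


def simulate():
    agents = [Agent(group=i % 2) for i in range(N_AGENTS)]
    for _ in range(N_GENERATIONS):
        play_round(agents)
        copy_strategies(agents)
        for a in agents:
            a.payoff = 0.0                    # reset scores each generation
    avg = sum(a.out_group_rate for a in agents) / N_AGENTS
    print(f"Average willingness to donate to out-group members: {avg:.2f}")


if __name__ == "__main__":
    simulate()

Run repeatedly, a loop like this is where the dynamic the researchers describe can emerge: if withholding donations from outsiders pays off, low out-group donation rates get copied from generation to generation and spread through the population.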
“By running these simulations thousands and thousands of times over, we begin to get an understanding of how prejudice evolves and the conditions that promote or impede it,” explains co-author Professor Roger Whitaker, from Cardiff’s Crime and Security Research Institute and the School of Computer Science and Informatics, in a release.
“Our simulations show that prejudice is a powerful force of nature and through evolution, it can easily become incentivized in virtual populations, to the detriment of wider connectivity with others. Protection from prejudicial groups can inadvertently lead to individuals forming further prejudicial groups, resulting in a fractured population. Such widespread prejudice is hard to reverse.”
The study was published in the journal Scientific Reports.
