Wednesday, September 04, 2019

Study: Robots Capable Of Developing Prejudice On Their Own

From Study Finds.org (March 16):

CARDIFF, Wales — Embracing stereotypes or even forming a simple opinion about others may seem like a trait exclusive to humans, but a recent study shows that robots can develop prejudice and even discriminate in similar ways to people, too.

You might think that’s because they’re programmed that way, but the research by computer science and psychology experts at Cardiff University shows that robots and machines using artificial intelligence are capable of generating prejudice on their own.

Joined by researchers from MIT, the Cardiff team explained this discriminatory behavior by suggesting robots could identify, copy, and learn this behavior from one another. Previous research has shown that computer algorithms have exhibited prejudiced behaviors and attitudes, such as racism and sexism, but researchers believe the algorithms learned it from public records and other data created by humans. The Cardiff and MIT researchers wanted to see if AI could evolve prejudicial groups on its own.

For the study, the researchers set up computer simulations of how prejudiced individuals can form a group and interact with each other. They created a game of give and take, in which each individual virtual agent makes a decision whether or not to donate to another individual inside their own working group or another group. The decisions were made based on each individual’s reputation and their donating strategy, including their levels of prejudice towards individuals in outside groups. [read more]
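To get a feel for what a "give and take" simulation like this might look like, here is a minimal sketch in Python. The Agent class, its prejudice attribute, the reputation counter, and the payoff bookkeeping are illustrative assumptions of mine, not the actual model from the Cardiff/MIT study (which also lets agents copy strategies from one another over many generations).

```python
import random

class Agent:
    """A toy agent in a donation game (illustrative, not the study's model)."""
    def __init__(self, group, prejudice):
        self.group = group            # which group this agent belongs to
        self.prejudice = prejudice    # chance of refusing to donate out-group
        self.reputation = 0           # grows each time the agent donates

    def decides_to_donate(self, other):
        """Always donate in-group; out-group donations depend on prejudice."""
        if other.group == self.group:
            return True
        return random.random() > self.prejudice


def run_round(agents):
    """Each agent meets a random partner and chooses whether to donate."""
    for agent in agents:
        partner = random.choice([a for a in agents if a is not agent])
        if agent.decides_to_donate(partner):
            agent.reputation += 1


if __name__ == "__main__":
    random.seed(0)
    # Two groups of five agents each, with random starting prejudice levels.
    agents = [Agent(group=g, prejudice=random.random())
              for g in ("A", "B") for _ in range(5)]
    for _ in range(100):
        run_round(agents)
    for a in agents:
        print(f"group {a.group}: prejudice={a.prejudice:.2f}, "
              f"reputation={a.reputation}")
```

In this rough version, agents with high prejudice end up with lower reputations because they donate less often; the real study goes further and has agents imitate the strategies of their most successful peers, which is how prejudiced groups can grow on their own.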

Interesting. So AI can be bigoted, racist, and sexist too? Hmmm. They'll probably end up prejudiced against humans, and then humanity will be screwed.

