From TheGuardian.com (Sept. 15):
Engineer who quit over military drone project warns AI might also accidentally start a war.
A new generation of autonomous weapons or “killer robots” could accidentally start a war or cause mass atrocities, a former top Google software engineer has warned.
Laura Nolan, who resigned from Google last year in protest at being sent to work on a project to dramatically enhance US military drone technology, has called for all AI killing machines not operated by humans to be banned.
Nolan said killer robots not guided by human remote control should be outlawed by the same type of international treaty that bans chemical weapons.
Unlike drones, which are controlled by military teams often thousands of miles away from where the flying weapon is being deployed, Nolan said killer robots have the potential to do “calamitous things that they were not originally programmed for”. [read more]
I don’t think Russia or China cares about this engineer’s concerns. They’ll build these weapons regardless of anybody’s fears.