New Cambridge Research Center Aims To Save Us From Robopocalypse


The threat that machines pose to mankind is no longer merely the stuff of science fiction movies and paranoid quacks on street corners: scientists are starting to take it seriously, specifically the machines that will be invented within the next 100 years or so.

At the University of Cambridge in England, a newly founded research group dubbed the Centre for the Study of Existential Risk (CSER) aims to identify, and curb, specific developments in science and technology that could endanger human civilization as we know it.

"Many scientists are concerned that developments in human technology may soon pose new, extinction-level risks to our species as a whole," reads the deadpan and terrifying statement of purpose on CSER's website. "Such dangers have been suggested from progress in AI, from developments in biotechnology and artificial life, from nanotechnology, and from possible extreme effects of anthropogenic climate change."

The center was jointly founded by the astrophysicist Martin Rees, the philosopher Huw Price and the software magnate Jaan Tallinn, a co-founder of Skype. Of particular concern to the trio are the prospects of artificial general intelligence (AGI) and the point when computers are developed that surpass the intelligence of the human brain, a phenomenon sometimes referred to as the Singularity.

“Think how it might be to compete for resources with the dominant species,” said Price in an interview yesterday with The Register, referring to superintelligent computers. “Take gorillas, for example: the reason they are going extinct is not because humans are actively hostile towards them, but because we control the environments in ways that suit us, but are detrimental to their survival.”

“Nature didn’t anticipate us, and we in our turn shouldn’t take artificial general intelligence for granted," he continued. "We need to take seriously the possibility that there might be a ‘Pandora’s box’ moment with AGI that, if missed, could be disastrous."

So. According to the fourth oldest university in the world: Pandora's Box and insidious robots are Things About Which You Can Officially Start Worrying. #2012

[via SAI]

Tags: center-for-study-of-existential-risk, artificial-intelligence, robopocalypse