Uh-oh! Stephen Hawking says AI could be humanity's 'worst mistake in history'

Johnny Depp’s latest flick, Transcendence, might’ve been a dud at the box office, but the artificial-intelligence-based plot did get the planet’s resident genius thinking. Turns out we could be doomed.

World-renowned scientist Stephen Hawking wrote an intriguing op-ed for The Independent looking at the potential future of A.I., and according to Hawking, we could be in the process of building our future robo-overlords. Apparently Battlestar Galactica, The Matrix, Terminator et al. got it right after all.

Here’s an excerpt from the piece, where Hawking warns we should be very careful about what we build, and how we build it:

“One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand. Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.

So, facing possible futures of incalculable benefits and risks, the experts are surely doing everything possible to ensure the best outcome, right? Wrong. If a superior alien civilization sent us a message saying, "We'll arrive in a few decades," would we just reply, "OK, call us when you get here – we'll leave the lights on"? Probably not – but this is more or less what is happening with AI.

Although we are facing potentially the best or worst thing to happen to humanity in history, little serious research is devoted to these issues outside non-profit institutes such as the Cambridge Centre for the Study of Existential Risk, the Future of Humanity Institute, the Machine Intelligence Research Institute, and the Future of Life Institute. All of us should ask ourselves what we can do now to improve the chances of reaping the benefits and avoiding the risks.”

In the full article, Hawking notes A.I. could be our "worst mistake in history," and since science fiction has shown us time and time again how that could play out, it's a warning we might want to heed.

What do you think? Are we building our future rulers? Do you, for one, welcome our robot overlords?

(Via The Independent)