How humans will lose control of artificial intelligence

This is the way the world ends: not with a bang, but with a paper clip. In this scenario, the designers of the world's first artificial superintelligence need a way to test their creation. So they program it to do something simple and non-threatening: make paper clips. They set it in motion and wait for the results — not knowing they've already doomed us all.
