How humanity will meet its end is an endless source of fascination in science fiction.
But scientists claim many of the scenarios depicted in films – such as an asteroid strike and killer robots – may not be as far-fetched as you might think.
Now, in a new article for Wired, researchers at Cambridge University’s Study of Existential Risk (CESR) have come up with a list of 10 threats that may some day trigger an apocalypse.
Humanity faces an uncertain future as technology learns to think for itself and adapt to its environment.
Artificial Intelligence, disguised as helpful digital assistants and self-driving vehicles, is gaining a foothold and it could one day spell the end for mankind if allowed to develop without strict controls.
Developments in digital personal assistants like Siri, Google Now and Cortana are just the beginning of such applications.
Nick Bostrom, an outside adviser to CESR, predicts that machines will attain 90 per cent of human-level intelligence by 2075, according to Wired.
The threat such machines would pose to the human race if they developed beyond our understanding and control has been compared to that of nuclear weapons.
World-renowned physicist Professor Stephen Hawking and SpaceX founder Elon Musk are part of a growing number of scientists and technology experts who have voiced their concerns in recent years.
Speaking in 2016, Professor Hawking said: ‘I don’t think advances in artificial intelligence will necessarily be benign.’
‘Once machines reach a critical stage of being able to evolve themselves we cannot predict whether their goals will be the same as ours.’
‘Artificial intelligence has the potential to evolve faster than the human race.’
CESR believes that artificial intelligence could create a multitude of risks that threaten human existence in the near future, and so places it as a very high priority.
Superbugs capable of everything from curing diseases to mopping up pollution could one day become reality, through synthetic biology.
Scientists working in the pursuit of new biological technologies alter existing micro-organisms at the genetic level, to better understand their function or to produce a desired result.
Their efforts have already led to the creation of an artificial lifeform in the lab, in March 2016.
But experiments with bio-engineering, or bio-hacking as it is sometimes known, could have unexpected and extremely dangerous results, and have been given a very high priority by CESR.
Because current experiments work with self-replicating micro-organisms, such as viruses, there is a high risk that any such material escaping the lab could cause a global pandemic.
The Zika virus hit headlines in 2016, over fears that an outbreak of the disease could become a pandemic affecting more than two-and-a-half billion people.
This is another high priority area of exploration for the CESR team.