If you have been paying attention to global news, you will know there is no shortage of potential threats that could theoretically wipe out humankind. A huge asteroid on a collision course with Earth? Computers and robots taking over the world, as in so many works of science fiction? These are just some of the scenarios that could pave the way for a catastrophic extinction event. So it is no surprise that there is a branch of academia dedicated to studying these scenarios in an effort to prevent them from happening in the first place.

We’re talking about the Future of Humanity Institute. In this article, you will learn about this research center and how it works to predict and prevent potential risks to humankind.

What is the Future of Humanity Institute?


The Future of Humanity Institute (FHI) is part of the Faculty of Philosophy and the Oxford Martin School at the University of Oxford. The institute was founded in 2005 by Prof. Nick Bostrom. Its main mission is to use the tools of philosophy, mathematics, science, and the social sciences to examine the big picture of human civilization. The institute believes that humanity has the potential to enjoy a long and flourishing future, but that numerous crucial considerations will shape that future. The Future of Humanity Institute's work is to shed light on these considerations.

The Founder of FHI


Nick Bostrom founded the FHI in 2005, at the age of 32, two years after coming to Oxford from Yale. If there is a poster boy for the threats against humankind, it has to be Bostrom. He tends to attract press attention because he writes a great deal about human extinction, so much so that his work has earned him a reputation as a secular Daniel, a doomsday prophet if you will.

With his growing audience, he finds himself giving keynote talks on extinction risks at global conferences. He has also become an advisor to the Centre for the Study of Existential Risk at Cambridge, working alongside renowned physicist Stephen Hawking.

The Main Threats to Humankind According to FHI


In an ever-changing world, all sorts of threats are rearing their ugly heads, and the FHI works to stay on top of them for the sake of humanity’s long-term future. Some of the most significant threats, according to the FHI, are:

Natural Extinction Risks

Nuclear Weapons

Superintelligence

Wrap Up

As the world progresses, there is no doubt that the threats to our very existence will only multiply. As you can see, artificial intelligence, rather than cosmic catastrophes or nuclear war, is the most likely threat: it is both one of humanity’s most important achievements and a daunting challenge. The Future of Humanity Institute is doing a great job of making sure that we, as a species, stay on top of these existential risks.
