Rating Existential Risk?

The Lifeboat Foundation is asking for help allocating a hypothetical $100 million to study its list of “top 10 existential risks”.

Their list:

Biological (viruses)…
Environmental (global warming)…
Extraterrestrial (invasion)…
Governments (abusive power)…
Nanotechnology (gray goo)…
Nuclear (holocaust)…
Simulation (shut down, if we live in one)…
Space threats (asteroids)…
Superintelligent AI (unfriendly)…
Other

An existential risk is the risk of an event that is both global in scope and terminal for humans on Earth.

Given that life WILL ultimately end on Earth at some point, I think the question is whether one of these events might occur in the next hundred years, and whether we should do anything about it now. Keep in mind that spending money on something doesn’t guarantee a solution. Also, even assuming a pessimistic single doubling of technological capability over the next hundred years, we’re much more likely to be able to do something about “it” then than we can now.

I see only one realistic, near-term existential risk on the list, and that’s biological (viruses). The paradoxical thing is that viruses alone don’t pose much of a risk, because if they did, life would have already ended on Earth. The existential risk from viruses comes from biological research and the chance that scientists might create a superbug that wouldn’t otherwise have existed. So to mitigate that risk we’d just have to stop all medical research… I don’t see anyone wanting to do that, do you?

So is our $100 million better spent now, or should we wait for better technology and more knowledge in the future?

I would pick “Other” and invest the $100 million, so I’d have plenty of money to work on these problems in the future, when we’d have a better chance of actually doing something about them.
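To make the compounding argument concrete, here is a minimal sketch in Python. The 3% real (inflation-adjusted) annual return and the 100-year horizon are illustrative assumptions of mine, not figures from the Lifeboat Foundation or the original post.

```python
def future_value(principal: float, annual_return: float, years: int) -> float:
    """Compound a lump sum once per year at a fixed real return."""
    return principal * (1 + annual_return) ** years

if __name__ == "__main__":
    endowment = 100_000_000   # the hypothetical $100 million
    real_return = 0.03        # assumed 3% real return per year (illustrative)
    horizon = 100             # assumed 100-year wait (illustrative)

    fv = future_value(endowment, real_return, horizon)
    print(f"${endowment:,.0f} invested for {horizon} years "
          f"at {real_return:.0%} real return grows to about ${fv:,.0f}")
    # Under these assumptions, roughly $1.9 billion in today's dollars.
```

Under those assumptions the endowment grows nearly twentyfold in real terms, which is the intuition behind waiting: the future gets both more money and better technology to spend it on.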
