Without doubt there are existential risks from things like super-volcanoes, comets, meteors, supernovas and other such geological and astronomical phenomena.
Without doubt there are also existential risks from various biological and technological scenarios.
On the subject of AI, the idea that control is possible is nonsense; we would simply need to accept our reliance on its much greater awareness, and on its own long-term best interests.
However, all of those risks pale in comparison to the existential risks of the incentive structures within our dominant paradigms of thought – in particular money.
Most human beings, and most institutions, give money a high weighting when deciding how to act. This is a serious risk to our survival on several levels.
Money is not a measure of human value. It is perhaps best thought of as the product of “human value” and “perceived scarcity”.
As such, using money as a substitute for human value in decision making has some very perverse outcomes.
Humans value abundance, and our survival at many levels relies on redundancy of systems.
Money values abundance at zero, and thus attempts to eliminate redundancy wherever it occurs.
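The framing above can be sketched as a toy formula (the notation here is illustrative, not the author's):

```latex
% A simplified sketch: p = monetary price, v = human value, s = perceived scarcity
p \approx v \cdot s
% Abundance means scarcity approaches zero, so price approaches zero
% no matter how large the human value:
\lim_{s \to 0} v \cdot s = 0
```

This is why air, the most valuable thing to any human, carries a price of zero: its perceived scarcity is zero, so markets assign it no worth at all.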
When money is used simply as a tool to lubricate the exchange of goods and services, it has great utility to humanity.
When the pursuit of money, in and of itself, is allowed to dominate our decision-making processes, our entire system is at grave systemic risk. It becomes too focused on short-term efficiency, at the cost of both the human values of the majority and the redundancy needed for the infrequent, high-impact events that, over time and space, must happen sooner or later (and if Murphy is in the house, it is likely to be just too soon for many).