[ 26/Jan/21 Facebook – Lifeboat Foundation]
The list given of “The Biggest Threats To Human Existence” seems to me to be a very strange mix of categories.
- Nuclear war
- Bioengineered pandemic
- Artificial Intelligence
- Super volcanoes
- Food supply
I would classify them very differently:
1/ The intentional use of technology by one group against others, resulting in the extinction of all; this could involve any set of technologies, including but not limited to: nuclear weapons, biotechnology, nanotechnology, AI, …
And ultimately the cause is not the tool, but the faulty set of assumptions used to generate a suicidal strategy (like nuclear war or any form of genocide).
The most common sets of such suicidal strategies are:
i/ The idea that evolution is all about competition. The reality is far more complex: every new level of complexity is in fact built upon a new level of cooperation. Naive cooperation is always vulnerable to exploitation and destruction by “cheating” strategies, and thus requires what become ecosystems of cheat-detection and mitigation strategies in order to survive. So it is much more accurate to say that at our level of complexity, evolution and survival are all about cooperation. And that is a very deep exploration of strategy under uncertainty.
ii/ The idea that markets are a reasonable measure of value. In times when most things were genuinely scarce, one could make a reasonable case that markets were a useful tool. But in an age of advanced automation, where the vast bulk of goods and services can in fact be manufactured and delivered by fully automated systems, markets fail completely, and actually incentivise multiple levels of existential-level risk (which can lead to both intentional and unintentional use of all the tools in 1/ above).
iii/ A failure to accept and respect that all knowledge and rules are necessarily simplifications of reality, and thus subject to failure modalities to which we are necessarily blind. While it is true that all levels of structure require boundaries, those boundaries need to be flexible and responsive to changes in context, or the entire structure becomes brittle and fails. No set of rules can ever be appropriate to all contexts, and if we are lucky then most of the ones we have should be reasonable approximations to some set of optima in most contexts.
iv/ A failure to accept and respect that freedom necessarily results in diversity. When that diversity extends through multiple levels of abstraction and understanding, it can be very difficult for many to accept the deep levels of uncertainty that necessarily result from such complexity.
And we need to accept that a defining feature of being human is the use of tools and technologies. That is in fact how we distinguish our human ancestors in the geological record.
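The cooperation point in i/ above is a standard result in iterated game theory, and can be made concrete with a toy simulation (the strategy names, payoffs and round counts below are illustrative assumptions, not anything from the original): a naive cooperator is ruthlessly exploited by a “cheating” strategy, while a minimal cheat-detection strategy such as tit-for-tat limits the damage to a single round.

```python
# Toy iterated prisoner's dilemma: naive cooperation is exploited by
# cheating, while simple cheat detection (tit-for-tat) contains it.
# Payoff values are the conventional illustrative ones.

PAYOFF = {  # (my move, their move) -> my score; C = cooperate, D = defect
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def always_cooperate(opponent_history):
    return "C"

def always_defect(opponent_history):
    return "D"

def tit_for_tat(opponent_history):
    # Cooperate first, then mirror the opponent's previous move.
    return opponent_history[-1] if opponent_history else "C"

def play(a, b, rounds=100):
    """Return (score_a, score_b) over repeated rounds of the game."""
    hist_a, hist_b = [], []  # each side's record of the opponent's moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = a(hist_a), b(hist_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

print(play(always_cooperate, always_defect))  # (0, 500): total exploitation
print(play(tit_for_tat, always_defect))       # (99, 104): cheat contained
```

The cheater still comes out slightly ahead of tit-for-tat, which is why the text speaks of whole *ecosystems* of detection and mitigation strategies rather than any single counter-strategy.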
2/ Earth based threats to aerobic life.
a/ Super-volcanoes (the explosive types like Yellowstone or Toba)
b/ Flood basalts – like the Deccan Traps
c/ Extreme climate variation
d/ Anaerobic ocean overturn
There may potentially be others, but they are very low probability. For this set at least we have evidence of their having happened in the past, and of their having caused species-level extinctions as a result.
3/ Events sourced outside the earth but within the solar system:
a/ Extreme solar activity
4/ Events sourced outside the solar system:
i) energy from a supernova in close proximity
ii) other high-energy pulses, like a pulsar pointed our way
iii) collision with a mass of extra-solar origin – meteor-like, comet-like, dark star, black hole, etc
iv) exotic matter of extra solar origin
v) other dangers of unknown type (we don’t know what we don’t know).
vi) exotic aggressive life forms – put here simply for completeness, as it seems highly improbable that any aggressive life form could survive long enough to reach a level of complexity capable of crossing interstellar distances.
We can develop mitigation measures for all the known types of risk; and survival of the unknown can only be a matter of probabilities, which are enhanced by cooperation, diversity, self-sufficiency and massive redundancy.
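The claim that massive redundancy enhances survival probability has a simple arithmetic core, sketched below (the specific probabilities and counts are illustrative assumptions only): if each of n independent refuges survives an event with probability p, then at least one survives with probability 1 − (1 − p)^n, which approaches certainty as n grows.

```python
# Back-of-the-envelope redundancy arithmetic: the chance that at least
# one of n independent systems survives, if each survives with
# probability p. Numbers chosen purely for illustration.

def survival_probability(p: float, n: int) -> float:
    """P(at least one of n independent refuges survives) = 1 - (1 - p)^n."""
    return 1 - (1 - p) ** n

# Even individually fragile refuges become collectively robust:
for n in (1, 5, 20):
    print(n, round(survival_probability(0.5, n), 6))
# 1 -> 0.5
# 5 -> 0.96875
# 20 -> 0.999999
```

The independence assumption is doing real work here, which is exactly why the text pairs redundancy with diversity and self-sufficiency: refuges that share a single failure mode are not independent, and the formula no longer applies.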
The core takeaway message from this is that any realistic probability of survival demands cooperation between multiple levels of diverse agents, and such cooperation demands that all levels of agents have what they consider to be reasonable levels of freedom and security. That is anathema to market-based thinking.
The age of markets must end.
The insanity of strategies that promote global interdependence of aggressive systems is neither stable nor survivable long term. Something else entirely is required. That entire game-theoretic structure is founded on overly simplistic sets of assumptions about the nature of the reality in which we exist.
The age of freedom, responsibility, abundance and diversity must begin – if we are to survive.
And there is no denying responsibility. At every level, freedom without responsibility is necessarily self terminating – the logic of that is inescapable.
[followed by Daryl Tempesta replied – The idea of evolution is about suitability in a changing environment.]
Evolution is in a sense simply about survival.
The sorts of strategic complexes that survive in different sorts of environments can be very different.
Every form of life present on earth today seems to have been evolving for an equal length of time, but some of the contexts have favoured relatively simple strategies (like bacteria, archaea, viruses) and some have allowed for greater complexity to emerge.
Some environments change very little.
Some change a lot.
Complex adaptive life like us is capable of both changing environments quickly and adapting to such change. But it is not a given that any particular group of individuals will in fact change in ways that are survivable. We have quite a bit of evidence to suggest that such survival is actually quite rare.
[followed by Daryl Tempesta “Age of commerce will never end. Supply and demand is a difference machine embedded into human nature.”…]
Arguably the prime need of every human being is oxygen.
We get all we need by breathing.
There is no need of a market for oxygen in the air.
The context is such that an abundance is present when needed for most people in most contexts.
We are capable of creating automated systems that deliver a similar abundance of most things required for human beings to have reasonable degrees of security and freedom (and both of those notions must eternally contain uncertainties and unknowns).
Our brains are strongly biased to notice threats, and to ignore most other things.
Thus we tend to ignore the abundances actually present, and focus on scarcity. There are good historical reasons for that, but in an age of automation, things change – fundamentally.