Technology and humanity

If technological growth allows no room for nature to recuperate, what are the effects on the human race?

[ 12/Mar/21 ]

Mohammad Gani {in his answer} shows clearly why economics and finance are now the single greatest threat to human survival: the threat stems from a failure to understand the complexity and inter-relatedness of living systems.

If one is looking at having a very small impact on systems, then it makes sense to ignore, and devalue to zero, those things that are abundant; which is what markets do in practice. Using market measures of value tends to focus attention on the things that are scarce, and to incentivise increasing their relative abundance (up to a point).

There are two very different and existential risks involved in that.

1/ Because markets value anything abundant at zero, effects on abundant things tend to be ignored. But abundance does not make a thing unimportant. Oxygen in the air is the prime example (there are actually many others). Oxygen is arguably the most important thing for any human being: deprive us of it for just a few minutes and our systems degrade beyond the ability to reboot – we die. Because of the very long lag times between the systems that produce oxygen and those that consume it, it is entirely possible to push the oxygen-producing systems into a decay state that leads to our extinction before we can engineer a recovery. A pure focus on market values as indicators is insufficient to guard against this failure modality; a deeper and more integrated systems understanding is required.

Everyone needs to become conscious that throwing unwanted stuff into water bodies might have been ok when there were few of us throwing very little, but now there are a lot of us throwing a lot of stuff, and we have overwhelmed the ability of the natural systems to deal with it. We still have time to fix that, because there is a reserve of oxygen in the atmosphere; but we need to fix our systems long before there is any noticeable lack of oxygen. If we wait for a lack to show up, it will be too late.
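The lag-time point can be made concrete with a toy simulation. All numbers here are invented for illustration (they are not real atmospheric figures): a large reservoir masks the slow decay of the producing system, so by the time the reservoir shows a noticeable drop, the producers have already lost a significant fraction of their capacity.

```python
# Toy model with hypothetical parameters: a large oxygen "reservoir" is
# drawn down while the producing system silently decays. The visible
# signal (a noticeable drop in the reservoir) arrives long after the
# producers have degraded.

def simulate(years=500, reservoir=100.0, production=1.0, consumption=1.0,
             producer_decay=0.005):
    """Crude year-by-year balance. Returns the year the reservoir drop
    first becomes 'noticeable' (>1% below the start) and the remaining
    producer capacity at that time."""
    producer_capacity = 1.0   # fraction of original production still working
    start = reservoir
    for year in range(years):
        producer_capacity *= (1.0 - producer_decay)   # slow, silent decay
        reservoir += production * producer_capacity - consumption
        if reservoir < start * 0.99:                  # first visible shortfall
            return year, producer_capacity
    return None, producer_capacity

year, capacity = simulate()
print(year, round(capacity, 2))
```

In this sketch the producers have already lost around ten percent of their capacity before the reservoir drop is even noticeable; the shape of the result, not the particular numbers, is the point.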

We do actually need to start consciously designing all systems to fully recycle all substances. That means some fundamental changes to how we do some things (like stopping the use of antifoul paints on boats, which work by releasing toxins into the local environment).

2/ The fact that markets require scarcity to deliver value was not a major problem when most things were in fact genuinely scarce, and most things required human labour for production. The context has now fundamentally changed.

We now have the ability to fully automate systems, and to fully meet all reasonable demand. Under such conditions, as Prof Gani correctly identifies, economic value drops to zero. This is the major social issue of our time.

We now have the technological ability to meet the reasonable demands of all people on the planet for all of the essentials of life, and most of the optional things they want, using fully automated systems; but doing so necessarily drives the market value of such outputs to zero. This generates an economic meta-incentive to retain poverty for the masses. While this may seem sensible to many in the economics profession, to most of the rest of us it is clearly immoral and unacceptable (in game-theoretic terms, it is a cheating strategy on the high-level cooperative that is humanity).

To get a feel for that game-theoretic understanding, one needs to appreciate a fact about the evolution of complexity that is not well understood by most, but needs to be: every new level of complexity in evolved systems is based upon a new level of cooperation, and every level of cooperation requires attendant strategies to detect and remove cheating strategies if the cooperative is to survive long term. The details of the sorts of environments in which such things can evolve are deeply complex, but in essence come down to contexts where the threats from sources outside the population are greater than the competitive threats from within it.
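The claim that cooperation needs cheat-detection to survive can be sketched with a minimal replicator-dynamics simulation (all payoff numbers are invented for illustration; this is not a model of any real economy): cooperators pay a cost to create a shared benefit, cheats consume it without contributing, and cooperation only persists when some mechanism detects and excludes cheats.

```python
def share_of_cheats(detect_prob, generations=200, cheat0=0.1):
    """Crude replicator dynamics with illustrative numbers: cooperators
    pay cost 1 to create a shared benefit of 2 each; cheats consume the
    pool without contributing. With probability detect_prob a cheat is
    detected and excluded from the shared pool that generation.
    Returns the final fraction of cheats in the population."""
    cheats = cheat0
    for _ in range(generations):
        coop = 1.0 - cheats
        pool = 2.0 * coop                        # benefit created by cooperators
        eaters = coop + cheats * (1.0 - detect_prob)
        share = pool / eaters if eaters > 0 else 0.0
        fit_coop = 2.0 + share - 1.0             # baseline + benefit - cost
        fit_cheat = 2.0 + share * (1.0 - detect_prob)
        total = coop * fit_coop + cheats * fit_cheat
        cheats = cheats * fit_cheat / total      # replicator update
    return cheats

print(round(share_of_cheats(0.0), 3))   # no cheat detection: cheats take over
print(round(share_of_cheats(0.8), 3))   # strong detection: cheats die out
```

With no detection the cheats overrun the population from a 10% start; with strong detection the same starting population converges to near-universal cooperation.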

We do now seem to be in such a context, facing overwhelming long-term threats from external factors that require cooperation to survive. That means real cooperation between international-level agents: agents that respect the rights of diverse sets of self-aware agents to survive, and to have such degrees of freedom as they can responsibly exercise. Responsibility at this level means demonstrating awareness of the many levels of boundaries necessary for the continued existence of the many levels of cooperative agency present. Competition without a cooperative base is always an existential-level threat to agents.

So, as Daniel Liberman noted, technology in and of itself is not the issue.

It is what we do with technology that matters.

Any form of liberty without appropriate levels of responsibility poses an existential-level threat.

Liberty is an essential part of humanity and creativity.

Any real expression of liberty results in diversity.

All such diversity must be accepted and respected.

It is clear that evolution has (for very good evolutionary reasons) equipped us with brains that subconsciously simplify our perceptions, in ways that were close enough to reality to allow our ancestors to survive, and that allow us to build simplistic understandings of the complexity that we are and within which we exist. Simple models have the utility of allowing us to respond quickly in emergencies; when under attack by predators, being the last one to move is strongly selected against. So there has been a lot of selection pressure against the use of complex models and understandings. But simple models can fail in many different ways when contexts change in ways that are not immediately obvious.

When faced with a reality that has multiple levels of self-aware agency, all with differing sets of limits required in their definitions of “responsibility”, and faced with a reality that has multiple levels of uncertainty and unknowability, we all need to accept that mistakes will happen, and that sometimes the uncertainty will be so great that we need multiple sets of “safe to fail” experiments proceeding simultaneously. In thinking about such things, it pays to consider one of the proofs from database theory: for the fully loaded processor, the most efficient search possible is the fully random search. Meaning that sometimes the implicit limits of “expert knowledge” can be more hindrance than help. Some fraction of all such searches needs to be allocated to those outside the domain of experts, and random is as good a mechanism as any for doing so.
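This is not the database-theory proof itself, just a toy illustration of the surrounding point (cell counts and distributions are made up): a fixed “expert” search order is only as good as its prior, and when the context shifts so that the prior is wrong, uniform random search, whose expected cost does not depend on where the target is, comes out ahead.

```python
import random

random.seed(0)  # deterministic run for the illustration

def probes(order, target):
    """Number of probes until the search order hits the target."""
    return order.index(target) + 1

def average_probes(n_cells=100, trials=2000):
    """Compare a fixed 'expert' search order against random search when
    the target is actually drawn from the end of the space the expert's
    prior ranks last (i.e. the context has changed and the prior is wrong)."""
    expert_order = list(range(n_cells))          # expert checks 0, 1, 2, ... first
    exp_total = rnd_total = 0
    for _ in range(trials):
        # targets cluster where the expert least expects them
        target = random.randint(n_cells // 2, n_cells - 1)
        exp_total += probes(expert_order, target)
        rnd_order = random.sample(range(n_cells), n_cells)  # random permutation
        rnd_total += probes(rnd_order, target)
    return exp_total / trials, rnd_total / trials

expert_avg, random_avg = average_probes()
print(expert_avg, random_avg)
```

In this setup the random searcher averages roughly half the space regardless of where the targets are, while the confidently wrong expert averages substantially more; allocating some fraction of searches outside the expert prior hedges exactly this failure.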

And how one answers the question depends very much on how one interprets it. The answer above includes the (perhaps) implied assumption of economic growth.

If we go beyond economics, and accept that all finite systems have necessary sets of boundaries, then we can continue to explore infinite spaces (such as the space of all possible ideas, and of all possible technologies) even as we accept that there are finite limits to the expression of any particular technology in any particular context. We can continue to grow our knowledge, our options and our creativity, provided that we take responsibility for all the levels and instances of real limits that do actually exist.

About Ted Howard NZ

Seems like I might be a cancer survivor. Thinking about the systemic incentives within the world we find ourselves in, and how we might adjust them to provide an environment that supports everyone (no exceptions) - see
This entry was posted in economics, Nature, Technology, understanding.

Comment and critique welcome
