[ 30/September/21 ]
Jennifer Armstrong accurately notes that survival of the fittest is not about being the “strongest”, but about “fit” in the sense of the fit of a jigsaw piece into the puzzle. The exact “shape” of the environment determines what is the best “fit” to that shape. Environments can vary significantly across quite small changes in some dimensions. So fitness is not a static thing, but something that varies constantly with time and circumstance, and is effectively averaged over time and space for most individuals that manage to survive.
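As a purely illustrative toy model (the strategy names and fitness numbers below are my own assumptions, not anything measured), here is a small Python sketch of that idea: “fitness” only has meaning relative to the current shape of the environment, and what matters for survival is the multiplicative effect of fitness averaged over changing circumstances, not the best single-moment score.

```python
import random

# Toy model (illustrative assumptions only): two strategies in an environment
# that flips between "wet" and "dry" seasons. Per-season fitness depends on
# how well each strategy "fits" the current season; long-run survival depends
# on the multiplicative effect of every season, averaged over time.
PER_SEASON_FITNESS = {
    "specialist": {"wet": 2.0, "dry": 0.3},   # superb fit to wet, poor fit to dry
    "generalist": {"wet": 1.2, "dry": 1.1},   # modest fit to both
}

def long_run_population(strategy, seasons, start=100.0):
    """Multiply out per-season fitness to get the population after all seasons."""
    pop = start
    for season in seasons:
        pop *= PER_SEASON_FITNESS[strategy][season]
    return pop

random.seed(1)
seasons = [random.choice(["wet", "dry"]) for _ in range(40)]

for strategy in PER_SEASON_FITNESS:
    print(strategy, round(long_run_population(strategy, seasons), 2))
# Typically the "generalist" persists while the "specialist" crashes, even
# though the specialist has the highest fitness in any single wet season:
# fitness is relative to the environment, averaged over time and circumstance.
```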
That actually gets deeply complex, very quickly, because there are always multiple levels present in complex organisms like us.
The issue of morals can be far deeper and more complex than anything Nietzsche ever dreamt of, and in some contexts the issues that he raised can be significant.
Every level of structure requires some sort of boundary constraints to sustain that level of structure. At higher levels of group structure, morality is usually some reasonable approximation to an optimal set of constraints for the history of that group.
Severe issues can arise when contexts are rapidly changing, and old moralities are no longer appropriate.
One of the hardest things for many to appreciate is that all new levels of complexity are built on new levels of cooperation. Often key aspects of morality are essential to maintaining the cooperation required for that new level of complexity.
Things start to get deeply complex when one looks at the sort of stability that can arise in complex societies, and the reasons for Dunbar’s Number (typically about 150, the size of social group that can maintain stable cooperation; it seems to be a function of human memory, both its capacity and its long term reliability). Modern digital long term memory can give us the capacity to develop reliable mechanisms to support cooperative complexity at the scale of any number of people the sun is capable of supporting in our solar system; but centralisation of those systems cannot ever be stable, as they are too prone to capture at a central point. Stability can only arise with massive sets of redundant systems, and everyone having secure individual copies of all data important to them. Even central blockchains are not actually a stable long term solution to this set of issues.
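To make the redundancy point concrete, here is a back-of-the-envelope sketch (the per-copy loss probability is an assumption of my own, purely illustrative): the chance of losing everything falls off very steeply as independent copies are added, whereas a single central store is both a single point of failure and a single point of capture.

```python
# Back-of-the-envelope sketch (toy numbers of my own, not from the post):
# probability that every copy of some important data is lost, comparing one
# central store against many independent, individually held copies.

def loss_probability(copies: int, p_loss_per_copy: float) -> float:
    """P(all copies lost), assuming each copy fails or is captured independently."""
    return p_loss_per_copy ** copies

p_loss = 0.05  # assumed chance that any single store is lost, captured, or corrupted

for copies in (1, 3, 10):
    print(f"{copies:2d} independent copies -> P(all lost) = {loss_probability(copies, p_loss):.2e}")

# 1 copy   -> 5.00e-02
# 3 copies -> 1.25e-04
# 10 copies-> 9.77e-14
# The catch: the copies must actually be independent. A centralised system
# (including a single authoritative blockchain) makes them dependent, so one
# successful capture loses or controls everything at once.
```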
Every level of freedom demands an appropriate level of responsibility if it is to survive long term.
Responsibility without freedom is slavery.
Freedom without responsibility eventually destroys the very foundations that made it possible, and so self-terminates.
All forms and levels of morality tend to be some approximation to an optimal set of solutions to the problems above in some set of contexts.
We are in a time of exponential change.
We are in a time of exponentially expanding complexity.
We either develop effective moralities appropriate to our contexts, or we perish.
Understanding that, in complex systems, it is cooperation, not competition, that is the fundamental driver of evolution is essential to survival. And that rapidly gets deeply complex at every level, as raw cooperation is always vulnerable to exploitation by cheating strategies, and requires evolving ecosystems of “cheating strategy” detection and removal systems.
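To make the “cheating strategy” point concrete, here is a minimal iterated Prisoner’s Dilemma sketch in Python; the payoff values and strategy names are my own illustrative assumptions, not anything from this post. Unconditional cooperation is fully exploited by an always-defect “cheat”, while a simple detect-and-respond strategy (tit-for-tat) caps the exploitation and still sustains mutual cooperation with other cooperators.

```python
# Minimal iterated Prisoner's Dilemma sketch (payoffs and strategies are
# illustrative assumptions): raw cooperation is exploitable by a cheating
# strategy, while a simple detection-and-response strategy limits the damage.

PAYOFF = {  # (my_move, their_move) -> my score
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def always_cooperate(opponent_history):   # "raw" cooperation, no cheat detection
    return "C"

def always_defect(opponent_history):      # the cheating strategy
    return "D"

def tit_for_tat(opponent_history):        # cooperate first, then mirror the opponent
    return opponent_history[-1] if opponent_history else "C"

def play(strategy_a, strategy_b, rounds=20):
    hist_a, hist_b = [], []                # each side sees the other's past moves
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strategy_a(hist_b), strategy_b(hist_a)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

print(play(always_cooperate, always_defect))  # (0, 100): pure cooperation fully exploited
print(play(tit_for_tat, always_defect))       # (19, 24): cheating detected, exploitation capped
print(play(tit_for_tat, tit_for_tat))         # (60, 60): stable mutual cooperation
```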
When you combine that with living in a reality containing multiple classes of fundamental uncertainty, and more than a few classes of fundamental unknowability, then it gets deeply messy and uncertain very quickly. And we are developing mathematical systems and processes that can deliver reliability even in the face of such uncertainty.
Morals basically became crucial as soon as we developed language, and in some senses, some aspects far predate language.
In an age of modern digital information processing systems, morality is fundamental to our survival.
With the technology available today, the probability of anyone surviving all-out competition is so close to zero that the difference isn’t worth worrying about.
Cooperation is the only survivable game in town.
Morality is fundamental to cooperation at higher orders.
I think the moment you offer more to one person than to another, or something akin to that, the thought of fairness enters. We’re born knowing right from wrong.
There is certainly some truth in that. That level of fairness is deeply built into our neural networks, and there are other levels that need to be learned.