[22/July/21 – “…I think that we will only reach reflexive collective intelligence by adopting a language adequate to the organization of the digital memory.”]
Kind of, and not really.
The real issues seem to be far more complex.
It is partly about understanding, about models, about analogies; and partly, much deeper, about understanding evolution and the emergence of intelligence.
It is a topic that has fascinated me for almost 60 years, and my autistic mathematical brain has been exploring and abstracting concepts at multiple levels of this topic for most of that time. So my understanding is not something that can be communicated in any reasonable time, but there are some analogies that could serve as useful pointers for someone interested in undertaking a similar sort of journey.
Understanding something about the complexity of evolution is fundamental to gaining any sort of real understanding of language and intelligence.
Understanding something of the fundamental uncertainty present in quantum mechanics is one of several ideas that are fundamental to building a useful understanding of evolution.
One needs to be able to appreciate how large collections of things that are fundamentally uncertain, but are constrained by probability relationships, can, in aggregate, very closely approximate classical causality.
That idea is essential.
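A toy sketch of that idea (my own illustration, not from the text): each fair coin flip is maximally uncertain, yet the mean of many flips is constrained by probability to sit ever closer to 0.5 – aggregates of uncertain events approximating classical determinism.

```python
import random

# Each individual event is maximally uncertain (a fair coin flip), yet the
# aggregate of many such events is constrained by probability relationships
# to hover ever closer to 0.5 as the number of events grows.
def aggregate_mean(n_events, rng):
    return sum(rng.random() < 0.5 for _ in range(n_events)) / n_events

rng = random.Random(42)
for n in (10, 1_000, 100_000):
    deviation = abs(aggregate_mean(n, rng) - 0.5)
    # The deviation from 0.5 shrinks roughly like 1/sqrt(n).
    print(n, round(deviation, 4))
```

The larger the aggregate, the closer its behavior comes to classical, effectively causal regularity, even though no single underlying event is determined.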
Every level of structure requires constraints.
Anything without constraint is by definition purely random, and no structure can be maintained in such a system.
So for any level of structure to emerge from any class of system, there must be some degree of reliable constraints within that system.
The hard thing for many will be accepting that such constraints do not need to take the hard classical form of the binaries our brains are heavily biased to find so attractive, like True/False, Right/Wrong, Good/Evil. And this is a complex story, with many levels and layers that weave together.
The conception of evolution that seems to be most common at present (if such a notion has any real utility), is that evolution is all about competition.
That is actually fundamentally wrong, but it aligns with ideas of capitalism and markets, so it has been strongly selected for at certain levels of social systems (and again, this is a deeply complex notion, far deeper than the simplistic first-order interpretation, though that interpretation does have a certain utility and is a useful approximation in some contexts).
If one looks deeply at evolution, it is much more accurate to say that the emergence of complexity is much more about cooperation than it is about competition.
Every new level of complexity is actually predicated on a new level of cooperation; and at every level cooperation is vulnerable to exploitation and destruction by cheating strategies. So at every level there must emerge sets of cheat detection and removal strategies, and these rapidly evolve into strategic ecosystems.
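The dynamic above can be sketched as a toy iterated game (the payoffs and strategy names here are my own illustrative choices, not the author's): unconditional cooperators are freely exploited by a cheating strategy, while a simple cheat-detection rule – withdraw cooperation from anyone who has defected on you – removes the cheat's advantage.

```python
# Standard prisoner's-dilemma payoffs: (my score, partner's score).
COOPERATE, DEFECT = "C", "D"
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(strategy_a, strategy_b, rounds=10):
    """Run an iterated game; each strategy sees the partner's past moves."""
    score_a = score_b = 0
    hist_a, hist_b = [], []
    for _ in range(rounds):
        a = strategy_a(hist_b)
        b = strategy_b(hist_a)
        pa, pb = PAYOFF[(a, b)]
        score_a += pa
        score_b += pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

always_cooperate = lambda partner_history: COOPERATE
always_cheat = lambda partner_history: DEFECT
# Cheat detection and removal: cooperate until the partner defects, then
# refuse all further cooperation with them.
grudger = lambda partner_history: (DEFECT if DEFECT in partner_history
                                   else COOPERATE)

print(play(always_cooperate, always_cheat))  # (0, 50): the cheat exploits freely
print(play(grudger, always_cheat))           # (9, 14): the cheat is detected and cut off
```

The cheat still gains once against the detector, which is why, as the text notes, such detection strategies keep evolving into whole strategic ecosystems rather than settling on one fixed rule.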
We are, each and every one of us, deeply nested stacks of levels of such cooperative systems, every level with their own specific strategic ecosystem of cheat detection and removal systems.
At higher levels we tend to call such systems ethics or morality; and they can recurse far beyond any classical interpretation of such things.
So intelligence is by definition a deeply nested stack of systems selected over some deep time by multiple levels of context.
At another level, one can think about evolution as the embodiment of “search”.
One can think of evolution as sets of systems searching the space of all possible systems for those that can actually survive in the real contexts present.
Search is a deeply complex subject. At higher levels of mathematics and logic (there seems to be an infinite class of possible logics, and the classical binary logic of True/False is but the simplest of them), it can be shown that for a fully loaded processor, the most efficient possible search is fully random search.
What that means is that all forms of indexing and ordering of the available information, and of the available sets of abstractions and structure, take more processor cycles to build and maintain than a purely random search of the available data sets does (and that can be a deeply abstract concept).
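A toy illustration of that cost accounting (my own construction, a sketch rather than a proof of the general claim): for a single one-off lookup in unordered data, building a sorted index first costs far more comparisons than a random probe sequence typically needs.

```python
import random

def random_search(items, target, rng):
    """Probe items in random order; return the number of probes used."""
    order = list(range(len(items)))
    rng.shuffle(order)
    for probes, i in enumerate(order, start=1):
        if items[i] == target:
            return probes
    return len(items)

class CountingKey:
    """Wrap a value so we can count the comparisons made while sorting."""
    comparisons = 0
    def __init__(self, v):
        self.v = v
    def __lt__(self, other):
        CountingKey.comparisons += 1
        return self.v < other.v

rng = random.Random(1)
data = list(range(10_000))
rng.shuffle(data)

probes = random_search(data, target=1234, rng=rng)
sorted(data, key=CountingKey)  # the one-time cost of building an ordered index

# On average the random search needs about N/2 probes, while building the
# index costs on the order of N*log(N) comparisons before any query is served.
print(probes, CountingKey.comparisons)
```

The index only pays off if it is reused across many queries; when the processor is already fully loaded and every cycle spent on bookkeeping is a cycle not spent searching, the unstructured probe can come out ahead, which is the trade-off the text is pointing at.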
For those people whose brains have been heavily biased by the process of evolution to prefer simplicity, structure, and order, it can be deeply disturbing to encounter the notion that random search (search without any form of order or structure) is actually the most efficient. Many brains find it very difficult to instantiate the higher-order abstractions embodying this idea.
And it does seem to be real; true, in as much as the classical notion of “True” is a reasonable approximation to whatever reality actually is in some particular set of contexts.
This is probably as much as most minds are capable of encountering in a single post – far too much for many; yet it is only the tiny tip of a vast and deep and fundamentally uncertain systemic landscape that seems to be fundamental to whatever intelligence actually is.