[ 17/8/20 ]
Tony Castaldo has part of the answer – the profit motive.
Thomas Ruddick has a part of it in that change is often about finding easier ways of doing things, for all sorts of motives, and often from ignorance and necessity.
And there are other important aspects of this very complex evolving system:
1/ The complexity of reality. It is now proven beyond any shadow of reasonable doubt that reality is not only complex beyond the ability of any human to deal with in detail, it also contains fundamental uncertainties that put it beyond the capacity of any computational entity to deal with in detail with 100% accuracy. In some contexts it is possible to achieve very high levels of accuracy (over 99.999999%), but never in all contexts.
2/ Our experience is of our own personal subconsciously generated model of reality, not reality itself. Thus we are all subject to multiple levels of genetic and cultural bias in how our individual experience of being is assembled. The more work we do to understand those biases the greater degrees of freedom we can achieve, but if we fail to do that work, then such biases are available for exploitation.
3/ Evolution selects mostly on speed and accuracy of response to danger; if the cost of false positives is low enough, it can tolerate a lot of them, because missing a real danger carries a very high cost. Thus we all have a great many tendencies to see dangers that are not necessarily real. Those tendencies can be exploited by AI systems.
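The cost asymmetry behind that point can be made concrete with a toy expected-cost calculation. This is purely an illustration with made-up numbers, not anything from evolutionary biology literature: it shows why a "jumpy" detector that raises many false alarms can still beat a more "accurate" one when a miss is catastrophic.

```python
# Toy illustration: why selection favours jumpy danger detectors when
# missing a real danger costs far more than a false alarm.
# All numbers are invented purely to show the asymmetry.

def expected_cost(p_danger, p_miss, p_false_alarm,
                  cost_miss, cost_false_alarm):
    """Expected cost per encounter for a detector with given error rates."""
    return (p_danger * p_miss * cost_miss
            + (1 - p_danger) * p_false_alarm * cost_false_alarm)

P_DANGER = 0.01            # real dangers are rare
COST_MISS = 1000.0         # missing a real danger is catastrophic
COST_FALSE_ALARM = 1.0     # flinching at a shadow is cheap

# A jumpy detector: many false alarms, almost no misses.
jumpy = expected_cost(P_DANGER, p_miss=0.01, p_false_alarm=0.30,
                      cost_miss=COST_MISS, cost_false_alarm=COST_FALSE_ALARM)

# A "more accurate" detector: few false alarms, but misses more real dangers.
calm = expected_cost(P_DANGER, p_miss=0.20, p_false_alarm=0.01,
                     cost_miss=COST_MISS, cost_false_alarm=COST_FALSE_ALARM)

print(f"jumpy detector expected cost: {jumpy:.3f}")
print(f"calm detector expected cost:  {calm:.3f}")
# The jumpy detector wins despite being "wrong" far more often per encounter.
```

With these (arbitrary) numbers the jumpy detector's expected cost is roughly 0.4 per encounter versus about 2.0 for the calm one, which is the sense in which "seeing dangers that are not necessarily real" is a feature selection keeps, and one that feed-optimising systems can then exploit.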
4/ Any value measured in a market has a component of scarcity built in. Things universally abundant, even if vitally important, have zero market value – like air. The flip side of that is that markets are incapable of meeting the real needs of everyone, purely from their own internal incentive structures.
5/ The use of money distances us from reality. We tend to see the symbol (money) as having value, rather than the things it symbolises. We think we need money to survive, but what we really need is environments, ecosystems, goods, services and network connections. Money only works in so far as all those other things actually exist. A failure to appreciate that can lead to a focus on money, rather than a focus on goods and services in reality, which leads to systemic failure.
6/ Exploitation of unconscious bias for profit by AI systems. This happens because there exist enormous datasets of what actually captures the attention of specific individuals, and the AI systems can use those to optimise the feeds into our networks of whatever it is that holds our attention. They do this for the simple reason that doing so increases advertising revenue. They do it by exploiting multiple levels of unconscious bias.
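The mechanism described here can be sketched in a few lines. This is a hypothetical toy, not any real platform's code: given per-user engagement rates for content categories (the "enormous datasets"), the ranker simply orders candidate items by predicted engagement, which is exactly what ends up amplifying whatever bias a user already has.

```python
# Hypothetical sketch of an attention-maximising feed ranker.
# The profile data, categories, and item ids are invented for illustration.

from typing import Dict, List, Tuple

def rank_feed(user_engagement: Dict[str, float],
              candidates: List[Tuple[str, str]]) -> List[str]:
    """Order candidate (item_id, category) pairs by the user's historical
    engagement rate with each category, highest first."""
    return [item_id for item_id, _ in
            sorted(candidates,
                   key=lambda c: user_engagement.get(c[1], 0.0),
                   reverse=True)]

# Engagement rates "learned" from one user's past behaviour (made up).
profile = {"outrage": 0.9, "cute_animals": 0.4, "long_form_news": 0.1}

items = [("a1", "long_form_news"), ("a2", "outrage"), ("a3", "cute_animals")]

print(rank_feed(profile, items))  # outrage content floats to the top
```

Nothing in this loop needs to "understand" the user; ranking by past engagement alone is enough to feed each person more of what already grips them, which is how the echo chambers in the next point form.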
7/ This leads to the creation of “echo chambers” of individuals with particular beliefs, who then get those beliefs reinforced. In some cases the reinforcement is such that communication with other groups is no longer possible.
The historical mechanisms of living in physical communities, with the physical necessity of social interaction with a diverse set of individuals (as in physical markets), and the real experience of needing that diversity to survive day to day, are no longer present.
Money has isolated us from physical reality, in that we can think only of money, not the ecologies and technologies and networks of people and systems and understandings that make it all work. Many think it is all just money – the real complexity of the real networks and relationships is lost.
The fundamental role of cooperation and trust in making those networks happen is lost. Many people mistakenly believe that it is all money and competition, when the reality is much more closely approximated by it being all about cooperation and trust.
So social networks allow for some groups to become ever more confident that their particular overly simplistic and fatally flawed “Truth” is correct, and everyone else is wrong and bad – leading to ever greater potential for conflict; rather than acknowledging that all understandings are necessarily wrong, and some may be less wrong than others in certain contexts.
Being able to compute any set of vectors with 99.999% accuracy in 20 seconds is of no survival value at all if the bus heading towards you will intersect with your path in 6 seconds. Near enough to be survivable, soon enough to act on, is what counts in evolutionary contexts – at all levels of biology and culture.
The idea of “Truth”, as in one true anything, is now one of the greatest threats to humanity.
We all need more humility than that.
We all need to accept that we are fundamentally fallible, and that other people may have different and workable ways of surviving.
And that acceptance must also extend to the reality that all levels of complexity require boundaries for survival: at every level of biology and culture there will be real limits in any particular context, and going past those limits is not survivable.
So respect for this aspect of what is survivable in any particular real situation must come before respect for diversity – if we are to survive.
And that is always complex and uncertain, particularly when dealing with multiple levels of awareness simultaneously.