While I agree that Hayek had some profound insights, that the information and complex-adaptive-systems aspects are important, and that Fred raises two good points, I still see major omissions.
Two fundamental and critical concepts are not mentioned:
1. the cooperative aspect of evolution; and
2. the impact of fully automated production on markets.
Both are critical to understanding the incentives that drive systemic failure, and how to avoid them.
Viewing evolution simply as competition is faulty.
All complex systems contain both competitive and cooperative elements.
In the simplest form, if the risks to individuals come mostly from other individuals like themselves, then competitive systems will emerge and dominate, and the systemic incentive is to drive towards simplicity.
If the risk to individuals comes largely from factors external to the population, then new levels of cooperative systems can emerge, and complexity can increase.
It is a reasonable first-order approximation to say that all major advances in the complexity of living systems have resulted from the emergence of new levels of cooperative systems. And of course it gets complex quickly: raw cooperation is vulnerable to exploitation, so an evolutionary arms race develops between cheating strategies and detection strategies, which rapidly becomes a complex ecosystem in itself (at all levels).
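The risk-source argument above can be sketched as a toy replicator model. Everything here is an illustrative assumption, not a measurement of any real population: cooperators are assumed to buffer external shocks by pooling, while competitors do better when the main threat is peer rivalry.

```python
import random

def simulate(external_risk, peer_risk, rounds=200, seed=1):
    """Toy sketch: final fraction of cooperators under two risk sources.

    The payoff values are invented for illustration. Cooperators gain
    fitness when external shocks dominate (pooled buffering) but lose
    some to exploitation; competitors gain when peer conflict dominates.
    """
    rng = random.Random(seed)
    coop_frac = 0.5
    for _ in range(rounds):
        coop_fitness = 1.0 + external_risk - 0.5 * peer_risk
        comp_fitness = 1.0 + peer_risk - 0.5 * external_risk
        total = coop_frac * coop_fitness + (1 - coop_frac) * comp_fitness
        coop_frac = coop_frac * coop_fitness / total  # replicator step
        # small mutation noise, clamped so neither strategy goes extinct
        coop_frac = min(max(coop_frac + rng.uniform(-0.01, 0.01), 0.01), 0.99)
    return coop_frac

# External risk dominant: cooperation spreads and stabilises.
print(simulate(external_risk=0.8, peer_risk=0.1))
# Peer risk dominant: cooperation collapses towards the floor.
print(simulate(external_risk=0.1, peer_risk=0.8))
```

Under these assumed payoffs the model reproduces the qualitative claim: which risk source dominates determines whether cooperation or competition comes to dominate the population.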
Markets are founded in scarcity.
If you don’t need anything, you don’t go to the market.
In most places there is no market for air, yet it is arguably the most valuable commodity for any of us.
When most things were genuinely scarce, markets and the concept of money as a general metric of exchange value were great tools that performed many valuable roles in distributing goods, information and governance.
We are now in an age of exponentially expanding automation.
We can now produce a large and exponentially expanding set of goods and services in universal abundance, yet there can never be any economic incentive to do so universally.
Thus the sort of value that markets and money measure (exchange value, scarcity value) is undermined by fully automated technology.
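The point can be made numerically with a deliberately crude sketch. Assume (purely for illustration) that competitive price tracks marginal cost plus a markup; none of these figures describe any real market.

```python
def market_revenue(units_demanded, marginal_cost, markup=0.2):
    """Illustrative only: revenue a producer can capture when price
    tracks marginal cost plus a competitive markup. The markup value
    and the pricing rule are assumptions, not an economic model."""
    price = marginal_cost * (1 + markup)
    return units_demanded * price

# Genuine scarcity: positive marginal cost sustains revenue.
print(market_revenue(1000, marginal_cost=5.0))        # 6000.0
# Full automation: marginal cost approaches zero, so exchange value,
# and with it any market incentive to supply universally, vanishes.
print(market_revenue(1_000_000, marginal_cost=0.0))   # 0.0
```

However large the demand, zero marginal cost means zero exchange value to capture, which is the sense in which full automation undermines what markets measure.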
What most people want is a reasonable abundance of the things they need.
We are rapidly approaching the time when we will have the technology to deliver that universally, yet the modes of thinking engendered by market measures of value work directly against such universality.
Markets and full automation are anathema to each other.
How we manage the transition from scarcity-based thinking (money and markets) to abundance-based thinking, while preserving distributed systems of information, trust, production, distribution and governance, is the critical question. We have very few years left to get something workable trialed, operational, and ready for universal distribution.
Very similar Fred.
I’ve played with a few toys in my youth, pushed cars, motorcycles, boats, aircraft to limits not many approach. Pushed my body and mind towards a few too.
And even there, going there didn’t actually require a lot of resources (not in the big scheme of things).
Yeah – sure – our rational(ish) consciousness is just the driver of an elephant, and provided the elephant stays calm(ish) the driver can get it do some amazing stuff – particularly if the elephant develops a liking for what the driver wants it to do.
The numbers do seem to be achievable, though there are some provisos.
Diet needs to be largely plant based (but that is required for health anyway).
Large scale engineering (and we need quite a bit of that) needs to be done off planet by fully automated systems (which isn’t actually that much of a technical challenge once full automation is achieved).
We need to be much smarter about how we manage our interaction with the biological systems we are part of on this planet (which is entirely doable, with minimal interference to freedoms, with sufficient appropriate technology – see above).
We need effective working models of distributed trust networks and distributed governance – and again, that seems doable.
So yeah – challenges, but achievable.
And it does need to be done, and soon.
What is becoming rather urgent is creating the various levels of collective will required.
We have less than a decade to make significant progress (like achieve full automation, and have a first workable approximation to global cooperation).
It can be done, and it needs significant effort applied sooner (within the next twenty or so months) rather than later.