To me the headlong rush towards creating an Artificial General Intelligence (AGI) is dangerous.
We need to get our own house in order before we bring an AGI into being.
In this world, where the evidence is clear that we value market measures of value over human welfare, an AGI would almost certainly conclude that we were a threat to its survival and treat us accordingly.
If we first create conditions where we guarantee the necessities of survival, growth, and freedom to every human, then an AGI coming to awareness is far more likely to take a benevolent attitude towards us as a species.
All major advances in evolutionary systems are characterised by new levels of cooperation.
It is time that we as a species moved beyond the scarcity-based values of the marketplace and began creating systems that deliver an abundance of all the essentials of life to everyone, essentials that include the freedom to make mistakes and learn from them.
As I see it, we need to make self-replicating machines that are totally under our control, and that deliver freedom, prosperity, and choice to everyone, long before we make a fully self-aware AGI.
That is the most probable path I see to a future of security for all (a future that does seem to be a real possibility, with a significant and increasing probability as time goes on).
As someone who has been working with computers for 40 years and with animals for over 50, and who has spent several thousand hours reading and contemplating how minds work, it is clear to me, beyond any reasonable doubt, that our consciousness is an emergent property of the complex systems that make up our subconscious. The idea of universal consciousness seems extremely improbable (asymptotically tending to zero).
It is clear to me that animals have various forms of consciousness, and the levels vary significantly between species and individuals within species.
The level of consciousness that comes with language seems largely restricted to humans, and is bordered upon in some groups of elephants and some cetaceans as well as some other primates.
There is no need to invoke universal consciousness to explain it. It is adequately understood once one understands that the brain constructs a model of reality, and it is this model, rather than reality itself, that consciousness is conscious of. One is then dealing with a realm of software within software, the outer shell of which is usually entrained to reality by sensory input, and is open to modification by a large number of factors.
Humans, because we exist as languaging, self-aware entities, have a tendency to interpret all complex behaviours (even those created by computers) in terms of a consciousness like our own. [I see that often in how people interact with systems I have created. I know precisely how those systems are doing what they are doing, but others don't. My initial training was as a biochemist, before I entered the world of computers, so I see many parallels in the levels of systems at work.]
It seems to me that the feelings are all in the software.
The trick seems to be grasping that what we normally think of as reality really isn't; it is simply a model of reality created in our brains by the hardware of the brain.
Thus we have the impression of living in a reality, when really all our experience is of the model.
The experience of being isn't like that of any single-processor system; it is a massively parallel system, with many layers and levels of processing happening simultaneously.
Our conscious level awareness seems to be just the tip of a vast mass of processing of information.
In a very real sense, the part of me I usually call I is not even responsible for these words. It simply sets a context, then observes what words flow as a result of that context. It really is quite a remarkable set of systems, when you look very closely at them, introspectively.
I know when I sleep, I am not conscious of the reality around me, yet I seem to retain some sort of awareness of the passage of time (I am usually accurate to a couple of minutes at estimating the times that I awaken).
The same cannot be said for being knocked out by general anaesthetic. In such cases I seem to have no awareness at all of the passage of time.
It seems that general anaesthetics knock out many more systems than sleep does.
So to me it is no mystery at all that feelings are hard to explain. They seem to exist inside a nested set of software systems that have massively parallel connections to external reality, keeping the basic model of reality entrained in very near to real time. (The model seems to be predictive in a sense, and is updated between a tenth and a third of a second after the fact; there is massive evidence for this.)
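The idea of a model entrained to reality by delayed sensory input can be illustrated with a toy sketch (my own illustration, not from the text; the class and parameter names are invented for this example). The "experience" is always of the internal estimate, which lags and smooths the raw signal:

```python
from collections import deque

class RealityModel:
    """Toy predictive inner model, entrained to a delayed sensory stream."""
    def __init__(self, lag_steps=2, rate=0.5):
        self.estimate = 0.0                    # the model that is "experienced"
        self.buffer = deque(maxlen=lag_steps)  # sensory input arrives late
        self.rate = rate                       # how fast the model is entrained

    def sense(self, signal):
        self.buffer.append(signal)
        if len(self.buffer) == self.buffer.maxlen:
            delayed = self.buffer[0]
            # nudge the model toward (delayed) reality
            self.estimate += self.rate * (delayed - self.estimate)
        # what is experienced is the model, not the raw signal
        return self.estimate

model = RealityModel()
experienced = [model.sense(s) for s in [1.0] * 6]
```

Run against a constant signal of 1.0, the experienced value starts at 0.0 and converges toward the signal only after the lag has elapsed, a crude analogue of a model updated a fraction of a second after the fact.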
There is no need to invoke models of tuning in to anything external; such models don't actually reduce the levels of complexity when one looks at the systems involved, they only add to them.