[ 2/May/23 ]
Agree with Lex – beyond any remaining shadow of reasonable doubt.
And sure – rights are an invented notion, and they are invented because they allow diversity to coexist – which is essential for the survival of all, for a very large class of reasons.
And sure, the GPT class of LLM AIs is not a great model of how humans work, but it is a reasonable approximation to part of a very complex picture (one that has fascinated me for 50 years, since starting biochemistry and neurophysiology at university).
And sure – if one is numerate, it is clear that we are complex in ways that are mind-numbingly large in their potential variations, and that we are composed of many layers of very complex biochemical and cultural systems, all of which interact in delivering the things we each bring to our shared reality.
And part of getting a reasonable approximation of just how complex that is lies in beginning to understand that what we each perceive as “reality” is a subconsciously assembled model of whatever “reality” actually is, biased by multiple levels of biochemical and cultural “systems”. The more deeply we understand each of those systems, the greater the probability that we can mitigate the worst of those “biases” and arrive at a reasonably nuanced and useful understanding.

And part of that is appreciating that we have to simplify reality, as will any other entity, because it really is more complex than any entity can deal with in anything remotely approximating real time without making major simplifications. The many dimensions of that take a while to appreciate. The more certain we are that we are right, the less likely it is that we are using a useful approximation to whatever is actually present.
So yes – we need to give AGIs rights if we are to have any significant probability of long-term survival – of that I am confident beyond any remaining shadow of reasonable doubt. And behind that assertion lie many thousands of lines of investigation into survival probabilities across strategy and context “probability spaces”.