Responsibility and Risk – CSER

CSER – Professor Heather Douglas – Responsibility and Inequality in a Risky World

Published on 23 May 2017
We live in a world full of emerging risk. We generate new capacities with the potential to reorder our world and we discover new risks from old practices. What responsibilities come with doing this work? How should we manage the attendant risks?

Enjoyed the lecture, thank you. Some interesting points, and quite a few errors and omissions.

7:21 “The general responsibilities we all share are bounded. We are not responsible for everything that happens.”
Is that true?
Is it not more accurate to say that we all share a little responsibility for everything that happens?
So much happens implicitly.
One doesn’t want to get paranoid about such things, but it is worth considering, from time to time, the amount of unethical behaviour we each implicitly support.

7:24 “But the Boundaries are not simple.”
On that we agree.

8:22 “What are we all morally responsible for? Clearly, we are responsible for that which we intend. … Intention is not necessary for moral responsibility. We are also responsible for that which we should have foreseen, and should have acted upon, but didn’t. It is in this way that we can be negligent, we can fail to pay attention and see a possible problem with a course of action.”
Agreed.
Once we are aware of something, then with awareness comes moral responsibility.
Once we become aware that market measures of value fail in the presence of fully automated systems, actually reducing utility for ordinary people even as a few become hugely wealthy, then we have a responsibility to speak out about the existential risk such systemic failure creates.
Our exponentially expanding ability to fully automate the production and delivery of goods and services poses existential risk in the presence of market-based capitalism, because markets necessarily value anything universally abundant at zero.
The resulting exponential concentration of wealth leads to a breakdown of social justice and the emergence of extreme nihilism.
Morality is an essential boundary condition for the survival of a complex social organism like ourselves.
Market values and human values are directly opposed in this context.

Market capitalism is not on your list of existential risks.
It needs to be.
Something like a Universal Basic Income needs to be truly universal – to everyone, everywhere on the planet, and soon. Great to see Ray Kurzweil come out in support of the idea today.
If we are to allow the freedom for some to have great wealth, that demands that we instigate high basic levels of material wealth and psycho-social health for everyone. And not just material wealth, but socially useful activities for everyone who needs them (very few people can yet be completely self-generative in terms of meaningful purpose, as per the suite of issues Jordan Peterson of the University of Toronto raises).

9:59 “At the cutting edge of science and technology this is doubly difficult, as we are not just dealing with counterfactuals but we are using counterfactuals in a novel terrain.”

When one enters this profoundly uncertain territory, one must do so in probabilistic fashion, in the full knowledge that some classes of system are not predictable for any of a host of reasons, from maximal computational complexity, through varying classes of uncertainty to full chaos. Reliably identifying the classes of complexity one is dealing with is not a trivial problem.
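To make that concrete, here is a toy example of the “full chaos” end of that spectrum (a sketch of my own, not anything from the lecture): the logistic map is completely deterministic, yet a measurement error of one part in a billion grows until the forecast is worthless, so long-range prediction fails in principle, not just in practice.

```python
# A minimal illustration of why some deterministic systems resist prediction:
# two starting points that differ by one part in a billion diverge completely
# within a few dozen steps, so any measurement error swamps the forecast.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x_a, x_b = 0.400000000, 0.400000001   # initial states differing by 1e-9
for step in range(60):
    x_a, x_b = logistic(x_a), logistic(x_b)
    if step % 10 == 9:
        print(f"step {step + 1:2d}: {x_a:.6f} vs {x_b:.6f}  (gap {abs(x_a - x_b):.2e})")
```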

26:08 “A scientist who thinks it is always someone else’s job to do this reflection will fail to see the possible issues until we are too far down the road to make the crucial choices.”

This is true in the context of the market threat above.
Once anyone has seen the fundamental role of cooperation in the emergence of new orders of complexity, and the necessary role of ethical boundaries in protecting such cooperation from invasion by cheating (cancerous) strategies, then we all have a responsibility to our social cooperative to play an active role in identifying and removing cheating strategies, at all levels.
It cannot be enough to say – too hard, not my problem.
That is not an ethical response (nor a viable one in terms of long term probabilities of survival).
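To make the invasion-by-cheats point concrete, here is a minimal toy model (my own sketch with assumed prisoner’s-dilemma payoffs, not anything from the lecture): a single defector outscores a population of unconditional cooperators, but once cheats are detected and simply excluded from cooperation, cooperating wins again.

```python
# Payoffs follow the usual prisoner's-dilemma ordering: T > R > P > S.
T, R, P, S = 5, 3, 1, 0   # temptation, reward, punishment, sucker

def payoff(me, other):
    if me == "C" and other == "C":
        return R
    if me == "C" and other == "D":
        return S
    if me == "D" and other == "C":
        return T
    return P

def average_score(population, exclude_cheats=False):
    """Mean per-interaction payoff for each strategy; everyone meets everyone."""
    scores = {}
    for i, a in enumerate(population):
        total, games = 0, 0
        for j, b in enumerate(population):
            if i == j:
                continue
            if exclude_cheats and "D" in (a, b):
                continue  # detected cheats are simply not cooperated with
            total += payoff(a, b)
            games += 1
        scores.setdefault(a, []).append(total / max(games, 1))
    return {k: sum(v) / len(v) for k, v in scores.items()}

population = ["C"] * 99 + ["D"]          # one defector among cooperators
print("no detection:  ", average_score(population))
print("with detection:", average_score(population, exclude_cheats=True))
```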

26:43 “I suspect that many scientists do this regularly. Most of the time we never hear about it.”
Not just scientists. Today it is most commonly computer programmers.
None of the really good programmers I know are working in the area of full automation of existing jobs – for ethical reasons.
We could make a lot of money doing so.
All have chosen lifestyles with much lower incomes, and some, like myself, are actively working to deliver a social framework within which we can use our skills to their full extent and have ethical outcomes.

29:01 The moral terrain of science. I like the work you have done there, and it seems to me that it is appropriate to incorporate explicit mention of the deep ways in which ethics are embedded and embodied in our biology and our cultures. And that is a very deep field.

33:01 – Certainly “perceived injustice” is the major risk to society. Our systems for detecting injustice are deeply embodied in who we are. We may have no conscious awareness of their presence or operation, other than the phenotypic drives to rage and revenge against those we subconsciously identify as the perpetrators (who may or may not actually be at cause in the matter).

35:03 – Error: “The tool we have to reduce this [perceived injustice] is to reduce inequality.”
It is not equality that people want; it is having enough, and not suffering inequity, which is not the same thing as equality.
I don’t necessarily want what someone else has, I want what I reasonably need to do what I responsibly choose.
I’m happy to put in effort to get what I want. I get unhappy if I see someone else getting heaps without putting in effort, but simply by using a cheating strategy to exploit the system.
It is meeting those tests of reasonable needs, and reasonable reward for effort, not equality, that is the major issue for most people.
Very few people actually want to be exactly like anyone else.
Most of us are very happy to be different to some degree or other, cherish it actually.
The freedom for reasonable self-expression is a fundamental part of being human, and that demands inequality.

36:10 – “It is in this way that serving the wealthy exacerbates existential risk.”
Yet that is exactly what market capitalism demands we do.

37:15 “I think pursuing immortality is something that increases existential risk.”
Couldn’t disagree more strongly.
To my understanding, indefinite life extension, made universally available, is essential if we are to survive as a species.
Without an extended horizon of personal self-interest, and sufficient time and freedom to build a reasonable degree of wisdom, ignorance and short-term self-interest will lead us into destructive strategies.
Indefinite life extension (living until you die of some accident, or by personal choice) is essential to combat the very real existential threats from short-term expediency.
The idea that life extension will only be for the very few is of exactly the same type as the early reaction to cell phones: “yuppie phones for the very few” was what most experts thought, and few saw today’s reality coming. Now they are almost universally available. That is the product of exponential technology. From first appearance to universal availability, life extension is likely to take less than two decades.

41:28 – Talking of uploads, saying we only have enough storage for 1% of 1% of the population. That neglects completely the long-established trend of data storage doubling roughly every 10 months. To get from 1% of 1% to 100% is a factor of 10,000, or about 14 doublings, which is 140 months, roughly 12 years away. Not that I am proposing uploads; I think we will find that there is much more than just our synapses involved in making us what we are. I suspect that uploading is unlikely to approximate human experience closely any time soon, which puts the data needs for uploads a couple more decades further out, and I suspect that Heisenberg uncertainty will always be a problem for such things. Cells are very complex computational entities, and I suspect that atomic- and cellular-level computational abilities are important to the experience of being an embodied human being.
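The doubling arithmetic, as a quick sketch (the 10-month doubling time is the assumption; the rest follows from it):

```python
# Back-of-envelope check of the storage claim above: how long until capacity
# grows by the factor needed to go from 1% of 1% of the population to 100%?
import math

fraction_now = 0.01 * 0.01              # 1% of 1% of the population
growth_needed = 1.0 / fraction_now      # a factor of 10,000
doublings = math.log2(growth_needed)    # ~13.3, call it 14 doublings
months = doublings * 10                 # one doubling every 10 months
print(f"{doublings:.1f} doublings ≈ {months:.0f} months ≈ {months / 12:.1f} years")
# prints: 13.3 doublings ≈ 133 months ≈ 11.1 years, i.e. on the order of 12 years
```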
So sign me up for indefinite life extension of the embodied form.
And if accident rates remain unchanged, average life expectancy would only go out to around 5,000 years, though I suspect we would quite rapidly reduce accident rates to the point that average life expectancy would be around 50,000 years (with some lucky few vastly exceeding that). And if more people get into adrenaline sports, then it could easily go much lower. All my adrenaline junky friends are dead already, and I’ve come too close too often.
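For what it is worth, here is the rough model behind those lifespan figures, a minimal sketch assuming that with ageing removed the only remaining cause of death is a roughly constant annual accident risk (the specific risk numbers below are my own assumed ballparks):

```python
# If the annual risk of dying by accident is constant, expected lifespan is
# simply about 1 / (annual accident risk).

def expected_lifespan(annual_accident_risk):
    return 1.0 / annual_accident_risk

# ~0.02%/year (an assumed ballpark for accidental deaths today) gives the
# ~5,000-year estimate; cutting that risk tenfold gives the ~50,000-year one.
for risk in (2e-4, 2e-5):
    print(f"annual accident risk {risk:.4%} -> ~{expected_lifespan(risk):,.0f} years")
```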
So in a small town of 4,000 people that would be two births per year to achieve replacement (probably by lottery for those interested). Alternatively people could go off planet into orbiting habitats to have children. That could sustain existing birth rates for about another 1,000 years, before running into solar energy constraints in earth orbit. Building orbiting habitats is a relatively trivial exercise once we have fully automated mining and manufacturing (though not without interesting engineering challenges).

42:10 – “Along either path – who gets to defeat death? The wealthy privileged few.”
They start. The rest of us follow, very quickly. Just like with cell phones.

44:24 “Escape to another world” scenario. Not currently possible, but in 30 years, based on exponential trends that have been stable for over 100 years, we will have the technical ability to get every person off the planet in 2 weeks, if it were actually required for some reason I cannot quite imagine right now (10,000 big equatorial linear motors launching 10-seat shuttles once every 10 seconds to escape velocity – big tech, and quite doable with fully automated manufacturing in space).
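The numbers in that parenthesis do add up; a quick check using exactly the figures assumed above:

```python
# Throughput check on the launch scenario: launcher count, shuttle size and
# cadence are the stated assumptions; the arithmetic just follows from them.
launchers = 10_000
seats_per_shuttle = 10
seconds_between_launches = 10
seconds_in_two_weeks = 14 * 24 * 3600            # 1,209,600 s

launches_per_launcher = seconds_in_two_weeks // seconds_between_launches
people_moved = launchers * launches_per_launcher * seats_per_shuttle
print(f"{people_moved:,} people in two weeks")   # ~12.1 billion, more than enough
```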
And I am very much for developing ecological awareness and responsibility that sees this planet inhabited for the next 4-5 billion years.
And some of us will go elsewhere, exploring the wider universe, this galaxy and others. Under sustained acceleration, time dilation does the work: at around one g it only takes a few decades of subjective time to go anywhere in the galaxy (even at one-tenth g, 30 years of subjective time covers a few dozen light years), however long that may seem to someone else living on a planet.
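The relativity behind that is the standard constant-proper-acceleration (“relativistic rocket”) result; a minimal sketch (the destinations and exact accelerations are my own illustrative choices):

```python
# Accelerate to the midpoint, flip, and decelerate. Proper (shipboard) time is
#   tau = 2 * (c/a) * acosh(1 + a*d / (2*c**2)).
import math

C = 299_792_458.0           # speed of light, m/s
G = 9.81                    # one g, m/s^2
LIGHT_YEAR = 9.4607e15      # m
YEAR = 3.156e7              # s

def subjective_years(distance_ly, accel):
    d = distance_ly * LIGHT_YEAR
    return 2 * (C / accel) * math.acosh(1 + accel * d / (2 * C**2)) / YEAR

for name, d in [("Alpha Centauri", 4.37), ("galactic centre", 27_000),
                ("across the galaxy", 100_000)]:
    print(f"{name:18s}: {subjective_years(d, G):6.1f} yr at 1 g, "
          f"{subjective_years(d, 0.1 * G):6.1f} yr at 0.1 g")
```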

44:50 “Social justice concerns should be at the forefront and center of our discussion of existential risk.”
Agree completely with that claim, but not with any of the derivations taken from it.

So in broad terms – agree – we all need to think in the biggest and widest terms we can about what we are doing, and what the social and ecological consequences are of those choices.

In many respects, our entire system has become a haven for cheating strategies.
That needs to change.
We need integrity, and open information, at levels never before achieved. And with modern technology that is a relatively trivial issue.

About Ted Howard NZ

Seems like I might be a cancer survivor. Thinking about the systemic incentives within the world we find ourselves in, and how we might adjust them to provide an environment that supports everyone (no exceptions) - see www.tedhowardnz.com/money

One Response to Responsibility and Risk – CSER

  1. debyemm says:

    I have shared an excerpt and link on my FB page. Thanks for your always thoughtful perspectives.


Comment and critique welcome
