Comment on Will’s facebook page – http://www.theguardian.com/science/brain-flapping/2014/sep/12/comment-sections-toxic-moderation
Any tool is morally neutral – the more powerful the tool, the more damage it can do. This is true in any and all dimensions.
The issues centre on morality and awareness.
How conscious are individuals of their relationships with each other, and with the environment they inhabit?
How much do we value the life and liberty of ourselves and others?
How conscious are we of the consequences of our actions on others?
How much time and energy do we as individuals put into estimating those consequences, and mitigating any negative effects?
I strongly oppose moves to limit our capabilities to the lowest common denominator. We need tools and awareness.
I agree that we need to put much more effort into considering the consequences of our choices on others. Until we as a society choose to put the value of individual life and individual liberty above any value derived from any market (aka money and capital), we are all at risk. So long as we as a society are prepared to put monetary values on human life, we are all at risk.
So I agree we need to foster awareness at all levels, to respect the freedoms of others, and to consider the impact of our choices.
And I like the idea of distributed trust networks, empowered by technology – something like an extension of your tag cloud system. It would need a mix of Bayesian inference and experience-based heuristics, all probability based.
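To illustrate the Bayesian, probability-based flavour of such a trust network, here is a minimal sketch of one common approach (my own illustration, not anything described in the original comment): each participant's trustworthiness is modelled as a Beta distribution and updated by Bayesian inference from observed good and bad interactions.

```python
# Hypothetical sketch: Beta-Bernoulli trust scoring.
# Trust in a user is a Beta(alpha, beta) distribution; each good
# interaction increments alpha, each bad interaction increments beta.

def update_trust(alpha, beta, good_interactions, bad_interactions):
    """Bayesian update of a Beta(alpha, beta) trust prior with new evidence."""
    return alpha + good_interactions, beta + bad_interactions

def trust_estimate(alpha, beta):
    """Posterior mean: probability that the next interaction is good."""
    return alpha / (alpha + beta)

# Start from an uninformative prior Beta(1, 1),
# then observe 8 good and 2 bad interactions.
a, b = update_trust(1, 1, 8, 2)
print(round(trust_estimate(a, b), 2))  # 9 / 12 = 0.75
```

Experience-based heuristics could then sit on top, e.g. weighting recent interactions more heavily or propagating trust through the network of people you already trust.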