Verus and Timon argue about the new book Dataclysm, in which the author highlights the gap between people's stated intentions and preferences and their actual behavior, showing what an abyss exists between the two.
Verus: The book really is an interesting corrective to any future politically correct nonsense. This is who we are, as the author says, "when we think no one is looking". This is who we _really_ are.
Timon: Well, that seems both a bit rich and not quite true. The book finds differences between what people say and what they do; what is new in that? Sure, men's preference for younger women, our hidden racism and other glaring inconsistencies in our thinking seem very well documented by the data, but that does not really prove anything, does it? What we see in all of these experiments is actually not what people do, but the rich and disturbing complexity of our emotional lives. If you look at life and society in general, the evidence is rather that we are able to overcome these swirling irrationalities and act in an ethical way…
Verus: Really? I think you are far too generous to mankind. I think that while our prejudices and our almost tribal instincts have become less visible, they still determine our actions, our social structures and the lives of millions. Just think of the example of job applications with names that sounded African-American versus those that sounded Caucasian. Everywhere in our society, our prejudice structurally disadvantages whole groups on a routine basis.
Timon: So our shadows of data reveal our hypocrisy, you mean? I just don't buy that. I think the patterns in the data are far from borne out in the real world. I do think that we are all complex, cruel and selfish, with an infinite capacity for evil, but with the same infinite capacity for good.
Verus: That is sweet, but what data sets do you have that speak for you?
Timon: Is that where this will end? That the production of a data set now will lead to the conclusion that we need to redesign society and engineer solutions that save us from our darker angels?
Verus: Why not? Hypocrisy exposed is hypocrisy you can fight. If we use data to reveal hidden biases, we will be able to articulate defenses against them, rather than deny them reflexively as you seem to be doing.
Timon: I don’t disagree with that, actually. I think revealing hidden biases is not a bad thing, but all of this data misses something, and what it misses is this: society has progressively become more equal and more just over the last 500 years. And that happened without any articulation through data; it has been driven by something else. It has been driven, dare I say it, by the human capacity for good deeds, for acting in a way that is just and right, even without data to support it. In fact, I think the data, if anything, risks weakening that innate human capacity for acting justly…
Verus: Come on, how can revealing bias actually make people act less responsibly or justly?
Timon: Because just actions flow not from an understanding of data, but from a constant examination of your own soul and your own decisions, a constant battle with yourself and your baser instincts. Data makes justice a collective structural artifact rather than an individual responsibility. I believe that the way to a more just society, even in the future, is through the constant cultivation of Socratic dialogue and incessant questioning of what justice really is, not through data analytics. With data we risk externalizing individual ethical responsibility, and we actually give people an excuse. They can now say: well, it is not my problem, it is a social problem; we need laws, not individual self-examination.
Verus: I disagree, violently, with that. Big data will reveal us as we are, and our shadows of data will give us the menu for a political reform agenda that will allow us to really end structural racism, chauvinism and other diseases of the human mind once and for all. And through nudges and social engineering we will be able to build a social machine that can help us do so – individual responsibility just won’t do. Everyone is responsible today, and look at what has happened!
Timon: Well, look at the social progress we have made without data over the last 500 years. Where did that come from, then?
Verus: But it has flatlined! With data we can make the same kind of progress that we made in science once we were able to quantify the objects of our study. We will now be able to do the same with mankind: social physics, as MIT professor Alex Pentland calls it.
Timon: Socrates turned away from natural science, because he believed that the examination of our own souls is necessary for any political animal. And that cannot be quantified. I am not a data point, and you are not a pattern.
Verus: But we are. And therein lies our salvation: we can quantify the parameters of our own moral shortcomings and correct them. We can close the gap between what we say and what we do. This spells the beginning of the end of hypocrisy.
Timon: I agree we can, but never by measuring what we do and using that to change what we say. We need to start by examining what we say in the light of eternal ideals of justice and through ongoing human dialogue.
The night fell silently as their discussion continued, but we leave them there. Will we meet the future in human dialogue, or as data points in a pattern made available through data sets? We do not really know.