You’re wrong! Do you know why?
I am the kind of person who tends to think I’m right about everything. I have great confidence in my own correctness, remarkably unfettered by past experience. I may have been wrong before, but NOW at last I’ve figured it all out!
Of course, on some level we all tend to think our opinions are the correct ones, otherwise we wouldn’t hold them. But – as Kathryn Schulz notes in her insightful TED talk, ‘On Being Wrong’ – this belief that we’re right leaves us with an intellectual problem: how do we explain all of the people who disagree with us?
Schulz goes on to describe three assumptions we usually make when someone disagrees with our beliefs.
1. They’re ignorant: “They don’t have access to the same information we do,” Schulz says, “and when we generously share that information with them, they’ll see the light and join our team.”
However, when that doesn’t work, says Schulz, “when it turns out they have all the same facts we do and they still disagree with us, we move on to a second assumption.”
2. They’re stupid: “They have all the right pieces of the puzzle, and they’re too moronic to put them together correctly.”
Finally, when that doesn’t work – “when it turns out that people who disagree with us have all the same facts we do, and are actually pretty smart,” we move to a third assumption:
3. They’re evil: “They’re deliberately distorting the truth for their own malevolent purposes.”
If I think back over all the arguments I’ve had – and, more broadly, most of my efforts to persuade others – they’re usually based on one of these three assumptions. I’m either providing information I think others are missing, or I’m trying to put the pieces of the puzzle together for them, or I’m (futilely) objecting that they’re stubbornly avoiding the obvious truth. In what follows I want to talk about the third option, and how it might apply, in some measure, to all of us.
Let me make two suggestions. First, if someone is distorting the truth, that doesn’t mean they’re necessarily aware of it. That’s what is meant by self-deception. Second, we all live in varying degrees of this kind of self-deception. When we move beyond the simple black-and-white worldview in which some people are ‘good’ and others ‘bad’ – when we realise how much of a mixture of good and bad there is in everyone – then we can become open to the idea that we all suffer from a subconscious tendency to avoid arguments that undermine our current beliefs. In the humanities this is called the “sociology of knowledge” – our unconscious attraction toward opinions that make us feel comfortable or support our present lifestyle.
Of course we’re all aware of this at some level. That’s why we say things like, “he only believes in capitalism because he’s wealthy,” or “she’s only a feminist because she’s a woman,” or “it’s easy for him, as a heterosexual, to be against gay marriage.” It’s why we’re more likely to vote for a political party that would benefit our own social class. What’s more, it’s why we’re likely to sincerely believe in that political party and be convinced by its arguments. In theology this is called the “noetic effect of sin” (‘noetic’ means ‘pertaining to the mind’). Just as a river carves a course for itself which the water will afterwards naturally follow, our minds are attracted to facts and explanations that comfort us by supporting familiar beliefs. We are more likely to remember them. We are more likely to build our understanding of reality from them. But facts and arguments that go against our beliefs are harder to hold in our minds, and more likely to slip from memory.
Am I just relativising our ability to know truth? Does this mean we can never be certain of anything? How will we ever know we’re not deceiving ourselves?
Becoming aware of our ‘sociologies of knowledge’ is the first step to combating them. That’s why the ancient Greeks had the proverb ‘know thyself’ – a recognition that self-knowledge is a precursor to any other knowledge. While the first two ‘wrongness’ pitfalls (ignorance and stupidity) only require perseverance and diligent thinking to overcome, this third one requires the virtues of courage and humility as well. We have to get used to considering the possibility that we’re wrong. We have to have the strength of character to face the consequences and implications of being wrong – even about things that are close to our heart. And we have to get used to admitting our mistakes. Only through courage and humility can we grow in understanding beyond our comfort zones.
As Christians, we also believe that the Holy Spirit empowers us to overcome our fallen nature, and guides us into all truth – although it may not be easy or painless. We have a special hope that the true light will shine into darkness, as God reveals both Godself and ourselves to us in ever increasing depth of insight.
Actually, this schema is overly simplified. As Rachel’s post clearly demonstrates, sometimes disagreement is just different viewpoints on the same truth. But in this post I’m talking about actually being wrong about something, not just becoming aware of alternative perspectives (in Rachel’s example this would be like saying that Minneapolis is mountainous).