My husband is a criminal prosecutor, and he sometimes talks about the problem of honest but mistaken witnesses: witnesses who sincerely believe the testimony they’re giving is accurate, even though it directly conflicts with the empirical evidence. People who become convinced that they saw something they could not have seen, and who become so invested in that incorrect recollection that they will make increasingly far-fetched claims to support it. For example: a guy who claims to have witnessed a fight between the victim and the accused, who has absolutely no personal stake in the outcome of the case, yet refuses to identify himself on video that proves he was not in the area at the time of the fight, and even denies having ever driven the make of car he’s on camera driving… until Google Street View shows that same car parked in front of his house.
They’re not malicious or stupid or crazy; they’re just invested in the accuracy of information that’s not quite correct – convincingly similar to the truth, but just not quite true. Like the time my parents and I had a blow-out argument about whether the steak we had in Paris was served with a pesto sauce: we were all completely convinced of our own memories, even though they were mutually exclusive, and we were all increasingly annoyed that our recollections were being doubted, even though the issue was of absolutely no consequence.
I think this is a good metaphor for how to approach political disagreements most of the time: the person who disagrees with you is likely not malicious or crazy or stupid; they might just be mistaken. It’s also worth considering that sometimes you might both be somewhat wrong, and it’s conceivable that sometimes you might be the mistaken party.
In my final year of undergrad, Professor Chris Hedges insisted that we engage with the people we disagreed with and the people we were studying, so that we could meet them on their own terms. I found it to be a really useful exercise, so after graduating I started watching Fox & Friends every morning as I was getting ready for work. It became obvious very quickly how easy it would be to get sucked into the propaganda by just passively watching that show every morning before your first cup of coffee. I came to think of it as my morning critical thinking workout: having to actively identify the fallacies in their arguments while also trying to do things like pack lunch and find matching socks.
Ever since then, I’ve felt a real sympathy for the people who buy into that worldview – not in a condescending way, but because I came to understand how easy it would be to believe information that so often sounded convincingly like the truth. However cynical the motives of the people producing this propaganda, it wasn’t poorly produced, and it would be relatively easy to be lulled into believing it. And, as I explained above, once we become invested in the truth of something, we’re super resistant to someone telling us that we’re mistaken.
The thing is: it’s already hard enough to have conversations with “honest but mistaken” people without calling them idiots or crazy or evil. And even though trying to adjust a person’s entire worldview is almost never the best use of your energy, it’s nevertheless worthwhile to remember that people can be wrong without being dumb, demented, or bad – if only because it makes it a lot easier to at least preserve one’s faith in humanity, and at best to preserve friendships with people with whom we disagree.
Posted in: Progress