Written by David Swanson
Don’t people who are wrong annoy you? I just read a very interesting book called “Being Wrong: Adventures in the Margin of Error,” by Kathryn Schulz. Of course I read it with an eye toward figuring out how better to correct those other people who are so dangerously and aggravatingly wrong. And of course the book ended up telling me that I myself am essentially a creature of wrongness.
But if we’re all wrong, I can live with that. It’s being more wrong than other people that’s intolerable. However, surveys consistently find that most of us believe we’re more right than average, suggesting a significant if not downright dominant wrongness in our very idea of wrongness.
Even worse, we’re clearly not wrong by accident or despite the best of intentions. We go wrong for the most embarrassing of reasons — albeit reasons that might serve unrelated purposes, or which perhaps did so for distant ancestors of ours. For example, when asked to solve simple and obvious problems that a control group of similar people has no trouble solving, a disturbing number of humans will give the wrong answer if stooges planted in the room confidently give that wrong answer first.
Even more disturbingly, measurements of brain activity during this process suggest that those giving such wrong answers actually perceive them as correct, as the product of careful consideration of the question, with no particular energy expended on peer relationships. In other words, people believe their own obvious B.S., even though it’s been blatantly placed in their minds by a bunch of fraudsters. (I am aware of the redundancy in making this observation during what has been an election year in the United States.)
A lone dissenter in the room can change the dynamic (which perhaps explains why Fox News quickly cuts off the microphone of any guest straying from the script, why a sports announcer who denounces our gun culture must be punished, why a commentator who questions Israel’s crimes must be silenced, etc.), but why should we need someone else to dissent before we can?
Well, we don’t, not all of us and not always. But a disturbing amount of the time, a lot of us do.
More disturbing still, few of us are inclined to say we are undecided between possibilities. We are inclined toward certainty, even if we have just switched from being certain of an opposing proposition. When we are confronted with reasons to doubt, it is not uncommon for our certainty to grow more adamant. And we are inclined to greater certainty if others share it. Many of us admire, and all too often obey, those who are certain — even about things they could not possibly be certain about, even about things there is no great value in being certain about, and even about things these “leaders” have been wrong about before.
Now, I think Schulz is wrong in her book on wrongness not to place greater emphasis on the question of why politicians change their positions. If they do so for corrupt reasons, to please their funders, we have corruption as well as indecisiveness to dislike. But if they do so in response to public pressure and we still condemn them for indecisiveness, we are condemning representative government itself. There is no doubt, though, that many people — sometimes disastrously — are inclined to prefer the certain and wrong to the hesitant and ultimately right. A baseball umpire who’s wrong but adamant is the norm, because one who corrects himself is soon out of a job.
We begin our careers of wrongness early. If you show a toddler a candy box and ask what’s in it, they’ll say candy, completely free of doubt. If you then show them that it’s actually full of pencils, and ask them what they had thought — five seconds earlier — would be in the box, they will tell you they thought it was full of pencils. They will tell you that they said it was full of pencils. Schulz says this is because young children believe that all beliefs are true. It could also be a result of the same desire to be right and not wrong that we find prevalent in adults, minus adults’ ability to recognize when the evidence of their wrongness is overwhelming. In 1973 a psychologist asked 3,000 people to rate their positions on a range of social issues (affirmative action, marijuana legalization, and so on) on a scale from “strongly agree” to “strongly disagree.” Ten years later he asked them to do so again, and also to recall how they thought they had answered a decade earlier. The what-I-used-to-think answers were far closer to the people’s current positions than to their actual positions of a decade back.
A decade back I would have told you that it might be valuable to work for progressive change within the Democratic Party. Now I’d tell you that’s counterproductive. Never mind whether I was wrong then or am wrong now; such brief statements may not even contain enough information to tell, and I may well be wrong in both positions. The point is that I only know how misguided I used to be because my blog doesn’t edit itself, and I go back and read it. Not so with my brain. It edits itself quite efficiently. We have no idea how wrong we are, and much less idea how wrong we used to be. And we absolutely do not want to know.
“It isn’t that we care so fiercely about the substance of our claims,” writes Schulz. “It is that we care about feeling affirmed, respected, and loved.” This helps explain why a common response to being wrong is to make the situation significantly worse and facilitate new cases of being wrong in the future. Medical mistakes in our hospitals kill a great many more Americans than any of the commonly thought of but statistically trivial causes of death (like terrorism) or even the truly major causes of death (like automobiles). And hospitals typically respond with evasion, defensiveness, and denial.
We see this across the field of public policy. Alan Greenspan may admit the error of his ways on the way out the door. So may President Eisenhower, albeit without calling it a confession. Even Secretary McNamara may recant his love for warfare before he dies. But those vigorously pursuing careers usually avoid admitting wrongness. And those proven wrong are typically replaced with new people willing to push the identical mistaken policies.
Members of the public who support wrongheaded policies (the markets will take care of themselves; weapons spending makes us safer; global warming doesn’t hurt; the wealth will trickle down; etc.) often manage to go on supporting those policies even after they have been glaringly debunked in particular instances or recanted by particular officials. This is what I hoped to get some insight on in reading this book (as in reading a lot of books), and I don’t think I failed. (I wouldn’t, would I?)
Believers in Iraqi WMDs, when confronted with the facts, have in many cases nonsensically doubled down on their beliefs or, at the very least, continued to imagine the best intentions on the part of those who pushed the propaganda. Of course, a proper understanding of wrongness must lead us to accept the possibility that many who appear to be lying actually believe what they say. And the well-documented dishonesty, intentional fraud, and pressure on others to lie in the case of the Iraq War marketing campaign doesn’t change the fact that many who helped spread the lies believed them to one degree or another.
Dropping the WMD belief would mean accepting that respected leaders were either mistaken or lying. It would also mean admitting that hostile opponents in a very public and long-lasting debate were right. Hence the tenaciousness of those still believing that Saddam Hussein hid his massive stockpiles in a magical land somewhere.
A few lessons can be gathered, I think. One is that when we’re speaking with those who disagree, we should not refer to magical lands as I’ve just done, not mock, not gloat, not set up a hostile competition over who was right and who was wrong. Recounting previous instances of war supporters being wrong, to illustrate the universality of the phenomenon, could help or backfire depending on how it’s done. Ultimately it must be done if the same mistakes are not to be repeated forever. It’s certainly appropriate to demand that television networks stop limiting their rosters of experts to those who have always been wrong before. In the end, there must be accountability for the leaders of wrongness (regardless of the degree of honesty or good intention involved). But there are those who will simply believe that Spain blew up the Maine, even if they have never heard of that incident before in their lives, if you — their opponent — bring it up, even if you intend it as a comforting example of how others have screwed up too.
Clearly, focusing on the numerous times the other person has been wrong is unlikely to help, but conveying the fact that we have been wrong too might. People should feel that they can remain or become secure, safe, respected, and loved while dropping a misguided belief, and without substituting a new zealotry in favor of another belief (even ours!) — that they can become more cautious, more willing to remain in doubt, and more willing to continue that way in the face of the certainty of others. Ideally, people should be urged toward better beliefs by a friendly, welcoming, and large group of others. There’s no reason peer pressure can’t be put to good use, even while we seek to reduce its power.
More importantly perhaps, an ounce of prevention is worth a pound of cure. If we can prevent people developing attachments to lies about Syria or Iran, we will save ourselves endless headaches trying to rid them of those lies later. If we can establish not just that Iraq was unarmed but also that Iraq’s being armed would have been no justification for bombing its people, we will shift the conversation onto favorable ground. If Syria killing Syrians with the wrong kind of weapons is understood not to justify the United States killing more Syrians with the right kind of weapons, we won’t have to engage in a fast-break competition to determine and then prove whether Syria is using weapons that the United States claims it is using.
The preceding paragraph is the theme of a book I wrote called “War Is A Lie,” which I intended as a kind of war preparedness: preparation to resist the common types of lies told about wars. In that book, I did not follow all of the advice above. People have in fact complained to me (a small minority of readers, I should say) that the book is at times sarcastic or mocking or contemptuous. In my defense, I see a value in entertaining as well as educating those already in large agreement, and in reaching, in as powerful a manner as possible, those without ossified views on the subject. But then again, there is always and forever the possibility that I’m horrendously wrong.
_______
To receive updates from After Downing Street register at http://afterdowningstreet.org/user/register
To subscribe to other lists go to http://davidswanson.org/node/921
About the Author: David Swanson is the author of “War Is A Lie” and “Daybreak: Undoing the Imperial Presidency and Forming a More Perfect Union.” He blogs at http://davidswanson.org and http://warisacrime.org.