Monday, November 29, 2010

Mistakes Were Made (but not by me)

I wish I had read Mistakes Were Made (but not by me), by Carol Tavris and Elliot Aronson, before I began my ministry. Unfortunately it wasn’t written yet. It was written in 2007, and I am a little disappointed that I didn’t discover it until the end of 2010. I stumbled on it the way I stumble on many books. Our Directing Pastor here at University, Rev. Charles Anderson, has a much healthier relationship with books than I do. He reads them, and when he is done, he gives a whole lot of them away. This one was sitting on the table of books outside his office that are up for grabs. He had mentioned reading the book in a conversation we had some time back, but it was really the subtitle that hooked me: “Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts.”

It seemed there was a book here that attempted to take on the questions that plague me: “Why do people spend so much energy believing things even when the evidence against them is overwhelming?” or “Why do people continue to believe that something they are doing or have done is right even when all signs point to the fact that it is indeed not?” or, more simply, “Why do people get so entrenched that they just can’t change their minds or admit that they are wrong?” The authors answer quite simply, and then quite deeply: “cognitive dissonance” and our need to reduce it. “Cognitive dissonance is a state of tension that occurs whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent…” (p. 13) This dissonance makes us very uncomfortable, and our brains do everything they can to reduce the tension, and that is where we run into problems. It seems at some point we commit to a decision, and the further we go down the road, the harder and harder we work to justify it. And our brains help us. Once we have arrived at a conclusion, our minds tend to filter out contradictory information (that would cause dissonance) and amplify information that justifies our conclusion.

I am so tempted to give some examples of well-known situations of cognitive dissonance in popular culture, but if I did, I would get angry emails. But we all know someone who holds an opinion that we know to be just completely absurd. And we all know someone for whom that opinion is so firm that even trying to share conflicting data is an exercise in futility. The more dissonant the information, the harder they will work to disprove it. So how do people who are otherwise rational get to places of holding irrational views? One step at a time. The authors do a wonderful job of explaining how we take small steps with small incremental justifications until we are so far away from the rational that we just can’t get back there.

I don’t have the time or the expertise to give this book a full treatment, but it is worth reading for chapter 3 alone. Chapter 3 is about our memories, how unreliable they actually are, and how they feed our need to reduce dissonance by recalling most vividly the things that fit into the frameworks we create. This might be the most useful part of the book for life in the church. Memory, both individual and corporate, is one of the powerful forces at work in the life (or death) of a church. A better understanding of how we process the past might go a long way in our effort to move toward the future.

If you work in the church or just love the church, this might very well be worth your time. You will likely see not only the behaviors of others in a clearer light, but your own transgressions as well.
