Wednesday, May 4, 2011

Tavris and Aronson’s Mistakes Were Made (But Not by Me): Reading Notes

A book arguing for the power of the concept of cognitive dissonance to explain “why we justify foolish beliefs, bad decisions, and hurtful acts” lacks one thing: a defensible explanation of what cognitive dissonance is.


The following is not a review but merely a comment on one particular point in the book Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts by Carol Tavris and Elliot Aronson,1 namely its failure to explain the concept and the associated theory that are the central theme of its argument. I ought perhaps to mention at this point, since you might think otherwise upon reading what follows, that I found the book immensely instructive and disturbing in a potentially very salutary way. Its strength lies in its description and analysis of the various ways in which our need to feel justified in what we think, say, and do drives us to think, say, and do wrong and harmful things. Its weakness lies in its failure to explain the rubric under which it does this work of description and analysis, the concept of cognitive dissonance.

* * *

Before I read this book, I was acquainted with the term “cognitive dissonance” but had only a rather vague notion of what it means. Having read the book, I have a better idea of what it means, and of the psychological research that is associated with it; but the book contains no satisfactory explanation either of what cognitive dissonance is or what cognitive dissonance theory is. The authors repeatedly say that cognitive dissonance theory predicts this and cognitive dissonance theory predicts that, but they never tell us what the theory is—an omission that diminishes not only the usefulness of their book but also the credibility of their argument. We cannot make any informed judgment of the value of the theory if we are never told what it is, but told only of its alleged predictive successes.

Aronson and Tavris offer an explanation of the term “cognitive dissonance” at one point; but it is quite inadequate. It occurs just after an account of the researches of social psychologist Leon Festinger and his collaborators on the response of the followers of a pretended seer, one Marian Keech, to the failure of her prophecy that on a certain date a spaceship would come to rescue them before the earth was destroyed.2 One might suppose, if one has not previously observed how the adherents of such prophecies behave when confronted with their failure, that the followers would be disillusioned and see that their faith in Mrs. Keech was misplaced. But Festinger, the authors report, made a more nuanced, specific, and, as it transpired, more accurate prediction:
The believers who had not made a strong commitment to the prophecy—who awaited the end of the world by themselves at home, hoping they weren’t going to die at midnight—would quietly lose their faith in Mrs. Keech. But those who had given away their possessions and were waiting with the others for the spaceship would increase their belief in her mystical abilities. In fact, they would now do everything they could to get others to join them. (12)
At the end, the authors observe, “Mrs. Keech’s prediction had failed, but not Leon Festinger’s.” They then move on to the theory to which they credit this prediction—the theory of cognitive dissonance. They write:
The engine that drives self-justification, the energy that produces the need to justify our actions and decisions—especially the wrong ones—is an unpleasant feeling that Festinger called “cognitive dissonance.” Cognitive dissonance is a state of tension that occurs whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent, such as “Smoking is a dumb thing to do because it could kill me” and “I smoke two packs a day.” Dissonance produces mental discomfort, ranging from minor pangs to deep anguish; people don’t rest easy until they find a way to reduce it. In this example, the most direct way for a smoker to reduce dissonance is by quitting. But if she has tried to quit and failed, now she must reduce dissonance by convincing herself that smoking isn’t really so harmful, or that smoking is worth the risk because it helps her relax or prevents her from gaining weight (and after all, obesity is a health risk, too), and so on. Most smokers manage to reduce dissonance in many such ingenious, if self-deluding, ways. (13)
The authors cite the pair of thoughts “Smoking is a dumb thing to do because it could kill me” and “I smoke two packs a day” as an example of “two cognitions that are psychologically inconsistent.” But is there any inconsistency at all between these two thoughts? Certainly they are not logically inconsistent: it is possible for both to be true. Nor is there any kind of probabilistic conflict between the two: it does not defy probability that both should be true. The authors say, in the paragraph immediately following the one just quoted, “Dissonance is disquieting because to hold two ideas that contradict each other is to flirt with absurdity . . .” But there is no contradiction between the two cognitions in the example.

The authors say that the two cognitions are psychologically inconsistent. But what is that supposed to mean? That no one can affirm both thoughts at the same time? But surely people can do so; if they could not, then this pair of cognitions could not be an example of cognitive dissonance! Wherein, then, is the “psychological inconsistency” supposed to consist? Perhaps in the fact that affirming both thoughts creates discomfort? But the discomfort was supposed to be the effect of a so-called psychological inconsistency. If the so-called inconsistency is nothing other than the discomfort itself, then the definition amounts to saying that cognitive dissonance is the state of tension that occurs whenever a person holds two cognitions that produce a state of tension—which tells us essentially nothing.

It is a dismal failing for a book to give no satisfactory explanation of the very concept that is at the core of its argument. We are left to figure out for ourselves what the concept is from the evidence of the use that the authors make of it.

One point about the concept that is clear is that it has an immediate bearing on the common human proclivity for self-justification. It is, in fact, supposed to provide the answer to the question implied by the book’s subtitle: “why we justify foolish beliefs, bad decisions, and hurtful acts.” We justify, or attempt to justify, such things because it is difficult for us to accept that our beliefs have been foolish, our decisions bad, or our acts hurtful. It is surely these negative evaluations of ourselves that are the source of the discomfort of which the authors speak. In the example quoted above, there is, as I said earlier, no inconsistency between the thoughts “Smoking is a dumb thing to do because it could kill me” and “I smoke two packs a day”; but the combination of those thoughts entails the thought “I do a dumb thing.” That implication, and not any inconsistency between the first two thoughts, is the source of our discomfort. To reduce dissonance, we must do things, or rather think things, that will allow us to avoid accepting that conclusion.
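
To make the logical point fully explicit, here is a minimal formalization; the letters and the bridge premise are my own shorthand, not anything taken from the book:

    D : “Smoking is a dumb thing to do (because it could kill me).”
    S : “I smoke two packs a day.”
    F : “I do a dumb thing.”

    Consistency:  {D, S} ⊬ ⊥        (both can be true of one and the same person)
    Entailment:   D, S, (D ∧ S) → F  ⊢  F   (by conjunction and modus ponens)

The bridge premise (D ∧ S) → F is the trivial step “if smoking is dumb and I smoke, then I do a dumb thing.” The discomfort attaches to F, the conclusion about oneself, not to any clash between D and S themselves.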

It seems to me that all of the examples discussed by the authors fit under this explanation of the concept better than they fit under the explanation that they give. Marian Keech could not give up the idea that she had visionary powers because she had built so much of her understanding and evaluation of herself upon that idea. Her most devoted followers could not give up that idea precisely because they had devoted themselves to her in quite costly ways: to admit that their faith in her was misplaced would be to admit that they had been extravagantly foolish.

Further, it is evident that many cases that fit under the authors’ definition will not illustrate what they mean by cognitive dissonance. Suppose, for instance, that I remember distinctly, or seem to remember distinctly, leaving a book in a certain place a short time ago, but that when I return to that place, I don’t find the book there (and suppose also that I have been alone in my room the whole time). This may cause me perplexity, consternation, irritation, frustration, and other unpleasant emotions, but it will not give rise to what Aronson and Tavris seem to have in mind when they use the term “cognitive dissonance.” Certainly it will not drive me to try to explain the non-appearance of the book in self-justifying ways. Rather, my reaction will most likely be first to look around to see whether the book has fallen down somewhere, and then, if that search does not turn it up, to conclude that my memory is at fault: I must have put the book somewhere else and forgotten doing so. Yet here we clearly have a case of discomfort produced by an inconsistent pair of cognitions—“I left the book right here (and no one else has been around to move it)” and “The book is not here.” There is no cognitive dissonance involved because the conflict between these two cognitions does not, or does not seriously, threaten my evaluation of myself. It does compel me to acknowledge the faultiness of my memory, but it will hardly be the first thing to have done that.

In sum, what the authors talk about under the heading of “cognitive dissonance” is not, as they say in their attempt at a definition of the term, an inconsistency between two cognitions, but an inconsistency between some body of cognitions and our estimation of ourselves.

17 June 2012: Correction made in penultimate paragraph: “will not give rise to” replaces “would give rise to.”

* * *

After writing the comment above, I came across the following passage in the Wikipedia article “Cognitive Dissonance”:
An overarching principle of cognitive dissonance is that it involves the formation of an idea or emotion in conflict with a fundamental element of the self-concept, such as “I am a successful/functional person,” “I am a good person,” or “I made the right decision.”
I wish that I had a better source for the attribution of this principle to the concept or the theory of cognitive dissonance than Wikipedia, but as far as it goes, it confirms the argument that I developed independently. What puzzles me is that something so obviously important would fail to make its way into the argument of Mistakes Were Made. Elliot Aronson, also according to Wikipedia (the article on him), “is listed among the 100 most eminent psychologists of the 20th Century,” “is the only person in the 120-year history of the American Psychological Association to have won all three of its major awards: for writing, for teaching, and for research,” and “in 2007 . . . received the William James Award for Lifetime Achievement from the Association for Psychological Science.” Why he and Carol Tavris failed to include this essential point in their exposition—which is virtually a non-exposition—of the central concept of their book, I do not know, but their omission confirms my suspicion that sloppiness in the handling of crucial concepts is very common in the discipline of psychology.


BIBLIOGRAPHICAL REFERENCES

1. Carol Tavris and Elliot Aronson, Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts (Orlando, etc.: Harcourt, 2007).

2. Leon Festinger, Henry W. Riecken, and Stanley Schachter, When Prophecy Fails: A Social and Psychological Study of a Modern Group that Predicted the End of the World (Minneapolis: University of Minnesota Press, 1956).

5 comments:

  1. Thanks for the excellent discussion. I too find it shocking how common it is to find serious conceptual slippage and logical sloppiness in the thinking of professional psychologists. And it is dismaying, also, that poorly defined technical terms, such as this business you've pointed out about 'cognitive dissonance' in the book, are deployed not just in academic theoretical discussions of human behavior, thought, and feeling, but also to "diagnose," and hence to serve as a ground for the "treatment" of, actual, flesh-and-blood human beings.

  2. A remark on psychological inconsistency: “Smoking is a dumb thing to do because it could kill me” and “I smoke two packs a day” are certainly not logically inconsistent under any interpretation. But I think the vague notion of "psychological inconsistency" is best read as a sort of akrasia. A Davidsonian reading would say that they imply "inconsistent" preferences, which can be seen if we add the proposition "I, ceteris paribus, want to lower the probability of myself dying" (and we generally attribute this proposition to people).

    But then again, preferences, by definition, cannot be inconsistent - they can only have an "unusual" transitive ordering; in this case the ordering would state that I prefer smoking to the increase in the probability of myself dying.

    And I think the "discomfort" can be seen exactly as an effect of imagining having to justify this ordering of preferences and the "inconsistency" (and yes, this a vague use of the term "inconsistency") can be seen as our failing to defend this ordering.

  3. So, I'm a tiny bit late commenting on this. But I don't think the inconsistency intended here is a logical one. I think it is grounded in other beliefs/desires a person holds, and can be thought of in light of the practical syllogism: if I believe cigarettes are harmful to me and I desire not to be harmed, the rational conclusion is that I should not smoke. If I am smoking, there is a contradiction between the action normatively prescribed by my beliefs/desires (not smoking) and what I am doing (smoking); they cannot fit into the same practical syllogism. So, I can revise my actions to be consistent with my beliefs/desires or revise my beliefs/desires to make my current action rational.

    I don't think that needs to involve self-evaluation, although it can, and I do think it can be about maintaining a consistent self-narrative. I suppose my account above makes it sound more about maintaining a rational self-narrative.

  4. Thanks for visiting and commenting, JG. I still have an unfinished draft of a further discussion of cognitive dissonance that I abandoned about a year ago when I left off posting here. Getting a comment makes me think of returning to work on it.

    Your analysis seems to me entirely defensible as a way of presenting the notion of cognitive dissonance; but the fact remains that it does not agree with the exposition given by Tavris and Aronson. In the passage that I quote, they flatly assert that in their example the cognitive dissonance obtains between the thoughts “Smoking is a dumb thing to do because it could kill me” and “I smoke two packs a day.” I like your account much better, but that is mainly because you are not guilty of the sort of ineptitude and logical confusion that vitiates their account. After writing this post I started reading Festinger’s A Theory of Cognitive Dissonance, which uses the same example, and found his account to be equally incoherent. You would think that, with 50 years of further discussion, a psychologist of Aronson’s standing would have become aware of these logical errors and corrected them; but apparently he is content simply to repeat them.

  5. Fair enough--I haven't read their book.

    I suspect people probably often intend something like that when talking about dissonance, if they were pushed to be clearer about what they really mean, but I could be mistaken.
