Mistakes Were Made
What's it about
Ever wondered why it’s so hard to admit you were wrong? Learn the surprising science behind why your brain protects you from facing your mistakes, and discover how this instinct, meant to help, often holds you back from growth, happiness, and stronger relationships. This summary unpacks the powerful force of cognitive dissonance and self-justification. You'll see how these mental biases shape your memories, create blind spots, and fuel conflicts. Gain practical tools to break the cycle, start learning from your errors, and build a more honest, authentic life.
Meet the author
Dr. Carol Tavris is a renowned social psychologist and a fellow of the American Psychological Association whose work bridges the gap between psychological science and public understanding. Her distinguished career, including co-authoring the acclaimed textbook "Psychology," has been dedicated to debunking pseudoscience and exploring the mechanisms of human self-justification. This lifelong focus on cognitive dissonance and our powerful need to feel right, even when we are wrong, provided the foundational research and critical insights for her groundbreaking book, "Mistakes Were Made (But Not by Me)."

The Script
Think of the last time you were proven wrong about something you felt certain about. Did you feel a sense of relief, a gratitude for being corrected? Or did you feel a sting, a flush of embarrassment, a sudden urge to explain why your initial belief was still somehow valid? Most of us feel the latter. We treat being wrong not as a neutral event, like discovering a typo in a document, but as an attack on our intelligence and character. This reaction is so universal, so automatic, that we barely notice it. We construct elaborate fortresses of justification to protect the ego, brick by brick, until the original mistake is completely walled off from view. This is a feature of the human mind designed for consistency, not for truth. Our brains are built to feel right, not to be accurate.
This powerful, often invisible, force of self-justification is the central subject that social psychologists Carol Tavris and Elliot Aronson have spent their careers investigating. They observed this pattern not just in petty arguments, but in the highest echelons of power, in courtrooms, in therapy sessions, and in their own lives. They saw how this psychological mechanism explained how good people could do bad things, how smart people could cling to foolish ideas, and how entire societies could march confidently in the wrong direction. Tavris, a renowned writer and researcher known for her ability to translate complex psychological concepts for a public audience, teamed up with Aronson, one of the most influential psychologists of the modern era, to write a definitive account of this phenomenon. They wrote "Mistakes Were Made" to give a name to this universal experience and expose the hidden engine of cognitive dissonance that drives so much of our behavior, from the trivial to the tragic.
Module 1: The Engine of Self-Justification
Imagine you're a heavy smoker. You know smoking causes cancer. This creates a mental clash, a state of psychological tension the authors call cognitive dissonance. You have two conflicting ideas: "I am a smart, sensible person" and "I am doing something that is demonstrably foolish and harmful." To resolve this discomfort, you have to change something. Quitting is hard. So, it's often easier to change your belief. You start justifying. "It helps me relax." "The science isn't totally settled." "My grandpa smoked and lived to be 90."
This is the engine of self-justification in action. It isn't deliberate deception; it's a way of lying to ourselves, and we are incredibly good at it. The book reveals that self-justification is an unconscious process to reduce the mental discomfort of cognitive dissonance. You aren't consciously trying to deceive yourself. Your brain is automatically working to restore internal consistency, to make your actions and your self-concept align.
A classic example from the book involves a doomsday cult. The members sold their possessions and quit their jobs, convinced a spaceship would rescue them before a global flood. When the spaceship didn't arrive, they didn't admit they were wrong. That would create unbearable dissonance. Instead, their leader announced their faith had saved the world. The members didn't just accept this; they became even more fervent, going out to proselytize. They had to justify their immense personal investment. The greater the sacrifice, the stronger the need to believe.
This brings us to a crucial insight. The more we invest in a decision—whether it’s time, money, or effort—the stronger our need to justify it. Think of a founder who has poured years into a failing startup. Or a manager who hired someone who turned out to be a poor fit. Admitting the mistake feels like admitting that all that effort was a waste. So, they double down. They pour more money into the startup. They find reasons to praise the underperforming employee. This is the "sunk cost" fallacy in action, powered by dissonance.
So what happens next? This process isn't static. It creates a path. Self-justification creates a pyramid of choice, where small initial decisions lead to increasingly extreme positions. Imagine two students, both of whom believe cheating is wrong. They both face a temptation to cheat on a major exam. One gives in, the other resists. The one who cheated will start to soften their stance on cheating. "It wasn't a big deal." "Everyone does it." The one who resisted will harden their stance. "Cheaters should be expelled." They started at the same point at the top of the pyramid. But with one small decision, they began sliding down opposite sides. By the bottom, they are miles apart, each viewing the other as morally incomprehensible. This is how smart, decent people can find themselves defending actions they once would have condemned.