This is a summary of the core concepts in the #1 New York Times best-selling book Think Again by Adam Grant. The book provides many compelling examples that allow for a deeper understanding of these principles, most of which are not included in this summary, so if you like the concepts introduced here, I highly suggest reading the book. I personally learned a lot from this book, including how cognitively lazy I’ve been, and I hope you do too.
“Progress is impossible without change; and those who cannot change their minds cannot change anything.”
George Bernard Shaw
Part 1: Individual Rethinking
The book begins by introducing four different mentalities around how a person may treat knowledge and learning.
The Preacher: The preacher mentality treats beliefs as facts, even when the evidence doesn’t support them. Preachers have decided that what they know is correct simply because they believe it to be right; they don’t care whether their beliefs are actually true, being more concerned with preaching to others about how right those beliefs are.
The Prosecutor: The prosecutor mentality is most concerned with telling other people how they are wrong. They are the fact checkers, but not for the sake of truth, more for the sake of their own ego. They thrive on proving people wrong, not on finding out what is right.
The Politician: The politician mentality combines the preacher and prosecutor mentalities by not only claiming that their opinion is fact, but that everyone else’s opinion is wrong. Their combination of charisma and prowess at attacking opposing viewpoints makes others follow them on blind faith, bound by mutual agreement that the opposing side sucks and that views held by the opposing side are therefore automatically incorrect.
The Scientist: The scientist mentality is the pursuer of truth. Their focus is not on needing to be right or on attacking the other side like the other mentalities; their goal is accurate knowledge. They will admit their lack of knowledge and switch viewpoints on a dime when presented with evidence that disconfirms their current belief system.
This book is about how to treat learning like a scientist does. A scientist is constantly examining and re-examining her beliefs. A scientist recognizes that what she was told by her friend in 6th grade may be inaccurate and understands that, when it comes down to it, what she doesn’t know exponentially exceeds what she does know. A scientist shows wisdom in knowing how little he (yes, I switched genders intentionally) knows and in rethinking assumed knowledge. The book’s main message is that “being a scientist is not just a profession. It’s a state of mind.”
Among the many examples in the book, there are a few I will share to illustrate how adopting a scientific mindset is advantageous. The first is an example of forest firefighters who were caught in a wildfire. The firefighters were running, with heavy gear on their backs, to avoid being incinerated by the rapidly accelerating fire, but only one person thought to drop his gear so that he could run faster. Forest firefighters are taught to treat their gear with reverence, as it may be the difference between life and death, but when running for their lives their gear served as a detriment, not a boon, and still most never even considered dropping it. The second example is that of the BlackBerry, a phone whose sole purpose was to send and receive emails. The BlackBerry company was worth $70 billion in 2008 and had the chance to pivot its focus to incorporate text messages, but its founder was stuck in thinking that his idea was best, and now the company has all but fallen into obscurity. The third example is that of Steve Jobs, co-founder of Apple (then a computer company), and the iPhone. When the idea of the iPhone was pitched to Steve Jobs, he told his developers that it was “the dumbest idea [he’d] ever heard.” Unlike the founder of BlackBerry, Steve Jobs was open to being wrong. Within four years the iPhone accounted for half of Apple’s revenue.
The scientific mindset has been shown in studies to improve business revenue. One study researched the impact of scientific thinking on start-up companies. A group of start-ups adopted a scientific approach of testing their ideas and reevaluating their products based on empirical results; it was compared against a control group that did business as usual. After the first year, the average revenue of the start-ups with the scientific mindset was $12,071.87, while the control group’s average revenue was $255.40. It pays to rethink your opinions, literally.
The book is replete with psychological knowledge. Anton’s syndrome is a condition in which a person is completely oblivious to a physical disability: quite literally, a blind person believing they aren’t blind, instead believing that they are perpetually in a dark room. This condition emphasizes the human tendency to turn a blind eye to what is undesirable. The Dunning-Kruger effect is a psychological effect about the arrogance of ignorance: the human tendency to overestimate our own competence, which means we are most likely to be overconfident precisely when we are ignorant. Knowledge breeds reservation.
One thing that distinguishes the scientific mindset from the others is that scientists find joy in being wrong. Admitting that you were wrong means that you are less wrong than you were a second ago. Unfortunately, most people not only get defensive and outright reject knowledge that contradicts their beliefs offhand, but they seek to surround themselves with like-minded individuals that echo their belief system, a habit that only entrenches them deeper in their beliefs. What distinguishes a scientist from others is that a scientist is not attached to their beliefs or ideas. They don’t care if they are wrong. In fact, they seek it out.
Forecasting tournaments are competitions in which participants try to predict the outcomes of future events. They are judged both by the accuracy of their predictions and by how well they gauge their own odds of being wrong. “The best forecasters have confidence in their predictions that come true and doubt in their predictions that prove false.” The most important predictor of winning a forecasting tournament? Updating and changing one’s beliefs.
Our current society does not explicitly value a scientific mindset in many areas of life, and being wrong is seen as a negative thing. Or is it? Psychologists have studied this concept, and most people see admitting wrongness as a positive trait, not a negative one. The book gives the example of the British astronomer Andrew Lyne, who made a revolutionary astronomical discovery and was set to present it at an astronomy conference. Before he could present, he realized that he had miscalculated and his discovery was null and void. He walked into the conference and candidly admitted his mistake. The confession was met with a standing ovation, and one astrophysicist called it “the most honorable thing I’ve ever seen.” Psychology goes a step further: studies show that people who take themselves less seriously, and consequently make fun of themselves (about being wrong) more frequently, are statistically happier. Not only are you viewed more positively by others, but you are also happier if you can get comfortable being wrong.
A big cultural and psychological factor in the umbrella problem of being wrong is how people treat others who are wrong. The fundamental attribution error is the human tendency to underemphasize environmental or contextual factors and overemphasize personality factors when analyzing others’ behavior. In other words, we are judgmental as fuck and blame other people’s characters for their differing viewpoints rather than recognizing the life experiences and upbringing that shaped their opinions. Most people think “well, that person is just an idiot” instead of “I wonder what contributed to them arriving at that conclusion.” The fundamental attribution error leads people to attack a person’s character rather than their ideas, which results in people (on both sides) digging in their heels instead of analyzing whether their opinion is factual.
The author recommends that everyone have a trusted team of disagreeable people: people who will openly challenge your beliefs. “Their role is to activate rethinking cycles by pushing us to be humble about our expertise, doubt our knowledge, and be curious about new perspectives.” The book cites the invention of the airplane and the success of the movie The Incredibles as examples of how consistent disagreement led to success. The Wright brothers apparently argued all the time about how to make a plane work, but their disagreements were an attempt to find what worked, not to make each other feel stupid. The Incredibles was initially rejected as an idea because the cost of animation at the time, using traditional techniques, was too high. The director, not dissuaded, chose a team of disagreeable people to solve the problem of high animation costs because he knew they would challenge each other until a feasible solution was found, and it was. The team revolutionized how animation was done for a fraction of the cost, and the movie was a resounding success. Disagreement doesn’t have to involve dogmatically talking at someone or tearing someone down; it can be intellectually stimulating and illuminating if your self-esteem isn’t threatened by the possibility of being wrong.
Part 2: Interpersonal Rethinking
How do people change others’ minds? To answer this question Adam Grant analyzed expert negotiators and debaters to see how they change other people's minds and found five main commonalities between them.
They agree with good arguments, even if the opposition makes them. Outright dismissal of good ideas just because they come from the opposing side only serves to demonstrate one’s rigid bias. In fact, expert persuaders do the exact opposite and purposefully seek common ground with their opponents from the start of the conversation.
They focus on the quality of arguments, not the quantity. Bad arguments undermine their position so each point made is thoroughly vetted and researched before any assertion is made. They recognize which of their own arguments are bad ones and why they are bad.
They aren’t offensive or defensive in their attitudes, their position is opinionated neutrality and they try to prove their point through facts, not bias. Facts speak for themselves and attacking or defending a position is an indicator of a commitment to being right for ego’s sake, not truth’s sake. Their worth is not on trial, their opinions are.
They ask questions. They invite their opponent to think about their stance as opposed to beating their opinions into the competition (which doesn’t work). “Psychologists have long found that the person most likely to persuade you to change your mind is you.”
They use their feelings in the conversation more. Feelings serve as a rough draft for experts and they share their feelings as a way of exploring their positions in order to recognize initial bias and rethink their positions.
It should be noted that relying solely on facts isn’t enough. Harish Natarajan, winner of 36 international debate tournaments, competed against a computer in a debate to see who would win. While the audience said they learned more from the computer (which makes sense), they changed their opinions more in response to Harish’s arguments. Harish’s victory was attributed to a tactic that appeared only rarely in the computer’s corpus of 400 million human arguments: agreeing with the opposition. Dogmatism does not change people’s minds. Furthermore, studies show that moderate confidence is more convincing than either high or low confidence.
You can lead a horse to water, but you can’t make it think.
The Hierarchy of Arguments
Daryl Davis is a Black country musician who convinced over 200 KKK members to leave the organization; he was even made a godfather by one of the ex-members he influenced, a former Imperial Wizard (the national leader of the KKK). How did he do it?
But first, how is prejudice formed? Group polarization is a psychological concept describing the phenomenon of people’s beliefs becoming more extreme when they talk with others who hold the same views. Groupthink is a psychological principle that occurs when members of a group opt to go with the group consensus rather than engage in critical thinking, which often leads to irrational or detrimental outcomes. Basically, people become more entrenched in their opinions either by amping each other up or through peer pressure and psychological laziness. There are many other psychological factors at work besides these two that influence the formation of prejudice, but they won’t be covered in this book summary.
Adam Grant conducted several studies to find how to decrease prejudice. He concluded that focusing on a common identity didn’t work, nor did humanizing the other side as the humanized person was deemed an exception to the prejudiced rule. What worked was getting people to think about the arbitrariness of their own position, what made their prejudice baseless. One specific technique that worked was having people consider how they might think differently if they had grown up differently. “People gain humility when they reflect on how different circumstances could have led them to different beliefs.” “Psychologists have found that many of our beliefs are cultural truisms: widely shared, but rarely questioned. If we take a closer look at them, we often discover that they rest on shaky foundations.” Adam Grant’s findings emphasize the importance of unlearning in order to avoid being ignorant.
“What doesn’t sway us can make our beliefs stronger.” In other words, if someone refuses to listen to evidence against their belief when it is presented, that incorrect belief becomes even less likely to be impacted by future contradicting evidence. This is a psychological principle called the backfire effect, though it is perhaps best described as the doubling-down effect. The sunk cost fallacy in economics is another idea that produces similar results. Psychologically, the average person would rather crash and burn espousing an incorrect idea than accept an accurate one if it means having to come to terms with something they don’t want to be true. Yikes.
A great book that studies how people make decisions is The Righteous Mind: Why Good People Are Divided by Politics and Religion by Jonathan Haidt. What he found by studying tens of thousands of people is that most people don’t make decisions based on logic, even if they think they do. The vast majority of people make decisions based on their emotions, even when their position is undeniably proven to be wrong. “Psychologists find that people will ignore or even deny the existence of a problem if they’re not fond of the solution.” Emotions drive behavior, not logic. We believe what we want to believe, not what is true. That is, unless you think like a scientist.
If presenting contrary evidence doesn’t work, what does? The book recommends a therapy technique called Motivational Interviewing (MI), which has a 75-80% success rate at changing a person’s mind. Core principles of MI include asking open-ended questions (not yes-or-no questions), engaging in reflective listening (listening with the intention to understand rather than to convince, and checking for correct understanding), and affirming a person’s desire to change and ability to grow (cheerleading of a sort). To use MI effectively, the attitude of the user matters a lot: effective MI means the user is empathetic, nonjudgmental, and attentive. Statistically, trying to use MI for manipulation doesn’t work, as people can sense it.
The secret of MI is remarkably simple, it’s just good listening with no other agenda. Unfortunately, most people don’t know how to actually listen and overestimate how good they are at listening. “Among managers rated as the worst listeners by their employees, 94% of them evaluated themselves as good or very good listeners.” “In one poll, a third of women said their pets were better listeners than their partners.” Believing you are good at listening doesn’t mean that you are and, at least based on these studies, being a good listener seems like a rare trait.
A good listener’s sole agenda is understanding the person they are listening to, something that is much harder than it sounds (even for a mental health professional like me who does this for a career; my therapist recently called me a bad listener, and he was right. Denial is a bitch). Inverse charisma is a term that captures the magnetic quality of a good listener. The writer E. M. Forster was apparently one such listener, and “in the words of one biographer, ‘To speak with him was to be seduced by an inverse charisma, a sense of being listened to with such intensity that you had to be your most honest, sharpest, and best self.’” To sum this idea up, “listening is a way of offering others our scarcest, most precious gift: our attention.”
Part 3: Collective Rethinking
Another contributor to ignorance is a psychological principle called binary bias. Binary bias is “the human tendency to seek clarity and closure by simplifying a complex continuum into two categories.” In other words, it is thinking in extremes and dividing people into two opposing stances. Also known as black-and-white thinking, it is one of many erroneous patterns of thought called thinking errors. We intentionally or unintentionally self-generate ignorance in order to avoid cognitive complexity.
How would one go about changing bias due to laziness? By reintroducing complexity! Not being psychologically lazy reduces bias and, therefore, ignorance. Complexity can be increased by seeing any given idea as a spectrum rather than binary categories, recognizing caveats, and highlighting contingencies to one’s opinions. Furthermore, recognizing the value of learning to question knowledge and what we think we know is foundational to healthy thinking.
Most of us aren’t taught to question what we are taught until college, or at least that was my experience, despite how important it is. A simple exercise of reading older generations' school textbooks would be more than adequate to demonstrate that our collective knowledge progresses with time, yet many of us fail to consider such concepts when it comes to reevaluating our own knowledge. Our thinking at 10 should not be representative of what we think at 25, or what we think at 25 representative of what we think at 60.
Active learning is a method of teaching that teaches students how to learn, not just to read and regurgitate information. Most people prefer the passive approach to learning, possibly due to its prevalence, despite its comparative ineffectiveness when contrasted with active learning. One study found that students were 1.55 times more likely to fail a class taught with passive learning than one taught with active learning. The awestruck effect is a phenomenon in which people are less likely to critically evaluate a lecture when the lecturer is eloquent because they are swept up in the emotions of the speech. “Experiments have shown that when a speaker delivers an inspiring message, the audience scrutinizes the material less carefully and forgets more of the content - even while claiming to remember more of it.” So with passive learning, not only are people more likely to unquestioningly accept false information, but they are also less likely to remember relevant information. The solution? The same as before: be less cognitively lazy.
Active learning means thinking for oneself and applying knowledge. This is accomplished in groups by rigorous group feedback about one’s quality of work (or opinions) and then integrating that feedback to present a new and improved version. The overall impact of this method of rethinking and redoing is that overall quality rises, sometimes quite drastically. In active learning, “quality means rethinking, reworking, and polishing.” In short, active learning teaches new ways of thinking, a quality that distinguishes a good teacher from a great one.
How can a culture of active learning be cultivated? The answer lies in creating psychological safety, the freedom to make mistakes without judgment or punishment. This allows for the correction of errors rather than the hiding of mistakes, a habit that inevitably leads to making the same mistakes over and over rather than correcting them. Expressing vulnerability about one’s own mistakes also helps others learn that mistakes are just a part of learning, not something to be ashamed of. Process accountability is evaluating how decisions are made throughout a process rather than just focusing on the final product. Process accountability leads to deep, critical thinking and rethinking, a path that can only lead to improved overall outcomes. Confident humility encapsulates the attitude of a healthy active-learning culture: confidence in one’s ability and competency while recognizing that everyone, oneself included, both makes mistakes and has room for improvement.
“There is a fine line between heroic persistence and foolish stubbornness.” Rethinking prevents a person from unwittingly engaging in foolish stubbornness and is scientifically shown to produce better outcomes in most, if not all, areas than rigidly adhering to and refusing to reconsider one’s perspectives and beliefs. There is value in routinely questioning one’s beliefs and updating one’s understanding when new evidence is presented. The alternative is childish. I say childish because refusal to change one’s position is a direct result of a lack of emotional maturity and self-esteem. If one never grows out of throwing emotional tantrums when they don’t like the truth, they will never mature.
The book ends by discussing how happiness is attained, and it’s not by valuing happiness more. Mirroring what I’ve learned over years of helping clients find inner happiness, the author asserts that happiness is found through the development of self-esteem, the same thing that opens people up to truth and reduces bias. Rethinking doesn’t just apply to concepts and ideas; it applies to how you think about yourself. Learn to define yourself in terms of your values, not your opinions.
Conclusion:
There are many salient points made in this book, but if I had to highlight just one it would be this: whenever you think you know something, stop and ask yourself the question “how could I be wrong?”