You Are Not So Smart




  Table of Contents

  Title Page

  Copyright Page

  Dedication

  Introduction

  Chapter 1 - Priming

  Chapter 2 - Confabulation

  Chapter 3 - Confirmation Bias

  Chapter 4 - Hindsight Bias

  Chapter 5 - The Texas Sharpshooter Fallacy

  Chapter 6 - Procrastination

  Chapter 7 - Normalcy Bias

  Chapter 8 - Introspection

  Chapter 9 - The Availability Heuristic

  Chapter 10 - The Bystander Effect

  Chapter 11 - The Dunning-Kruger Effect

  Chapter 12 - Apophenia

  Chapter 13 - Brand Loyalty

  Chapter 14 - The Argument from Authority

  Chapter 15 - The Argument from Ignorance

  Chapter 16 - The Straw Man Fallacy

  Chapter 17 - The Ad Hominem Fallacy

  Chapter 18 - The Just-World Fallacy

  Chapter 19 - The Public Goods Game

  Chapter 20 - The Ultimatum Game

  Chapter 21 - Subjective Validation

  Chapter 22 - Cult Indoctrination

  Chapter 23 - Groupthink

  Chapter 24 - Supernormal Releasers

  Chapter 25 - The Affect Heuristic

  Chapter 26 - Dunbar’s Number

  Chapter 27 - Selling Out

  Chapter 28 - Self-Serving Bias

  Chapter 29 - The Spotlight Effect

  Chapter 30 - The Third Person Effect

  Chapter 31 - Catharsis

  Chapter 32 - The Misinformation Effect

  Chapter 33 - Conformity

  Chapter 34 - Extinction Burst

  Chapter 35 - Social Loafing

  Chapter 36 - The Illusion of Transparency

  Chapter 37 - Learned Helplessness

  Chapter 38 - Embodied Cognition

  Chapter 39 - The Anchoring Effect

  Chapter 40 - Attention

  Chapter 41 - Self-Handicapping

  Chapter 42 - Self-Fulfilling Prophecies

  Chapter 43 - The Moment

  Chapter 44 - Consistency Bias

  Chapter 45 - The Representativeness Heuristic

  Chapter 46 - Expectation

  Chapter 47 - The Illusion of Control

  Chapter 48 - The Fundamental Attribution Error

  Acknowledgements

  BIBLIOGRAPHY

  DUTTON

  Published by Penguin Group (USA) Inc.

  375 Hudson Street, New York, New York 10014, U.S.A.

  Penguin Group (Canada), 90 Eglinton Avenue East, Suite 700, Toronto, Ontario M4P 2Y3, Canada (a division of Pearson Penguin Canada Inc.) • Penguin Books Ltd, 80 Strand, London WC2R 0RL, England • Penguin Ireland, 25 St. Stephen’s Green, Dublin 2, Ireland (a division of Penguin Books Ltd) • Penguin Group (Australia), 250 Camberwell Road, Camberwell, Victoria 3124, Australia (a division of Pearson Australia Group Pty Ltd) • Penguin Books India Pvt Ltd, 11 Community Centre, Panchsheel Park, New Delhi–110 017, India • Penguin Group (NZ), 67 Apollo Drive, Rosedale, Auckland 0632, New Zealand (a division of Pearson New Zealand Ltd) • Penguin Books (South Africa) (Pty) Ltd, 24 Sturdee Avenue, Rosebank, Johannesburg 2196, South Africa

  Penguin Books Ltd, Registered Offices: 80 Strand, London WC2R 0RL, England

  Published by Dutton, a member of Penguin Group (USA) Inc.

  First printing, November 2011

  Copyright © 2011 by David McRaney

  All rights reserved

  REGISTERED TRADEMARK—MARCA REGISTRADA

  LIBRARY OF CONGRESS CATALOGING-IN-PUBLICATION DATA has been applied for.

  ISBN: 978-1-101-54535-5

  Without limiting the rights under copyright reserved above, no part of this publication may be reproduced, stored in or introduced into a retrieval system, or transmitted, in any form, or by any means (electronic, mechanical, photocopying, recording, or otherwise), without the prior written permission of both the copyright owner and the above publisher of this book.

  The scanning, uploading, and distribution of this book via the Internet or via any other means without the permission of the publisher is illegal and punishable by law. Please purchase only authorized electronic editions, and do not participate in or encourage electronic piracy of copyrighted materials. Your support of the author’s rights is appreciated.

  While the author has made every effort to provide accurate telephone numbers and Internet addresses at the time of publication, neither the publisher nor the author assumes any responsibility for errors, or for changes that occur after publication. Further, the publisher does not have any control over and does not assume any responsibility for author or third-party websites or their content.

  http://us.penguingroup.com

  For Jerry, Evelyn, and Amanda

  INTRODUCTION

  You

  THE MISCONCEPTION: You are a rational, logical being who sees the world as it really is.

  THE TRUTH: You are as deluded as the rest of us, but that’s OK, it keeps you sane.

  You hold in your hands a compendium of information about self-delusion and the wonderful ways we all succumb to it.

  You think you know how the world works, but you really don’t. You move through life forming opinions and cobbling together a story about who you are and why you did the things you did leading up to reading this sentence, and taken as a whole it seems real.

  The truth is, there is a growing body of work coming out of psychology and cognitive science that says you have no clue why you act the way you do, choose the things you choose, or think the thoughts you think. Instead, you create narratives, little stories to explain away why you gave up on that diet, why you prefer Apple over Microsoft, why you clearly remember it was Beth who told you the story about the clown with the peg leg made of soup cans when it was really Adam, and it wasn’t a clown.

  Take a moment to look around the room in which you are reading this. Just for a second, see the effort that went into not only what you see, but the centuries of progress leading to the inventions surrounding you.

  Start with your shoes, and then move to the book in your hands, then look to the machines and devices grinding and beeping in every corner of your life—the toaster, the computer, the ambulance wailing down a street far away. Contemplate, before we get down to business, how amazing it is humans have solved so many problems, constructed so much in all the places where people linger.

  Buildings and cars, electricity and language—what a piece of work is man, right? What triumphs of rationality, you know? If you really take it all in, you can become enamored with a smug belief about how smart you and the rest of the human race have become.

  Yet you lock your keys in the car. You forget what it was you were about to say. You get fat. You go broke. Others do it too. From bank crises to sexual escapades, we can all be really stupid sometimes.

  From the greatest scientist to the most humble artisan, every brain within every body is infested with preconceived notions and patterns of thought that lead it astray without the brain knowing it. So you are in good company. No matter who your idols and mentors are, they too are prone to spurious speculation.

  Take the Wason Selection Task as our first example. Imagine a scientist deals four cards out in front of you. Unlike normal playing cards, these have single numbers on one side and single colors on the other. You see from left to right a three, an eight, a red card, and a brown card. The shifty psychologist allows you to take in the peculiar cards for a moment and poses a question. Suppose the psychologist says, “I have a deck full of these strange cards, and there is one rule at play. If a card has an even number on one side, then it must be red on the opposite side. Now, which card or cards must you flip to prove I’m telling the truth?”

  Remember—three, eight, red, brown—which do you flip?

  As psychological experiments go, this is one of the absolute simplest. As a game of logic, this too should be a cinch to figure out. When psychologist Peter Wason conducted this experiment in 1977, less than 10 percent of the people he asked got the correct answer. His cards had vowels instead of colors, but in repetitions of the test where colors were used, about the same number of people got totally confused when asked to solve the riddle.

  So what was your answer? If you said the three or the red card, or said only the eight or only the brown, you are among the 90 percent of people whose minds get boggled by this task. If you turn over the three and see either red or brown, it does not prove anything. You learn nothing new. If you turn over the red card and find an odd number, it doesn’t violate the rule. The only answer is to turn over both the eight card and the brown card. If the other side of the eight is red, you’ve only confirmed the rule for that card, not proven it holds everywhere. If the brown has an odd number, you learn nothing, but if it has an even number you have falsified the claims of the psychologist. Those two cards are the only ones that provide answers. Once you know the solution, it seems obvious.

  What could be simpler than four cards and one rule? If 90 percent of people can’t figure this out, how did humans build Rome and cure polio? This is the subject of this book—you are naturally hindered into thinking in certain ways and not others, and the world around you is the product of dealing with these biases, not overcoming them.

  If you replace the numbers and colors on the cards with a social situation, the test becomes much easier. Pretend the psychologist returns, and this time he says, “You are at a bar, and the law says you must be over twenty-one years old to drink alcohol. On each of these four cards a beverage is written on one side and the age of the person drinking it on the other. Which of these four cards must you turn over to see if the owner is obeying the law?” He then deals four cards which read:

  23—beer—Coke—17

  Now it seems much easier. Coke tells you nothing, and 23 tells you nothing. If the seventeen-year-old is drinking alcohol, the owner is breaking the law, and either way you must also check the age of the beer drinker. Now the two cards stick out—beer and 17. Your brain is better at seeing the world in some ways, like social situations, and not so good in others, like logic puzzles with numbered cards.

  This is the sort of thing you will find throughout this book, with explanations and musings to boot. The Wason Selection Task is an example of how lousy you are at logic, but you are also filled with beliefs that look good on paper but fall apart in practice. When those beliefs fall apart, you tend not to notice. You have a deep desire to be right all of the time and a deeper desire to see yourself in a positive light both morally and behaviorally. You can stretch your mind pretty far to achieve these goals.

  The three main subjects in this book are cognitive biases, heuristics, and logical fallacies. These are components of your mind, like organs in your body, which under the best conditions serve you well. Life, unfortunately, isn’t always lived under the best conditions. Their predictability and dependability have kept confidence men, magicians, advertisers, psychics, and peddlers of all manner of pseudoscientific remedies in business for centuries. It wasn’t until psychology applied rigorous scientific method to human behavior that these self-deceptions became categorized and quantified.

  Cognitive biases are predictable patterns of thought and behavior that lead you to draw incorrect conclusions. You and everyone else come into the world preloaded with these pesky and completely wrong ways of seeing things, and you rarely notice them. Many of them serve to keep you confident in your own perceptions or to inhibit you from seeing yourself as a buffoon. The maintenance of a positive self-image seems to be so important to the human mind that you have evolved mental mechanisms designed to make you feel awesome about yourself. Cognitive biases lead to poor choices, bad judgments, and wacky insights that are often totally incorrect. For example, you tend to look for information that confirms your beliefs and ignore information that challenges them. This is called confirmation bias. The contents of your bookshelf and the bookmarks in your Web browser are a direct result of it.

  Heuristics are mental shortcuts you use to solve common problems. They speed up processing in the brain, but sometimes make you think so fast you miss what is important. Instead of taking the long way around and deeply contemplating the best course of action or the most logical train of thought, you use heuristics to arrive at a conclusion in record time. Some heuristics are learned, and others come free with every copy of the human brain. When they work, they help your mind stay frugal. When they don’t, you see the world as a much simpler place than it really is. For example, if you notice a rise in reports about shark attacks on the news, you start to believe sharks are out of control, when the only thing you know for sure is the news is delivering more stories about sharks than usual.

  Logical fallacies are like math problems involving language, in which you skip a step or get turned around without realizing it. They are arguments in your mind where you reach a conclusion without all the facts because you don’t care to hear them or have no idea how limited your information is. You become a bumbling detective. Logical fallacies can also be the result of wishful thinking. Sometimes you apply good logic to false premises; at other times you apply bad logic to the truth. For instance, if you hear Albert Einstein refused to eat scrambled eggs, you might assume scrambled eggs are probably bad for you. This is called the argument from authority. You assume if someone is super-smart, then all of that person’s decisions must be good ones, but maybe Einstein just had peculiar taste.

  With each new subject in these pages you will start to see yourself in a new way. You will soon realize you are not so smart, and thanks to a plethora of cognitive biases, faulty heuristics, and common fallacies of thought, you are probably deluding yourself minute by minute just to cope with reality.

  Don’t fret. This will be fun.

  1

  Priming

  THE MISCONCEPTION: You know when you are being influenced and how it is affecting your behavior.

  THE TRUTH: You are unaware of the constant nudging you receive from ideas formed in your unconscious mind.

  You are driving home from the grocery store and you realize you forgot to buy spinach dip, which was the only reason you went there in the first place. Maybe you could buy some at a gas station. Nah, you’ll just get it next trip. Thoughts of dip lead to ruminations on the price of gas, which lead to excogitation over bills, which leads to thoughts about whether you can afford a new television, which reminds you of the time you watched an entire season of Battlestar Galactica in one sitting—what the hell? You are home already and have no recollection of the journey.

  You drove home in a state of highway hypnosis, your mind and body seemingly floating along in parallel. When you stopped the car and turned the key, you snapped out of a dreamlike state sometimes called line hypnosis when describing the dissociative mental world of an assembly line worker stuck in a repetitive grind. In this place, consciousness drifts as one mental task goes into autopilot and the rest of the mind muses about less insipid affairs, floating away into the umbra.

  You split your subjective experience into consciousness and subconsciousness all the time. You are doing it right now—breathing, blinking, swallowing, maintaining your posture, and holding your mouth closed while you read. You could pull those systems into conscious control or leave them to the autonomic nervous system. You could drive cross-country consciously adjusting your foot on the gas pedal, shifting your hands on the wheel, mulling over the millions of micro decisions needed to avoid gnashing metallic death at high speeds, or you could sing along with your friends while the other parts of your mind handle the mundane stuff. You accept your unconscious mind as just another weird component of the human experience, but you tend to see it as a separate thing—a primal self underneath consciousness that doesn’t have the keys to the car.

  Science has learned otherwise.

  A great example of how potent a force your unconscious can be was detailed by researchers Chen-Bo Zhong at the University of Toronto and Katie Liljenquist at Northwestern in a 2006 paper published in the journal Science. They conducted a study in which people were asked to remember a terrible sin from their past, something they had done which was unethical. The researchers asked them to describe how the memory made them feel. They then offered half of the participants the opportunity to wash their hands. At the end of the study, they asked subjects if they would be willing to take part in later research for no pay as a favor to a desperate graduate student. Those who did not wash their hands agreed to help 74 percent of the time, but those who did wash agreed only 41 percent of the time. According to the researchers, one group had unconsciously washed away their guilt and felt less of a need to pay penance.

  The subjects didn’t truly wash away their emotions, nor did they consciously feel as though they had. Cleansing has meaning beyond just avoiding germs. According to Zhong and Liljenquist, most human cultures use the ideas of cleanliness and purity as opposed to filth and grime to describe both physical and moral states. Washing is part of many religious rituals and metaphorical phrases used in everyday language, and referring to dastardly deeds as being dirty or to evil people as scum is also common. You even make the same face when feeling disgusted about a person’s actions as you do when seeing something gross. Unconsciously, the people in the study connected their hand washing with all the interconnected ideas associated with the act, and then those associations influenced their behavior.