David McRaney

The Misconception / The Truth

You
The Misconception: You are a rational, logical being who sees the world as it really is.
The Truth: You are as deluded as the rest of us, but that’s OK, it keeps you sane.

Inattentional Blindness
The Misconception: You see everything going on before your eyes, taking in all the information like a camera.
The Truth: You are only aware of a small amount of the total information your eyes take in, and even less is processed by your conscious mind and remembered.

Change Blindness
The Misconception: You are aware of everything coming into your brain from your eyes from moment to moment.
The Truth: The brain can’t keep up with the total amount of information coming in from your eyes; your experience from moment to moment is edited for simplicity.

Learned Helplessness
The Misconception: If you are in a bad situation, you will do whatever you can to escape it.
The Truth: If you feel like you aren’t in control of your destiny, you will give up and accept whatever situation you are in.

Placebo Buttons
The Misconception: All buttons placed around you do your bidding.
The Truth: Many public buttons are only there to comfort you.

2 thoughts on “David McRaney”

  1. shinichi Post author

    Introduction from YOU ARE NOT SO SMART

    by David McRaney

    https://youarenotsosmart.files.wordpress.com/2012/04/samplechapter.pdf

    You

    The Misconception: You are a rational, logical being who sees the world as it really is.

    The Truth: You are as deluded as the rest of us, but that’s OK, it keeps you sane.

    You hold in your hands a compendium of information about self-delusion and the wonderful ways we all succumb to it.

    You think you know how the world works, but you really don’t. You move through life forming opinions and cobbling together a story about who you are and why you did the things you did leading up to reading this sentence, and taken as a whole it seems real.

    The truth is, there is a growing body of work coming out of psychology and cognitive science which says you have no clue why you act the way you do, choose the things you choose, or think the thoughts you think. Instead, you create narratives, little stories to explain away why you gave up on that diet, why you prefer Apple over Microsoft, why you clearly remember it was Beth who told you the story about the clown with the peg leg made of soup cans when it was really Adam, and it wasn’t a clown.

    Take a moment to look around the room in which you are reading this. Just for a second, see the effort which went into not only what you see, but the centuries of progress leading to the inventions surrounding you.

    Start with your shoes, and then move to the book in your hands, then look to the machines and devices grinding and beeping in every corner of your life—the toaster, the computer, the ambulance wailing down a street far away. Contemplate, before we get down to business, how amazing it is humans have solved so many problems, constructed so much in all the places where people linger.

    Buildings and cars, electricity and language—what a piece of work is man, right? What triumphs of rationality, you know? If you really take it all in, you can become enamored with a smug belief about how smart you and the rest of the human race have become.

    Yet you lock your keys in the car. You forget what it was you were about to say. You get fat. You go broke. Others do it, too. From bank crises to sexual escapades, we can all be really stupid sometimes.

    From the greatest scientist to the most humble artisan, every brain within every body is infested with preconceived notions and patterns of thought which lead them astray without them knowing it. So, you are in good company. No matter who your idols and mentors are, they too are prone to spurious speculation.

    Take the Wason Selection Task as our first example. Imagine a psychologist deals four cards out in front of you. Unlike normal playing cards, these have single numbers on one side and single colors on the other. You see, from left to right, a three, an eight, a red card, and a brown card. The shifty psychologist allows you to take in the peculiar cards for a moment and then poses a question: “I have a deck full of these strange cards, and there is one rule at play. If a card has an even number on one side, then it must be red on the opposite side. Now, which card or cards must you flip to prove I’m telling the truth?”

    Remember—three, eight, red, brown—which do you flip?

    As psychological experiments go, this is one of the absolute simplest. As a game of logic, this too should be a cinch to figure out. When psychologist Peter Wason conducted this experiment in 1977, less than 10 percent of the people he asked got the correct answer. His cards had vowels instead of colors, but in repetitions of the test where colors were used, about the same number of people got totally confused when asked to solve the riddle.

    So, what was your answer? If you said the three or the red card, or said only the eight or only the brown, you are among the 90 percent of people whose minds get boggled by this task. If you turn over the three and see either red or brown, it does not prove anything. You learn nothing new. If you turn over the red card and find an odd number, it doesn’t violate the rule. The only answer is to turn over both the eight card and the brown card. If the other side of the eight is red, you’ve only confirmed the rule, but not proven if it is broken elsewhere. If the brown has an odd number, you learn nothing, but if it has an even number you have falsified the claims of the psychologist. Those two cards are the only ones which provide answers. Once you know the solution, it seems obvious.
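
    To make that falsification logic concrete, here is a minimal Python sketch (mine, not the book’s; the card labels and the can_falsify helper are invented for illustration). It flags exactly the cards whose hidden side could reveal an even number paired with a non-red back:

        # Minimal sketch of the card-flipping logic (illustrative only; not from the book).
        # Rule under test: if a card shows an even number on one side, its other side must be red.
        # A card is worth flipping only if whatever is hidden could violate that rule.

        visible_faces = ["3", "8", "red", "brown"]

        def can_falsify(face):
            """True if flipping this card could expose an even number paired with a non-red back."""
            if face.isdigit():
                # An odd number can never violate the rule, whatever color is on its back.
                return int(face) % 2 == 0
            # A red card can never violate the rule, whatever number is on its back.
            return face != "red"

        print([face for face in visible_faces if can_falsify(face)])  # -> ['8', 'brown']

    Only the eight and the brown card pass the filter, matching the answer above; the three and the red card can confirm the rule but can never break it.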

    What could be simpler than four cards and one rule? If 90 percent of people can’t figure this out, how did humans build Rome and cure polio? This is the subject of this book—you are naturally hindered into thinking in certain ways and not others, and the world around you is the product of dealing with these biases, not overcoming them.

    If you replace the numbers and colors on the cards with a social situation, the test becomes much easier. Pretend the psychologist returns, and this time he says, “You are at a bar, and the law says you must be over 21 years old to drink alcohol. On each of these four cards a beverage is written on one side, and the age of the person drinking it on the other. Which of these four cards must you turn over to see if the owner is obeying the law?” He then deals four cards which read:

    23—beer—Coke—17

    Now, it seems much easier. Coke tells you nothing, and 23 tells you nothing. If the 17-year-old is drinking alcohol, he’s breaking the law, but if he isn’t, you must check the age of the beer drinker. Now the two cards stick out—beer and 17. Your brain is better at seeing the world in some ways, like social situations, and not so good in others, like logic puzzles with numbered cards.
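
    Rewritten for the bar version, the check has the very same structure (again my illustration, not the book’s; the must_check helper and the ALCOHOLIC_DRINKS set are assumptions): only a card that might hide an under-age drinker holding alcohol is worth turning over.

        # The same falsification logic in the drinking-age framing (illustrative only).
        # Rule under test: you must be over 21 to drink alcohol.
        # A card is worth flipping only if whatever is hidden could violate that rule.

        cards = ["23", "beer", "Coke", "17"]
        ALCOHOLIC_DRINKS = {"beer"}

        def must_check(face):
            """True if flipping this card could expose an under-age drinker with alcohol."""
            if face.isdigit():
                # A drinker clearly of legal age cannot violate the rule, whatever is in the glass.
                return int(face) < 21
            # A non-alcoholic drink cannot violate the rule, whatever the drinker's age.
            return face in ALCOHOLIC_DRINKS

        print([face for face in cards if must_check(face)])  # -> ['beer', '17']

    The logic is identical to the card puzzle; only the labels changed, which is why the social framing feels so much easier.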

    This is the sort of thing you will find throughout this book, with explanations and musings to boot. The Wason Task is an example of how lousy you are at logic, but you are also filled with beliefs which look good on paper but fall apart in practice. When those beliefs fall apart, you tend not to notice. You have a deep desire to be right all of the time and a deeper desire to see yourself in a positive light both morally and behaviorally. You can stretch your mind pretty far to achieve these goals.

    The three main subjects in this book are cognitive biases, heuristics, and logical fallacies. These are components of your mind, like organs in your body, which under the best conditions serve you well. Life, unfortunately, isn’t always lived under the best conditions. Their predictability and dependability have kept confidence men, magicians, advertisers, psychics, and peddlers of all manner of pseudoscientific remedies in business for centuries. It wasn’t until psychology applied rigorous scientific method to human behavior that these self-deceptions became categorized and quantified.

    Cognitive biases are predictable patterns of thought and behavior that lead you to draw incorrect conclusions. You and everyone else come into the world preloaded with these pesky and completely wrong ways of seeing things, and you rarely notice them. Many of them serve to keep you confident in your own perceptions or to inhibit you from seeing yourself as a buffoon. The maintenance of a positive self-image seems to be so important to the human mind that you have evolved mental mechanisms designed to make you feel awesome about yourself. Cognitive biases lead to poor choices, bad judgments, and wacky insights that are often totally incorrect. For example, you tend to look for information that confirms your beliefs and ignore information that challenges them. This is called confirmation bias. The contents of your bookshelf and the bookmarks in your web browser are a direct result of it.

    Heuristics are mental shortcuts you use to solve common problems. They speed up processing in the brain, but sometimes make you think so fast you miss what is important. Instead of taking the long way around and deeply contemplating the best course of action or the most logical train of thought, you use heuristics to arrive at a conclusion in record time. Some heuristics are learned, and others come free with every copy of the human brain. When they work, they help your mind stay frugal. When they don’t, you see the world as a much simpler place than it really is. For example, if you notice a rise in reports about shark attacks on the news, you start to believe sharks are out of control, when the only thing you know for sure is that the news is delivering more stories about sharks than usual.

    Logical fallacies are like math problems involving language in which you skip a step or get turned around without realizing it. They are arguments in your mind where you reach a conclusion without all the facts because you don’t care to hear them or have no idea how limited your information is. You become a bumbling detective. Logical fallacies can also be the result of wishful thinking. Sometimes you apply good logic to false premises, at other times you apply bad logic to the truth. For instance, if you hear Albert Einstein refused to eat scrambled eggs, you might assume scrambled eggs are probably bad for you. This is called the argument from authority. You assume if someone is super smart, then all of their decisions must be good ones, but maybe Einstein just had peculiar taste.

    With each new subject in these pages you will start to see yourself in a new way. You will soon realize you are not so smart, and thanks to a plethora of cognitive biases, faulty heuristics and common fallacies of thought, you are probably deluding yourself minute by minute just to cope with reality.

    Don’t fret. This will be fun.

