Why We Fall Prey to Misinformation
by Julie Deardorff
http://www.sesp.northwestern.edu/news-center/news/2016/08/why-we-fall-prey-to-misinformation.html
Even when we know better, our brains often rely on inaccurate or misleading information to make future decisions. But why are we so easily influenced by false statements such as “vaccinations cause autism” or “30 million illegal immigrants live in the U.S.?”
In a newly published review, Northwestern University psychologist David Rapp explains that people quickly download the inaccurate statements into memory because it’s easier than critically evaluating and analyzing what they’ve heard.
“Later, the brain pulls up the incorrect information first because it’s less work to retrieve recently presented material,” Rapp said. “If it’s available, people tend to think they can rely on it. But just because you can remember what someone said doesn’t make it true.”
It’s even harder to avoid relying on misinformation when accurate and inaccurate information is mixed together, said Rapp, the Charles Deering McCormick Professor of Teaching Excellence at the School of Education and Social Policy and a professor in the department of psychology at the Weinberg College of Arts and Sciences.
“We’re bombarded with tons of information all day; it’s a nightmare to critically evaluate all of it,” said Rapp, who coedited and contributed to the book “Processing Inaccurate Information.”
“We often assume sources are reliable. It’s not that people are lazy, though that could certainly contribute to the problem. It’s the computational task of evaluating everything that is arduous and difficult, as we attempt to preserve resources for when we really need them.”
In the political arena, “many candidates present information that’s just patently wrong; they haven’t thought about it or researched it,” Rapp said. “Then you go on Facebook and see ‘friends’ presenting the incorrect information.”
In his review, published in the journal Current Directions in Psychological Science, Rapp outlines several ways to avoid falling into the misinformation trap.
“Trump just says things, but once you can get them encoded into people’s memories, they believe it, use it or rely on it,” Rapp said. “Disentangling truth from falsehoods when they are mixed up from different sources makes the challenge even more difficult.”
The Consequences of Reading Inaccurate Information
by David N. Rapp
http://journals.sagepub.com/doi/full/10.1177/0963721416649347
We are regularly confronted with statements that are inaccurate, sometimes obviously so. Unfortunately, people can be influenced by and rely upon inaccurate information, engaging in less critical evaluation than might be hoped. Empirical studies have consistently demonstrated that even when people should know better, reading inaccurate information can affect their performance on subsequent tasks. What encourages people’s encoding and use of false statements? The current article outlines how reliance on inaccurate information is a predictable consequence of the routine cognitive processes associated with memory, problem solving, and comprehension. This view helps identify conditions under which inaccurate information is more or less likely to influence subsequent decisions. These conditions are informative in the consideration of information-design approaches and instructional methods intended to support critical thinking.