Why We Continue to Believe False Information Even After We Learn It's Not True
Written by Family First Adolescent Services, in the Family Resources section
Believing false information is a strange phenomenon we all deal with
Is your mental library a haven of accurate and well-informed facts, or are there mistruths hiding on the shelves? It’s natural to assume that we update our beliefs in line with the most recent and well-established evidence. But what really happens to our views when a celebrity endorses a product that is later discredited by science, or when a newspaper publishes a story that is later retracted?
A recent paper from the Journal of Consumer Psychology presents a novel take on this topic, by investigating the continued influence effect. Anne Hamby and colleagues suggest that our likelihood of continuing to believe retracted information depends on whether or not it helps us to understand the cause-and-effect structure of an event. Crucially, the team proposes, we would rather have a complete understanding of why things happen than a perspective which is more accurate, but less complete.
In the first study, participants read a scenario in which an unwell character takes medication that fails to cure him. Whilst one group was informed the drug was ineffective because the character took it at the wrong time, the other group was given no explanation. Both groups were told the character took the medication with a glass of lemonade and were then informed the drug is ineffective if consumed with citrus-based drinks. Later, all participants were notified that this last fact was untrue.
Information We Later Discover to be False is Harder to Ignore When it Fills an Explanatory “Gap” in a Story
A day later, participants were asked to recall why the drug had been ineffective. Those who received no explanation for the drug failure in the original anecdote were more likely to incorrectly reference the lemonade drink as the reason why the medication didn’t work — even though most remembered that this information had later been retracted. This suggests information we later discover to be false is harder to ignore when it fills an explanatory “gap” in a story.
This idea was further supported by a second study, in which participants read the following extract from a successful poker player: “I reach down and pull out my bottle of kombucha which I like to drink at poker matches. I take a long deep swig from the bottle. And I have clarity of mind. I fold.”
One group of participants was informed kombucha supports mental performance whilst another group was told the drink increases muscular function. This allowed those in the mental performance group to infer a link between kombucha drinking and poker success, whereas the muscular performance group could not. Later, all individuals were told the link between kombucha and mental performance or muscular function was untrue.
Participants who read that kombucha was related to mental performance were more likely to recall the false information as true, and more likely to attribute it to the poker player’s win. When offered a selection of beverages to drink after the experiment, this group was also more likely to select kombucha. Again, it seems that when information supports our understanding of a story’s cause-and-effect structure, it is particularly “sticky”, even when we are told it isn’t true.
In a final study, the team repeated the original failed-medication scenario but gave half the sample a positive ending to the story, in which the patient changes other aspects of his routine and improves. The other half were given a negative ending, in which the patient is unable to receive further treatment and does not get better. The researchers found the retracted information about citrus drinks had less of a continued influence on those who read the negative ending. They suggest this is because we are more motivated to accurately understand what leads to negative outcomes, as this will help us survive similar experiences in the future.
As a whole, it seems we have a general bias toward creating a complete mental picture of an event, rather than one that is factually accurate but lacking a cause-and-effect explanation. However, Hamby and colleagues also indicate this bias can be overcome when we recognize that prioritizing accuracy over completeness will help us in the future, or when a retraction offers an alternative explanation for the cause-and-effect structure of an event. Research such as this, therefore, makes us more aware of the chinks in our mental armor — and better equipped to defend them.
Read the original article – How Stories in Memory Perpetuate the Continued Influence of False Information
Post was written by Rhi Willmot (@rhi_willmot) for BPS Research Digest. Rhi is a psychologist with an interest in wellbeing and has explored how topics from positive psychology influence healthy lifestyle behavior. As a keen runner, Rhi is also interested in the relationship between psychology and optimal performance. She has published internationally and worked on a number of transdisciplinary programs, including an initiative to reduce food waste via altering perceptions of “ugly” fruit and vegetables, and a project to enhance the quality of life in deprived areas of Mexico.