
Study shows impact of misleading headlines from mainstream news

By Sara Brown

One year after COVID-19 vaccines arrived, the U.S. had a vaccination rate of about 64%. That rate — lower than in most other countries with comparable access to vaccines — resulted in many preventable deaths.

A lot of blame was cast about, much of it targeting fake news stories and the social media platforms on which they spread. But new research by Jennifer Allen, SM ’22, PhD ’24, found that another overlooked source had a stronger influence on slowing vaccination rates: slightly misleading or provocative headlines from mainstream news sources.

“It’s true that misinformation flagged by fact-checkers is really persuasive when people see it,” said MIT Sloan professor David Rand, who coauthored the paper along with Duncan Watts, a professor at the Wharton School. But this persuasive power is offset by the fact that not many people are exposed to fake news stories: Fringe outlets don’t have huge reach, and the content is often removed quickly from social media platforms.

Rather, information that is not flagrantly false but that still raises skepticism about vaccines has more of an impact when it appears in mainstream media outlets and reaches a far larger audience. “When you net it out, the stuff that is not flagged by fact-checkers has a much bigger impact,” said Rand. “Though it’s not quite as persuasive as the false news stories, far more people see it.”

A new tool for finding causal effects

One of the breakthroughs in this research, described in an article recently published in the journal Science, was the method used to measure the degree to which different news stories influenced readers.

To quantify the persuasiveness of different content, the researchers conducted randomized experiments in which they showed thousands of survey participants headlines from 130 vaccine-related stories, both mainstream content and known misinformation. They then tested how each headline affected participants’ intentions to get vaccinated against COVID-19.

The researchers asked a separate group of participants to rate the headlines across various attributes, including plausibility and political leaning. They found that headlines suggesting that vaccines could be harmful to health, whether substantiated or not, most reliably depressed intentions to get vaccinated.
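
As a rough illustration of how such an experiment can be analyzed, the sketch below computes a per-headline effect as the difference between the average shift in vaccination intentions among participants shown that headline and the shift in a control group. This is not the authors' code, and the file and column names ("survey_responses.csv", "headline_id", "intent_pre", "intent_post") are hypothetical.

# A minimal analysis sketch, not the authors' code: estimate each headline's
# effect on vaccination intentions from a randomized survey experiment.
# The file and column names below are hypothetical.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")  # one row per participant

# Change in each participant's stated intention to get vaccinated
responses["delta"] = responses["intent_post"] - responses["intent_pre"]

# Average shift in the control group (participants shown no vaccine headline)
control_shift = responses.loc[responses["headline_id"] == "control", "delta"].mean()

# A headline's estimated effect is how much more (or less) intentions shifted
# among participants who saw it, relative to the control group.
effects = (
    responses[responses["headline_id"] != "control"]
    .groupby("headline_id")["delta"]
    .mean()
    .sub(control_shift)
    .sort_values()
)
print(effects.head())  # the most intention-reducing headlines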

46

Mainstream stories referencing potential harmful effects reduced vaccination intentions 46 times more than those flagged as misinformation by fact-checkers.

Combining these results, Allen and her coauthors blended human survey responses with natural language processing tools to predict the persuasive power of every vaccine-related headline viewed by more than 100 people on Facebook during the first three months of the vaccine rollout. This allowed them to roughly estimate the number of people who decided to forgo vaccination because of the headlines they saw.
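
The sketch below illustrates the general idea of that extrapolation step: fit a text model on headlines whose effects were measured experimentally, predict effects for every widely viewed headline, and scale by view counts. The TF-IDF plus ridge regression model is a stand-in rather than the paper's actual pipeline, and the file and column names are hypothetical.

# A minimal sketch of the extrapolation step, assuming a stand-in text model
# (TF-IDF plus ridge regression) rather than the paper's actual pipeline.
# File and column names are hypothetical.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge

rated = pd.read_csv("rated_headlines.csv")        # headline, measured_effect (from the experiment)
facebook = pd.read_csv("facebook_headlines.csv")  # headline, views (widely seen on Facebook)

# Fit a simple text model on headlines with experimentally measured effects
vectorizer = TfidfVectorizer(min_df=2)
X_train = vectorizer.fit_transform(rated["headline"])
model = Ridge(alpha=1.0).fit(X_train, rated["measured_effect"])

# Predict a per-exposure effect for every widely viewed headline, then scale
# by its view count to get a rough measure of aggregate impact.
facebook["predicted_effect"] = model.predict(vectorizer.transform(facebook["headline"]))
facebook["estimated_impact"] = facebook["predicted_effect"] * facebook["views"]

print(facebook.nsmallest(10, "estimated_impact")[["headline", "estimated_impact"]])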

Importantly, this method generalizes beyond vaccination rates and could be used to understand social media posts’ causal effects on any outcome, from brand attitudes to political polarization.

46 times more potent

Allen and her coauthors found that exposure to stories they defined as “vaccine-skeptical” — stories that were not outright false but alluded to potentially harmful health effects of the vaccine — reduced vaccination intentions 46 times more than misinformation flagged by fact-checkers.

“If we translate this into a specific number, we find that about 3 million people could have gotten vaccinated had they not been exposed to these stories,” Allen said. “Of course, there is then some correlation between vaccine uptake and lives saved, and so the number of preventable deaths also turns out to be relatively large.”

The basic explanation for this result is audience size. In total, the vaccine-related headlines that the researchers examined received 2.7 billion views on Facebook, and content flagged as misinformation received just 0.3% of those views. The most influential vaccine-skeptical headline, published by the Chicago Tribune, read “A Healthy Doctor Died Two Weeks After Getting a COVID Vaccine; CDC Is Investigating Why.” It reached more than 20% of Facebook’s U.S. user base and received more than six times as many views as all flagged misinformation combined.
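
A back-of-envelope calculation makes the reach argument concrete. In the sketch below, the view figures come from the article, while the per-view effect sizes are purely hypothetical; the point is only that a weaker per-view effect applied to a vastly larger audience can produce a much bigger total impact.

# Back-of-envelope illustration of why reach can outweigh per-view persuasiveness.
# View figures come from the article; the per-view effects are hypothetical,
# chosen only to illustrate the arithmetic.
total_views = 2.7e9                      # vaccine-related headline views on Facebook
flagged_views = 0.003 * total_views      # 0.3% flagged as misinformation
mainstream_views = total_views - flagged_views

effect_flagged = 0.005      # hypothetical drop in vaccination probability per view
effect_mainstream = 0.001   # hypothetical, smaller per-view effect

print("Flagged misinformation, total impact:   ", flagged_views * effect_flagged)
print("Vaccine-skeptical mainstream, total impact:", mainstream_views * effect_mainstream)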

Quantifying trade-offs

For media outlets, these results suggest a need for greater vigilance when writing headlines, especially given that more than 90% of people read nothing beyond the headline when browsing social media. “When you are writing a headline, you should not just be asking yourself if it’s false or not,” Rand said. “You should be asking yourself if the headline is likely to cause inaccurate perceptions.”


For the platforms, these results suggest the need for a more nuanced approach to content moderation. While identifying and eradicating the most egregiously false information is important, that process does not necessarily eliminate the most harmful information. Platforms ought to consider the reach of content, too, and potentially devote more attention to understanding and limiting the unfettered spread of harmful content that is misleading without being literally false.

While this suggestion complicates existing challenges around rights of free expression, the method provided by the researchers at the very least allows for more informed debate about what should and should not be acceptable. “In the case of vaccines, we were able to estimate that the people who saw this content were 2.3 percentage points less likely to get vaccinated,” Allen said. “We, as a society, may decide that’s an acceptable trade-off when balanced against free expression, or we may not, but it was previously impossible to make an informed decision because we didn’t know the magnitude of the effect or which content was most responsible.” 

Read next: MIT Sloan research about social media, misinformation, and elections

For more info: Sara Brown, Senior News Editor and Writer