YouTube Viewers May Be ‘Inoculated’ Against Online Misinformation, Study Finds
A series of studies found that YouTube viewers who watched short informational videos intended to “inoculate” them against harmful social media content were more likely to recognize unreliable information and less likely to share it with others, findings the researchers say could help combat online misinformation.
A team of researchers from the University of Cambridge and the University of Bristol, working with Beth Goldberg, head of research and development at Jigsaw, created five 90-second videos designed to “inoculate” study participants against misinformation by teaching them warning signs to watch out for. The approach is based on what psychologists call “inoculation theory”: the idea that if people are given a small dose of misinformation early on, they will be less susceptible to it in the future.
The peer-reviewed study was funded by Google Jigsaw, an arm of Alphabet that aims to develop technological solutions to societal problems. (Alphabet also owns YouTube, which has come under intense scrutiny from critics who say the platform amplifies misinformation and harmful content.)
After viewing the videos, participants were asked to identify the manipulation techniques they had seen. Compared with a control group, the “inoculated” participants were in some cases more than twice as good at identifying the techniques, according to the study published Wednesday in the journal Science Advances.
In a final study – the first real-world test of inoculation theory on a social media platform – Google Jigsaw found that when YouTube viewers in the U.S. were shown one of the inoculation videos, their ability to recognize the manipulation techniques increased by 5%, which Google says is significant and five times higher than the returns of its similarly sized YouTube ad campaigns.
These results show that psychological inoculation can “be easily scaled up to hundreds of millions of users worldwide,” said Sander van der Linden, head of the Social Decision-Making Lab at Cambridge, who led the study, in a statement.
“Prebunking” misinformation can also be more effective than conventional fact-checking, which the authors noted is “impossible” to do at scale and can actually worsen the spread of conspiracy theories when debunking feels like a personal attack on the people who hold those beliefs, according to the University of Cambridge.
Each video focused on one of five common manipulation techniques: emotionally manipulative language, incoherence, false dichotomies, scapegoating and ad hominem attacks. The clips are available to view here.
Jigsaw is set to roll out a “prebunking” campaign later this month across multiple platforms targeting users in Poland, Slovakia and the Czech Republic in an effort to counter misinformation related to Ukrainian refugees.
For several years, social media companies have focused on combating the spread of disinformation on their platforms, particularly around political news, such as the results of the 2020 presidential election, and health information, such as the safety and effectiveness of Covid-19 vaccines. More recently, platforms have struggled with how to address the circulation of misinformation about abortion.