Social media experiment shows it’s possible to get millions of users to recognise misinformation and avoid it themselves.
Social media platforms have tried to flag false claims and debunk them after the fact. Yet misinformation continues to cause more damage than ever before, influencing everything from elections and public health to the treatment of immigrants and refugees.
In collaboration with Jigsaw, a unit within Google that tackles threats to open societies, a team of researchers at the University of Cambridge and the University of Bristol in the United Kingdom presented a simple solution they call prebunking in the journal ‘Science Advances’. Prebunking inoculates people against misinformation by teaching them basic critical thinking skills. The approach draws on inoculation theory, which explains how attitudes and beliefs can be shielded against persuasion and influence in much the same way our bodies can be protected against disease.

To demonstrate that even a single viewing of a video clip boosts awareness of misinformation, the researchers conducted a study with nearly 30 000 participants on YouTube. They created a series of short animated videos exploring the manipulative communication techniques commonly used to spread false information, and placed them in the platform’s ad slots.
Results showed that prebunking was effective. Regardless of factors such as education level and personality traits, watching the short inoculation videos improved participants’ ability to identify manipulation methods frequently used in online misinformation. They were much better at spotting false information than before.
“Our research provides the necessary proof of concept that the principle of psychological inoculation can readily be scaled across hundreds of millions of users worldwide,” co-author Prof. Sander van der Linden, Head of the Social Decision-Making Lab at Cambridge, which led the study, commented in a news article from the university.
“Harmful misinformation takes many forms, but the manipulative tactics and narratives are often repeated and can therefore be predicted,” explained co-author Beth Goldberg, Head of R&D for Google’s Jigsaw Unit. “Teaching people about techniques like ad-hominem attacks that set out to manipulate them can help build resilience to believing and spreading misinformation in the future. We’ve shown that video ads as a delivery method of prebunking messages can be used to reach millions of people, potentially before harmful narratives take hold.”
“Propaganda, lies and misdirections are nearly always created from the same playbook,” added co-author Prof. Stephan Lewandowsky from the University of Bristol. “Fact-checkers can only rebut a fraction of the falsehoods circulating online. We need to teach people to recognise the misinformation playbook, so they understand when they are being misled.”

Google is already putting the findings to good use. Jigsaw will run a prebunking campaign in Czechia, Poland and Slovakia to combat misinformation about Ukrainian refugees. Still, scepticism persists, and many people distrust tech companies.
“But, at the end of the day, we have to face reality, in that social media companies control much of the flow of information online. So in order to protect people, we have come up with independent, evidence-based solutions that social media companies can actually implement on their platforms,” Prof. van der Linden told the ‘BBC’. “To me, leaving social media companies to their own devices is not going to generate the type of solutions that empower people to discern misinformation that spreads on their platforms.”