Game combats political misinformation by letting players undermine democracy

A short online game in which players are recruited as a “Chief Disinformation Officer” and use tactics such as trolling to sabotage elections in a peaceful town has been shown to reduce its players' susceptibility to political misinformation.
The game was created by University of Cambridge psychologists with support from the US Department of State's Global Engagement Center and the Department of Homeland Security's Cybersecurity and Infrastructure Security Agency (CISA).
The gameplay is based on “inoculation theory”: the idea that exposing people to a weak “dose” of common techniques used to spread fake news allows them to better identify and disregard misinformation when they encounter it in the future.
In this case, by understanding how to incite political division in the game using everything from bots and conspiracies to fake experts, players get a form of “psychological vaccine” against the product of these techniques in the real world.
“Trying to debunk misinformation after it has spread is like shutting the barn door after the horse has bolted. By pre-bunking, we aim to stop the spread of fake news in the first place,” said Dr Sander van der Linden, Director of the Cambridge Social Decision-Making lab and senior author of the new study.
Twitter has started using a “pre-bunk” approach: highlighting types of fake news likely to be encountered in feeds during the US election. However, researchers argue that familiarising people with techniques behind misinformation builds a “general inoculation”, reducing the need to rebut each individual conspiracy.
In the 10-minute game Harmony Square, a small town neighbourhood “obsessed with democracy” comes under fire as players bait the square’s “living statue”, spread falsehoods about its candidate for “bear controller”, and set up a disreputable online news site to attack the local TV anchor.
“The game itself is quick, easy and tongue-in-cheek, but the experiential learning that underpins it means that people are more likely to spot misinformation, and less likely to share it, next time they log on to Facebook or YouTube,” said Dr Jon Roozenbeek, a Cambridge psychologist and lead author of the study.
Reproduced courtesy of the University of Cambridge
The University of Cambridge is acknowledged as one of the world's leading higher education and research institutions. The University was instrumental in the formation of the Cambridge Network and its Vice-Chancellor, Professor Stephen Toope, is also the President of the Cambridge Network.