Journalists have written fascinating accounts of the people creating online misinformation or “fake news,” from a group of Macedonian teenagers to the infamous Internet Research Agency, whose members were recently indicted for attempting to influence the 2016 election. Fact-checking groups have circulated advice for spotting fake stories and partnered with social media platforms to flag them. Quizzes and games like Factitious will test your personal ability to spot misinformation. Overall, the “fake news” news industry is pretty much saturated. But there’s still plenty of room for a game where you become a fake news mogul.
Bad News is a Twine-style game designed by members of the Cambridge Social Decision-Making Lab and media literacy group Drog. Its goal is to teach players about digital misinformation by turning them into purveyors of it. You’ll start with a small-time Twitter account and slowly build followers by feeding their appetite for fear and anger, while deflecting fact-checkers’ criticism and muddling straightforward issues with minimally plausible conspiracy theories.
You win Bad News by destroying the quality of online discourse for your own gain. You lose by acknowledging nuance, tipping too far into fantasy, or fighting with the game’s snarky narrator. It’s reminiscent of The Westport Independent, a game about running a newspaper in a totalitarian state, except that there’s no ideological justification here — just the endless quest for clicks and eyeballs.
“A traditionally left-wing angle. Good choice,” the game told me, after I decided to attack big corporations for corruption. “But you could have gone with a right-wing angle just as well. It doesn’t matter: choose a side and demonize your target as much as possible.” One 2016 survey found that major hyper-partisan right-wing Facebook pages published somewhat more false information than hyper-partisan left-wing ones, but in general, conspiracy theories flourish in whichever party is out of power.
The game’s creators describe Bad News as a way to vaccinate people against misinformation. “The idea is that once you’ve seen the tactics, and used them in the game, you build up resistance,” Cambridge Social Decision-Making Lab director Sander van der Linden told The Guardian. Some of these tactics are specific, like slightly misspelling a major account’s Twitter handle to impersonate it. Others are broader, like appealing to negative emotions.
How effective will Bad News actually be? Its central message is “don’t believe a story just because it makes you feel something,” which isn’t a lesson easily taught through a game. But if nothing else, it gets players thinking about all the ways they could be fooled. And an optional survey evaluates how players gauge credibility online; the results are supposed to be published in the Journal of Risk Research.