Facebook, Twitter, and Alphabet, Google's parent company, participated in three online experiments last year to gauge the effectiveness of counter-messaging campaigns in combatting propaganda from extremist groups. The results, published on Monday in a study from the Institute for Strategic Dialogue (ISD), suggest that such efforts could be effective in reaching target audiences and driving conversations among them, though it remains unclear whether counter-narratives can actually deter radicalization. The Wall Street Journal first reported on the study, which was funded by Alphabet with additional support from Facebook and Twitter.
Faced with the threat of online radicalization carried out by ISIS and other extremist groups, various organizations and internet companies have begun exploring the use of counter-speech: web content that aims to discredit extremist propaganda. The hope is that such campaigns can dissuade people from joining terrorist and far-right groups, or convince those who have joined to leave. And although experts say it's difficult to quantify the impact that counter-speech could have on reducing radicalization, this week's study suggests that it can at least be an effective way of sparking dialogue among targeted users.
"It's very much about whether you've reached your audience."
"Virality is very much a red herring when it comes to counter-narratives," says Tanya Silverman, project coordinator at ISD and co-author of the study. "It's very much about whether you've reached your audience, and that's a measure of success."
The study from ISD, a London-based think tank, is based on three video campaigns launched in October 2015 and targeting users in the US, UK, and Pakistan. A US nonprofit called Average Mohamed published five animated videos to explain Islam and discredit jihadism among Somali teens living in the States; Harakat-ut-Taleem, an anonymous group based in Pakistan, created six videos to deter people from joining the Taliban; and ExitUSA, a project launched by the US nonprofit Life After Hate, consisted of four videos aimed at discrediting white supremacists and other far-right groups.
After about three weeks, the 15 videos garnered a total of 378,000 views on Facebook, Twitter, and YouTube, and more than 20,000 engagements — a metric that includes likes, shares, comments, and retweets. The three organizations also saw a boost in their online followings. Average Mohamed's Facebook page likes increased sevenfold, while ExitUSA tripled its number of Twitter followers. Alphabet provided an undisclosed sum to fund the study, while Facebook, Twitter, and YouTube donated around $30,000 in advertising credits, which helped the organizations reach their targeted demographics.
Researchers found that the campaigns successfully sparked online debates as well, generating a total of 484 comments across the three platforms. The comments on Average Mohamed's posts were largely supportive (66 percent), compared to 32 percent for ExitUSA and 50 percent for Harakat-ut-Taleem. Each campaign received negative or antagonistic comments, but the study's authors say that's still a positive sign.
"The fact that someone would feel the need to comment, that goes to show that they're thinking about it," says Silverman. "They're getting exposed to a worldview that's outside of their echo chamber."
"It's hard to evaluate specific behavioral change."
Whether that dialogue will translate into real-world change remains unclear. "It's hard to evaluate specific behavioral change," says Joshua Stewart, strategic communications officer at the Quilliam Foundation, a counter-extremism organization, and Europe coordinator of the Families Against Terrorism and Extremism (FATE) network. "It's rare when someone steps forward and says 'well, I nearly joined the underground but then watched your video.'"
Yet that was the case for eight people who were exposed to ExitUSA's campaign. After watching the videos on Facebook, they approached the organization and asked for help in leaving white supremacist groups, with each citing the videos as a driver behind their decisions.
Silverman and her co-author, Christopher Stewart, hope their work will help establish a framework for other organizations to follow. ISD has been working with Google on counter-messaging research since 2014, and is developing more accurate analytics to gain better insights into the long-term impact of such campaigns.
"very serious promise and potential"
Stewart says it's important for smaller NGOs to have support from both social media companies and governments, which have put increasing pressure on companies like Facebook and Twitter to remove extremist content. Earlier this year, The Wall Street Journal reported that Facebook had begun providing ad credits to users who produce counter-narrative content, including college students and a popular Belgian comedian.
Smaller, grassroots organizations may also be better positioned to produce counter-speech content because they're closer to the ground and carry no government affiliations. Earlier this year, the US State Department reshuffled its counter-messaging program, in what The Washington Post described as a sign of frustration with its failure to make a dent against ISIS propaganda. Experts say that ISD's findings suggest that with the right support, smaller campaigns can be more effective.
"Critics of [counter violent extremism] programming assert that they are nothing but smoke and mirrors," says John Horgan, a professor of psychology at Georgia State University and an expert on online radicalization, who was not involved in the ISD study. "What this study shows is that not only is that a ridiculous assertion, but that there is very serious promise and potential in these programs."