After years of growing discontent, social platforms’ moderation problems are starting to hit the bottom line. For the past week, Facebook has been struggling to contain a growing advertiser boycott, which has seen major corporations like Unilever, Diageo, and Coca-Cola swear off any advertising on the platform for the rest of the year.
On Monday, the panic spread to other platforms. Reddit banned a range of subreddits — including the notorious /r/the_donald community — as part of a new policy against identity-based hatred. YouTube banned longtime white supremacists Richard Spencer and David Duke, along with their affiliated organizations. On Twitch, President Trump was temporarily banned after rebroadcast speeches were flagged as inciting racial hatred.
Most of these bans and policies were a long time coming, but there’s a reason they all hit at once. For months, a coalition of civil rights groups — including Color of Change, Sleeping Giants, and the NAACP — has been pushing advertisers to drop support for platforms that allow racial hatred to spread unmoderated. Called #StopHateForProfit, the campaign was initially focused on Facebook, but it has grown to target social media at large amid a global movement against racist police violence.
The Verge spoke with Jade Magnus Ogunnaike, Color of Change’s deputy senior campaign director, about what the campaign wants from tech companies and how they can get ready for the long haul.
This interview has been lightly edited for clarity.
So far, #StopHateForProfit has mostly been pushing for advertisers to drop Facebook, in particular. But on Monday, we saw a lot of other moderation moves that seemed connected to that campaign, dropping prominent white supremacists like Richard Spencer and David Duke. Do you see that as a result of the recent campaign’s success?
I would connect a lot of it to demands we’ve had long before #StopHateForProfit. We’ve been pushing Twitter to kick Trump off the platform since 2018 because he uses it to spread violence and hatred. And it’s the same thing with the /r/the_donald subreddit. A bunch of other ones also got pulled down yesterday. This stuff leads to real-world consequences for Black people and marginalized communities. So we met with Reddit’s CEO as part of our coalition to discuss their policies. We still need full details about how Reddit is handling this and what their actual moderation structure is. We haven’t seen any internal documents either. But their new policy is a big step: a major social media platform writing its community standards to reject online hate through a civil rights lens.
The big issue overall — and it’s an issue with Facebook, with Twitter, with Reddit, with Twitch — is how you build out the infrastructure to do this all the time. You can make these little changes, removing subreddits or putting a label on Donald Trump’s tweets, but until these social media platforms actually have civil rights infrastructure and staff who are trained to assess products and content for bias and discrimination, they’re going to continue to see these problems over and over again.
So we think Twitch is a really great start. Twitch is the first platform to actually kick the president off. Reddit is making some good steps. But what’s really important is that we have this civil rights infrastructure in place and have people who are really trained to look at products and content and figure out the best ways to make them safe for Black and marginalized communities.
I’m curious what you would say about a company like Facebook, which has participated in civil rights audits and has this civil rights task force around the upcoming US election, but doesn’t seem to be doing any better on these issues.
So you’re right. Facebook has done civil rights audits, which they did at the urging of Color of Change and other civil rights organizations. The results from it should be released soon, I believe, but they have not been released.
But for a company in that position, we don’t need a task force. What we need is C-suite level executives with civil rights expertise to evaluate the ways that the policies and products around discrimination and bias and hate are affecting users. We need a person who is dedicated to ensuring that the design decisions of the platform take into account the impact they’re having on marginalized communities and the potential for radicalization and hate.
I should also note that the Facebook task force is not dedicated staff. The task force is dedicated to the election. But come November 4th, that task force will be over. And we know that the hate on Facebook is going to impact Black people long after the 2020 election is over.
As Sleeping Giants said yesterday, civil rights groups have literally been asking for some of these bans for years. So why is it happening now, all at once?
I think we’re in a watershed moment for racial justice. You can chalk it up to a number of different events. Part of it is because of the way that government has failed us and corporations have failed us in the wake of the coronavirus. But also, we saw Eric Garner. We saw Sandra Bland. We saw Michael Brown. Black organizers in the movement have been pushing our culture to this point. Six years ago when Michael Brown was killed, you wouldn’t see McDonald’s saying that Black Lives Matter. It was a radical statement. And the credit completely goes to organizers in the movement that have pushed the culture to the point where we can no longer ignore police and vigilante killings of Black people.
Of course, corporations still have a long way to go. Saying “Black Lives Matter” is great, we like the statement, but it’s totally symbolic when Black workers are fired for speaking up about racism, or corporations like Facebook refuse to invest in civil rights infrastructure, or when Twitter lets Trump incite violence against protestors for speaking out. They undermine their statement with their actual actions, and that’s a lot of what the #StopHateForProfit campaign is about.
One of the concerns I hear about deplatforming from the left is that it’s building a playbook that could be used against activists. Right now, Color of Change is pushing Facebook to get rid of white nationalist groups. But then are you worried, a few years down the road, some right-wing group could use this same approach to ban Black Lives Matter groups?
When we’re talking about deplatforming, we’re intentionally looking for groups and content that incite violence and harm people. So you look at something like Unite the Right in Charlottesville in 2017. There was this massive white supremacist march around these Confederate monuments, and the next day, a woman was literally killed. So we’re not looking to deplatform people who simply have differing opinions. We’re looking to remove information that is harmful. And that could be vaccine misinformation or violent conspiracies like Pizzagate, which ended up sending an armed man into a pizza shop, where he fired shots. These are not just people with different opinions. These are people who are violent and intend to cause harm to other people.
Richard Spencer really embodies the simple rebranding of white supremacy. Most people can understand why you wouldn’t want someone in a KKK hood spreading violent misinformation on your platform. That makes perfect sense to most people because the hood is so symbolic of hatred and terrorism and violence. But because Richard Spencer has on a button-up shirt and hard shoes and gelled hair, all of a sudden, he becomes nonthreatening, and we’re encouraged to look at both sides of the issue. In reality, he’s pushing the same hatred and misinformation that the KKK pushes. It’s just rebranded in a new outlet.
I also want to point out that Black organizers are already deplatformed constantly on Twitter and Facebook and other social media spaces. Black organizers saying “please don’t kill us” or “defund the police” are often moderated more heavily than actual white supremacists saying that Black people are genetically inferior. So I don’t think it’s an apt comparison at all.
Where do you think the movement goes from here? What should companies do to try to fix these problems in the long term?
There’s no quick fix. And that’s why companies are quick to do the symbolic thing — you know, “we removed some subreddits” or “we released a statement.” It’s a great step, but there are no quick fixes for the intersection of racism and capitalism. There are no quick fixes for companies like Reddit that have been steeped in racist culture from the very beginning. We can’t just cheer on the quick things. Companies need to actually undergo civil rights audits. They need to look at how racism and discrimination are showing up at every level in the company.
There’s no one thing you can do to fix racism in your company or to affirm that Black lives matter. What actually has to happen is that companies have to commit to a living wage for all of their employees, and then they need to invest in civil rights audits and take it step by step to implement those changes. As much as Black people wish that we could snap our fingers and have companies affirm Black lives — not only in death but also in life — that’s simply not going to happen. Companies have to be in it for the long haul.