Women like Jocelyn Kopac joined the Boss-Moms Facebook group to network and chat about life as both a mom and a business owner, and given that the group has more than 50,000 members from around the world, life inside has been relatively mundane. That changed as the Black Lives Matter movement picked up steam across the US and around the world. Over this past weekend, Kopac says, the group’s leader, Dana Malstaff, and her team of moderators deleted members’ posts about Black Lives Matter, turned off comments, and failed to respond to members’ subsequent concerns.
In one now-deleted post, a member wrote about her disappointment in the group’s leadership. “I hope that Dana and her team rethinks and learns from many of us who just get it,” she wrote. Another deleted post asked the moderators to stop deleting posts.
“They were just going through censoring and policing the group heavily for anything that had to do with the movement, or how people were feeling, or the changes they were looking for, or anything like that,” Kopac says. The fallout was swift, with people splintering off, and in some cases, members encouraging people to join different groups where Black Lives Matter content is accepted and discussed.
One post, captured in screenshots, asked the moderators to issue an “official position” on police brutality because they’d been deleting posts. Instead of formally responding with a position, the mods restated that they do not allow political posts and turned off comments.
“I feel sad that I witnessed what I just witnessed,” Kopac said to her camera during a Facebook Live on her personal page addressing the group drama.
Boss-Moms is one of many Facebook groups grappling with inadequate moderation policies as members attempt to discuss Black Lives Matter. The groups, which range in focus from video games to music to local communities, are moderated by other group members. The moderators have no formal training from Facebook or outside sources and make their own decisions about what content is and is not allowed. Most groups have no reference point for how to give everyone a voice, and that’s led to fighting between members, people leaving, groups temporarily shutting down, and splinter groups breaking off.
Many groups don’t have people of color as moderators, adding to the moderation problem. Roop Mangat belongs to a pair of local community Facebook groups — one in which no people of color or women are moderators, and another that has a white woman moderator who serves alongside only white men. Mangat posted the same message to both groups, urging people to take racism in the community seriously. One of them deleted it. “They think anything ‘political’ is not appropriate,” she tells me over Twitter DM. “Yet they still allow posts that include gossip and false information to spread.” The group threatened to ban her if she posted again.
“I just think it’s sad that communities are trying to censor this to make it seem like we are a perfect society instead of trying to tackle the issues,” she says.
Boss-Moms didn’t have any women of color as moderators until this incident and had rules in place that banned political posts and outside links. Malstaff, the group’s creator, went live on Facebook to address members’ concerns about conversation being shut down, but that just made things worse. She said Black Lives Matter content, and all civil rights content, isn’t allowed because it produces “heated” conversations.
Malstaff and her team later said they’d post “prompts” where women of color could share their stories, as well as outside links. Kopac says she and Malstaff chatted, and Malstaff admitted to needing more education and resources. But then Malstaff went live again and, according to Kopac, said she doesn’t “see” color. That Facebook Live has since been deleted.
“People are saying this is a political problem,” Kopac says. “It is not a political problem. It’s a human problem.”
Facebook CEO Mark Zuckerberg has said groups play a fundamental role in the future of the platform and its goal of fostering “community” and bringing “the world closer together.” In 2017, the company said more than 1 billion people around the world used groups. “An important part of delivering on our new mission is supporting group admins, who are real community leaders on Facebook,” wrote Kang-Xing Jin, Facebook’s vice president of engineering at the time.
The company has given moderators more education to produce fruitful, but difficult, conversations, including a landing page with best practices, as well as online training courses on specific topics like COVID-19. But some of the advice, like sticking to stated rules, seems to be what’s tripping moderators up now. Facebook’s own internal moderation rules have also caused problems for its platform, so it’s no wonder there’s trouble when the task is left to regular people. After this story went live, the company published a list of recommendations for group admins to consider, specifically during this time. It suggests creating a more “diverse” set of moderators and reconsidering rules around political speech by defining the topics that are and aren’t allowed.
In some cases, people are leaving groups in protest. Sarah, a member of a fan group dedicated to the musician Hozier, says she left the group after a moderator posted that political content was banned. The moderators said they’d ban people who talk about the movement, and they’d delete posts, too. Sarah says she voiced her concerns about this policy in a post that was then deleted.
“I immediately left the group because I was not interested to stay in a group where, first of all, your voice is not heard,” she said. “And, I mean, Hozier is an activist, so I think it’s completely out of context if we have to shut up about it.”
Hozier, who’s unaffiliated with the group, has tweeted links daily to Black Lives Matter content and resources. The moderator’s post says people can share Hozier’s direct content, but that if people want to post their own political thoughts, they should join a separate, unmoderated group. Sarah says at least 50 other people left the group, which has around 3,000 members.
Group moderators say keeping everyone happy isn’t easy. They acknowledge, though, that they control the conversation and that what a group allows often reflects their own beliefs.
“I’ve been accused of this group being my own personal agenda,” says Abby Hartman, a group admin for a local community in Minnesota. “People say, ‘please change the name to Abby’s group,’ because I’ll post Black Lives Matter and that’s the response I get.”
Hartman started her own neighborhood group, which has grown to 300 people, after being banned from another local group for sharing information about nearby vaccine clinics.
“There are two groups, and I’m in the tiny one that gets all the people that are banned from the other one, so I probably have more extreme people in my group, even though I feel like I’m pretty normal,” she says. “It’s attracted all the people who are mad at the other one.”
She bans only hate speech and threats, but during the Black Lives Matter movement, she started banning “all lives matter” content as well because she considers the memes people shared to be hate speech.
“I think moderators need some sort of training on how to process information, and then also how to deal with different walks of life,” Hartman says. “I don’t know what that would be. If I were to write about it, I would have to do quite a bit of research because I really don’t know. It’s all new.”
Mods even argue among themselves sometimes, says Selene, a mod for a Facebook group dedicated to Star Wars.
In her group, she says someone posted a video of Star Wars actor Pedro Pascal at a protest in Los Angeles. Some mods wanted to delete the post because they considered it political, which goes against the group’s rules, but Selene wanted it kept up.
“I kind of wasn’t up for that conversation because I’ve been out protesting, and it’s not something that I wanted to argue about that day, so I put my foot down, and I wasn’t gonna let him remove it,” she says. “We could moderate comments; we could watch them if anything went wrong; we could delete comments, but we’re keeping it up.”
She still has to fight to keep content up because the other mods don’t want to incite arguments, she says. The mods in her group approve every post before it goes live, and she says some members are now venting that the group keeps Black Lives Matter content up while refusing to let them post their own political thoughts.
“I wish that Facebook would actually have some rules as far as their Facebook groups, so we didn’t always have to enforce and make our own,” Selene says.
The Boss-Moms group, after everything played out, is making its own changes, too. Last night, days after the post deletions started, Malstaff posted an extensive action plan to the group, which includes: opening moderator positions up with the goal of giving women of color more power and a voice in the group; creating an “advisory committee” to help make decisions about how the group is run and who they bring in for podcasts and potential panels; hiring Kopac to give a live training about being an ally and anti-racist; highlighting a “Boss Mom of the week” with June’s focus being on black members; and reviewing their Boss Mom content to “make changes that show diversity.”
“Up until last week, we were running the group and the community, which is part of my Boss Mom business, as we had always run it with the same rules, and this week has helped to open our eyes to the need to celebrate our black women in the community more,” Malstaff tells The Verge. “To make everybody feel like it’s a safe, welcoming place. And right now, for our black women, we want to make sure that they feel as if [they] have a voice, and we can unite everybody in the group together to really help make change, and that is important to me.”
Kopac says the plan goes far enough if Malstaff “implements it completely and continually.”
Other groups’ moderators have reached consensus around welcoming Black Lives Matter discussions. A group dedicated to Animal Crossing: New Horizons players in Sweden, for example, ran into similar issues when some players staged a virtual walkout in which they dressed their characters in Black Lives Matter outfits. Farhia, a member of the 2,500-member group, says most people in the group supported other players sharing their photos, but some said it mixed politics with fun and that the group should stick to Animal Crossing talk.
“Somebody wrote, and I quote, ‘Nothing bad about Black Lives Matter and what’s happening in the US, but this is the Animal Crossing Facebook Group, we should focus on the game. I personally don’t want to talk about a lot of political stuff. There’s other forums for this,’” she says. People, including moderators, commented back saying opting out of the conversation is a privilege.
The moderators, Farhia says, have done a decent job and are “active and positive.”
Since the incident, however, Farhia says she deleted her Facebook account, not only because of the arguments in the Animal Crossing group, but also because of Zuckerberg’s refusal to remove US President Donald Trump’s call for violence against protesters, a decision Facebook employees also protested virtually. If Zuckerberg can’t enforce his own stated rules when deciding whether a post should stay live, how are moderators supposed to do better?
Update 6/6, 10:58 AM ET: Updated to reflect that Facebook issued new guidance for moderators and admins.