
Facebook will start surfacing some public group discussions in people’s News Feeds and search results


The company is pushing people to join Groups


Facebook is expanding the reach of public groups today with new features that could lead to more people engaging in group discussions, but also potentially more visibility for dangerous or nefarious communities. The company announced multiple updates for Groups, including tools that automate moderation and surface group discussions in people’s News Feeds.

The most intriguing update is starting out as a test. Facebook says it’ll start surfacing public group discussions in people’s News Feeds. These can show up if someone shares a link or reshares a post. Beneath that link, people will be able to click to see relevant discussions taking place about that same post or link in public Facebook groups. The original poster can then join the discussion, even without joining the group.

Recommended groups will also show up in the Groups tab if Facebook deems them relevant to people’s interests. Additionally, public group posts will start showing up in search results outside of Facebook, effectively giving them more reach and a much larger audience. Taken together, these updates position public groups to grow quickly, which could backfire if extremist groups or communities spreading misinformation are promoted. Facebook says any posts marked false by a third-party fact-checker won’t be eligible to be surfaced through these features.

People are going to see a lot more Groups content

Public groups could get bogged down with trolls or people who don’t care about the community a group is trying to foster as this relevant discussions feature rolls out. Admins will be able to set rules that bar non-members from posting or require them to be in the group for a certain amount of time before posting, and Facebook is giving moderators tools to keep track of this potential influx of content.

It’s launching a new feature called Admin Assist that’ll allow mods to set rules and have Facebook automatically enforce them. For instance, certain keywords can be banned, or people who are new to the group can be blocked from posting for a certain amount of time; instead of flagging these posts for moderators to approve or deny, Facebook will handle them automatically. For now, the kinds of restrictions moderators can set are limited, says Tom Alison, VP of engineering at Facebook. Moderators can’t, for example, set a rule banning “politics” in the group, which has been a point of controversy over the past summer as the Black Lives Matter movement gained momentum in the US and around the world.

“Over time, we’ll be looking at ways to make this more sophisticated and capture broad actions that maybe the admins want to take, but for now what we really focused on were some of the most common things that admins are doing and how we can automate that, and we’ll be adding more things as we learn with the admin community,” Alison says in an interview with The Verge.

Facebook’s Admin Assist feature. Image: Facebook

It’s hard to see how conversations will stay productive with these new features when people share links to political content. The relevant discussions feature could lead people down a dark rabbit hole, introducing them to extreme content and ideologies from groups they never expected to engage with and might not realize are sharing misinformation or conspiracy theories.

Facebook has already said it’ll continue limiting content from militia groups and other organizations linked to violence, but the company has struggled to define the boundaries of offending content — including posts from a self-described militia group in Kenosha, Wisconsin, where a 17-year-old militia supporter killed two people during a night of protests. The company also recently deactivated 200 accounts linked to hate groups.

Facebook has struggled to keep hate groups off the platform

Still, in addition to all of these updates, the company says it’ll offer an online course and exam for moderators to help them understand how to “grow and support” their community.

As for product features, Facebook is bringing real-time chats back to groups and is launching Q&A sessions and a new post type called Prompts, which asks people to share a photo in response to a prompt. Those prompts will then become a swipeable slideshow. People will also be able to customize their profile for groups, meaning they can set a custom profile photo or adjust what info they share based on the group. (Someone in a group for dog lovers might want to set a photo of themselves with their dog as a profile photo, for instance.)

Moderators’ jobs are becoming more important for Facebook: they’re the main gatekeepers for content, so keeping them empowered and informed is key to Facebook getting Groups right, especially as Groups start showing up all over the platform.