Facebook has added new features for Groups aimed at helping reduce the amount of misinformation shared among group members, parent company Meta announced. One of the options allows Group admins to auto-decline posts that third-party fact-checkers have determined contain false information so that the post isn’t shown to other members within the Group.
This has been a big problem for Facebook; since many Groups are private, harmful or incorrect information can be spread quickly and with little oversight. Groups have been blamed for helping boost the visibility of COVID-19 misinformation and other conspiracy theories and for providing a place for bad actors to formulate plots to kidnap Michigan’s governor and coordinate parts of the January 6th insurrection.
Facebook has taken some steps to rein in users who break Group rules and to punish Groups that violate its policies. It also added tools for Group admins last year, allowing them to limit how often some users can post and alerting them to conversations that may be “contentious or unhealthy” (although exactly how its AI would identify such conversations wasn’t clear). But as with most of its attempts to get a handle on Groups that spread misinformation or otherwise break its rules, most of Facebook’s fixes have arrived very late to the party, often reacting well after problematic content has gone viral.
In addition to letting Group admins block some content from being posted, Facebook expanded the functionality of its “mute” feature, renaming it “suspend,” so admins and moderators can temporarily suspend members. The company says the new tools will let admins manage their Groups more efficiently and give them additional insights into how to grow their Groups with “relevant audiences.”