Facebook is pushing groups further into the spotlight, while also introducing new tools to help tamp down unsavory content.
Groups have emerged as a key source of engagement and core part of the platform. But they’ve also come under fire for driving polarization and extremism, as well as for pushing people toward bogus medical cures.
More than half of Facebook’s global users are members of five or more active groups, and more than 1.8 billion people connect with groups each month, according to the company.
At Facebook’s annual conference for group administrators on Thursday — which was virtual this year due to the pandemic — the company outlined several updates, including changes that make it easier for people to find new groups and content from groups.
Facebook’s biggest update is bringing groups to the forefront: In the coming months, the company will begin testing ways for more people to discover public groups in their News Feed or even off Facebook while searching the web.
For example, when a link or post about a popular TV show or sports event appears in the News Feed, users might see what public groups they don’t belong to are saying about it, a feature Facebook is calling Related Discussions. Users can also chime in on the discussion without joining the group, if the group admin allows it.
Facebook wouldn’t explicitly say if political posts from public groups would be recommended, but said topics would include entertainment, lifestyle, sports, consumer and human interest news, major cultural moments and holidays.
Administrators will have to opt in to include their public groups in this feature.
Meanwhile, under the Groups tab, users will now see recommended content from public groups they’re not part of, based on what’s popular and their interests. Until now, Facebook has only surfaced posts from groups you are a member of and suggested groups you might like.
In an effort to make sure these new discovery features don’t surface misinformation or controversial content, Facebook said it will employ both human curators and technology. The company didn’t elaborate further on how content would be surfaced and what might be excluded.
Facebook also announced Thursday it’s taking steps to make groups easier to manage. A new feature called “Admin Assist” allows group leaders to set rules that automatically moderate posts. For example, admins can decline posts that include certain words, or posts from users who are new to the group or whose content has been reported in the past.
While the move seems to put the onus on admins to keep groups under control rather than on Facebook moderators and artificial intelligence, VP of Engineering for the Facebook app Tom Alison told CNN Business that the company is taking a “holistic approach” to moderating groups.
“Proactive enforcement and AI are absolutely critical tools in this,” Alison said. “Of course, there’s a role for admins and moderators to play. That’s why we’re investing so much into tools to help groups maintain healthy conversations and set the tone for what they want to talk about.”
As groups have grown into a key way people engage with Facebook, its recommendations have become the target of criticism. Earlier this month, Facebook announced it would no longer recommend health groups. While the company didn’t point to Covid-19 or vaccine misinformation specifically, it did say: “It’s crucial that people get their health information from authoritative sources.” But users can still search for such groups or invite friends to join them.
On Tuesday, more than a dozen advocacy organizations launched a campaign calling on Facebook to turn off group recommendations until the US election results are certified.
Alison wouldn’t say if Facebook would consider pausing recommendations for groups overall, or those related to politics, ahead of the 2020 election. But he said the company doesn’t recommend groups that repeatedly share misinformation, and that it fact-checks links shared on the News Feed or in groups.
“We know that there’s still a ton of work here to be done,” Alison said. “One of the scenarios that we’re looking at is if the election results are not clear, we’re going to be working with Reuters to make sure that people have an accurate view of what’s going on.”
The company previously announced that it took down about 1.5 million pieces of content in groups over the past year for violating its policies on organized hate, in addition to removing more than 1 million groups for breaking its rules. In the coming weeks, it will also begin archiving groups without active admins, another effort to rein in problematic groups. Archiving a group means it’s no longer active and members can’t post to it, but previous content can still be viewed.