  • Sunday, 22 December 2024

Facebook testing AI to get users to stop fighting in its groups

The social network is testing the use of AI to spot fights in its many groups so group administrators can help calm things down.

The announcement came in a blog post Wednesday, in which Facebook rolled out a number of new software tools to assist the more than 70 million people who run and moderate groups on its platform. Facebook, which has 2.85 billion monthly users, said late last year that more than 1.8 billion people participate in groups each month, and that there are tens of millions of active groups on the social network.
 
Among Facebook's new tools, AI will decide when to send out what the company calls "conflict alerts" to those who maintain groups. The alerts will be sent to administrators if the AI determines that a conversation in their group is "contentious" or "unhealthy," the company said.
 
For years, tech platforms such as Facebook and Twitter have increasingly relied on AI to determine much of what you see online, from the tools that spot and remove hate speech on Facebook (FB) to the tweets Twitter (TWTR) surfaces on your timeline. This can be helpful in thwarting content that users don't want to see, and AI can assist human moderators in cleaning up social networks that have grown too massive for people to monitor on their own.
But AI can fumble when it comes to understanding subtlety and context in online posts. The ways in which AI-based moderation systems work can also appear mysterious and hurtful to users.
A Facebook spokesman said the company's AI will use several signals from conversations to determine when to send a conflict alert, including comment reply times and the volume of comments on a post. He said some administrators already have keyword alerts set up that can spot topics that may lead to arguments, as well.
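Facebook has not published how these signals are combined, but the idea can be sketched as a simple scoring heuristic. Everything below is illustrative: the function names, weights, and thresholds are assumptions, not Facebook's actual model; only the inputs (reply times, comment volume, keyword alerts) come from the article.

```python
# Hypothetical sketch of a conflict-alert heuristic. The weights and
# thresholds are invented for illustration; Facebook's real system is
# not public. Inputs mirror the signals the article mentions:
# comment reply times, comment volume, and admin keyword alerts.

def conflict_score(reply_gaps_sec, comment_count, keyword_hits):
    """Return a 0..1 conflict score from simple conversation signals."""
    if not reply_gaps_sec:
        return 0.0
    avg_gap = sum(reply_gaps_sec) / len(reply_gaps_sec)
    # Rapid back-and-forth replies (small gaps) suggest a heated exchange.
    speed = min(1.0, 60.0 / max(avg_gap, 1.0))
    # A burst of comments on a single post is a second signal.
    volume = min(1.0, comment_count / 200.0)
    # Matches against admin-configured keyword alerts are a third.
    keywords = min(1.0, keyword_hits / 5.0)
    return 0.5 * speed + 0.3 * volume + 0.2 * keywords

def should_alert(score, threshold=0.7):
    """Notify group admins only when the score crosses a threshold."""
    return score >= threshold
```

A thread with replies seconds apart, hundreds of comments, and several keyword hits would score high and trigger an alert; a slow, low-volume thread would not.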
If an administrator receives a conflict alert, they might then take actions that Facebook said are aimed at slowing conversations down — presumably in hopes of calming users. These moves might include temporarily limiting how frequently some group members can post comments and determining how quickly comments can be made on individual posts.
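The "slowing conversations down" measures described above amount to per-member rate limiting. A minimal sketch of such a slow-mode limiter, assuming a simple per-member cooldown (the interval and data structure are assumptions, not Facebook's design):

```python
import time

# Illustrative "slow mode" rate limiter: limits how frequently a group
# member can comment, as the article describes. The cooldown interval
# and in-memory bookkeeping are assumptions for the sketch.

class SlowMode:
    def __init__(self, min_interval_sec=60):
        self.min_interval_sec = min_interval_sec
        self._last_comment = {}  # member_id -> timestamp of last comment

    def try_comment(self, member_id, now=None):
        """Allow a comment only if the member's cooldown has elapsed."""
        now = time.time() if now is None else now
        last = self._last_comment.get(member_id)
        if last is not None and now - last < self.min_interval_sec:
            return False  # still in cooldown; comment rejected
        self._last_comment[member_id] = now
        return True
```

An admin responding to a conflict alert could, in this model, temporarily attach such a limiter to a heated post or to specific members, then remove it once the discussion cools.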
