Facebook Launches New Admin Tools To Moderate Bad Behavior in Groups

Facebook has launched a new series of admin tools designed to help group moderators manage communities more easily, the company announced on Wednesday.

Most notably, the social media platform introduced an AI-backed feature called “conflict alerts,” which is currently in testing. The new type of moderation alert will notify group admins when the AI detects “contentious or unhealthy conversations” in a group.

As part of a new comment moderation feature in Admin Assist, moderators can “slow down” conversations by limiting how often specific members can comment and how frequently comments can be made on a given post. Admins will also be able to prohibit new members from posting or commenting in a group.

Additionally, Facebook streamlined all of its admin tools into an intuitively designed “Admin Home,” where moderators can quickly locate tools and tailor the layout to their specific needs. A new member summary will give admins a consolidated overview of each group member’s activity, including how many times they have posted and commented and whether any of their posts have been removed.

The social media platform integrated the new functions based on feedback from group admins, acknowledging that moderators are “at the heart of Facebook’s mission of building community.” The company also added that there are currently more than 70 million active admins managing Facebook Groups.
