Facebook introduces tools to keep groups safe

In a blog post by Tom Alison, VP of Engineering at Facebook, the company introduced several tools to help regulate the content shared on the social network. According to the post, the new group-moderation features are part of an ongoing effort to improve the quality of content shared in groups. Using artificial intelligence, machine learning, and content reviewers, Facebook has been trying to remove hate speech and toxic content from its platform. The new tools will help group admins monitor the content posted in their groups.

Facebook has become a breeding ground for fake news, misinformation, and hate speech. Because it is so easy to share content on the platform, toxic content often spreads unchecked in groups, and participation in such groups spreads that toxicity to other members. Hence, in a bid to crack down on fake news and toxic content, Facebook introduced these tools for group moderators as well as members to ensure that groups do not violate Facebook's Community Standards.

From inappropriate content to fake reviews, private groups can be a nuisance, and many fear they are used to brainwash and radicalize members. ProPublica reported on a private Facebook group in which Border Patrol agents joked about the deaths of migrants. The group was created in 2016 and had around 9,500 members, according to the report.

In addition, private groups can be a channel for disseminating fake news, much like groups on WhatsApp and other platforms. Entire groups are often dedicated to spreading sexist and racist views. To discourage admins from enabling this, Tom Alison's blog post also laid out Facebook's guidelines for groups.

According to Tom Alison, the activity of a group's admins and moderators helps determine whether the group should be taken down. When Facebook flags a post for violating its Community Standards, the admin has to review it; if the admin approves the post anyway, the group may be removed.

Other indicators Facebook takes into consideration regarding a group's intent are its name and description. If either promotes hate speech, the group may be taken down.

Keeping these guidelines in mind, Facebook has strengthened its hold on the content shared in both private and public groups. Because hate speech targets particular communities and can endanger the safety of Facebook users, the company introduced tools to help moderators and members.

Tools for admins

The tools introduced by Facebook are meant to empower group admins to ensure that members stay in line with the Community Standards. Larger groups are difficult to manage, and an overview of group activity can help admins keep their groups free from toxic content. The tools include the following:

  • Group Quality: An overview of the content that Facebook has removed due to violations of Community Standards
  • Section for rules: Rules set by the admin in a group that members are expected to follow
  • Section for false news: Lists false news that has been shared in the group

Tools for members

Tom Alison's blog post also explained how tools for members help them gauge a group's intent before joining it. Potential members can see details such as the group's name, its admins and moderators, and any other names the group has had in the past.

With these new changes, Facebook intends to keep users, especially group members, in line with its Community Standards, which are available on the company's website.

Another post, by Jordan Davis, Product Manager for Facebook Groups, explained that groups will now be either private or public. Private groups can be either hidden or visible in search, while public groups are always visible. This lets a group be closed to an exclusive set of people or open to anyone. The new privacy settings aim to give admins more control over "how their groups are discovered."

Both posts mentioned that the company is working on the Safe Communities Initiative, which aims to protect group members from harmful content. The initiative hopes to curb the spread of fake news by identifying it early, reducing the reach of groups that spread misinformation, and using third-party fact-checkers to verify the reliability of news stories.

These changes will ensure that a group's admins are held accountable for the content their members share, and the tools afforded to them will help them keep members in line. This will hopefully make Facebook groups safer from hate speech against communities and groups of people. Although the two blog posts promise greater scrutiny of every Facebook post, it will be some time before we see these changes making a difference.
