
Facebook Changes Its Rules for Creating New Groups
19 Sep, 2020 / 02:18 pm / omnes


Facebook has added new rules for creating groups. From now on, if a group is removed for violating Facebook’s policies, its members and administrators will be temporarily unable to create any new groups. If a group has no administrators, it will be archived. And Facebook won’t include any health-related groups in its recommendations.

Over the past two years, Facebook has been encouraging greater use of groups, catering to the growing volume of conversation shifting out of the main News Feed and into more private spaces.

This trend has been most noticeable in the rise of messaging. As consumers become more wary of the public, permanently recorded nature of main-feed sharing, and the potential impact this can have on, say, future career prospects, a growing number of people have sought alternative, enclosed spaces to chat and share without fear of the same repercussions.

The company’s new rules expand on existing efforts to police misinformation on the platform. Admins were already barred from creating a new group similar to a banned one, for instance.

Some of the new policies encourage more active administration of groups. If administrators step down, they can invite members to take their place; if nobody does, Facebook will apparently “suggest” admin roles to members, then archive the group if that fails. And if a group member accrues a community standards violation, moderators will have to approve all of that member’s posts for 30 days. If the moderators repeatedly approve posts that violate Facebook’s guidelines, the group could be removed.

The health guidelines take a broader approach by focusing on an entire category of content, not specific rule-breaking behavior. Facebook says that although groups can “be a positive space for giving and receiving support during difficult life circumstances ... it’s crucial that people get their health information from authoritative sources.”

The company has already tried to limit the spread of anti-vaccination content and coronavirus misinformation through other methods, including adding contextual information to posts that discuss COVID-19 and placing banners on vaccination-related pages. Even so, the platform’s size has made it a powerful vector for false health stories.

Facebook also says it is continuing to limit content from militia groups and other organizations linked to violence. Groups that discuss potential violence will be removed, and Facebook will soon down-rank even non-violating content from such groups in the News Feed.

Cynics have suggested that Facebook’s push toward groups may also be about moving concerning content out of the public eye: if Facebook can’t stop people from sharing offensive, divisive content, moving it into private groups at least makes it less visible and lessens scrutiny on the platform. Facebook counters that it is taking action: as part of its evolving detection measures, the company says it can already “proactively detect many types of violating content posted in groups before anyone reports it,” whether it’s posted in public, closed, or secret groups.

Source: The Verge