
Facebook Makes Changes to Keep Groups Safe by Sanctioning Rule Breakers

Despite losing ground to competitors such as Instagram, TikTok, and Twitter, Facebook remains the biggest social media platform in the world, with 2.74 billion active users.

Over the years, Facebook has been the subject of several scandals, most of them revolving around the spread of fake news on the platform to influence people’s perception of world events such as elections, vaccinations, and the pandemic.

In the past few years, the company has been playing a constant game of whack-a-mole, trying to crack down on bad actors, pages, and groups that spread misinformation; when one is squashed, others rise to take its place.

On March 17, Facebook announced a new plan to change its Groups experience and clean up the platform by limiting the reach of potentially harmful users and groups. That’s because, as evidence shows, people use Facebook groups not only for propaganda but also to organize threatening events. One example is the group that plotted to kidnap Michigan Governor Gretchen Whitmer, whose members used Facebook groups to coordinate.

Facebook has been a constant presence in the news, including last month, when the platform responded harshly to a new Australian law. The News Media Bargaining Code required Facebook to pay news publishers for content posted on the platform. In response, Facebook blocked users in Australia from sharing or viewing news content.

What are Facebook’s plans?

Earlier this year, the platform announced it would start removing civic and political groups from recommendations for US users, and that change is now being rolled out worldwide. In addition, Facebook is introducing a series of changes that aim to prevent the spread of harmful content and make it harder for groups that spread misinformation to be recommended to other users.

Whether the group is public or private, the platform is committed to taking it down if it breaks the rules. The most recent changes are part of Facebook’s efforts to make the platform safer and punish those who are breaking the rules. We can expect to see these changes roll out on a global level in the months to come.

Below, we try to explain what these changes mean and how users can help stop the spread of misinformation.

Better recommendations

Facebook recommends content to users based on their interests and interactions with the platform: ads, people they may know, pages that may interest them, or groups where they can find like-minded people. In doing so, the platform becomes responsible for the recommendations it makes, which is why it needs better ways to separate harmful groups from high-quality ones and penalize those who don’t respect the rules.

Facebook has acknowledged it needs to do more to stop the spread of potentially harmful groups and to balance its recommendation guidelines so that groups that follow the rules are not overshadowed by wrongdoers.

As a result, Facebook has started removing civic and political groups from recommendations for US users. The same applies to newly created groups, precisely because there is no way to know for sure that they are safe. While people can still invite their friends to these groups or search for them by name, the platform won’t recommend them.

In the coming months, this change will expand globally, together with other restrictions such as removing health groups and groups that share misinformation from recommendations.

Facebook has also introduced a negative feedback loop of sorts: when a group breaks the rules, it is ranked lower in recommendations, making it harder for people to stumble upon it. Something similar happens in News Feed, where posts that Facebook considers low-quality are shown further down so that fewer people discover them.

Reducing privileges for the rule-breakers

Group members who violate Facebook’s rules will see more and more of their privileges reduced. The more they continue to break the rules, the more restrictions they will face, until they are removed from the platform entirely. In more severe cases, Facebook warns it will remove the groups and people involved without hesitation.

When a person wants to join a group that has violated Community Standards, they will be warned so they are aware of the issues and can make an informed decision about whether to join. For a group’s existing members, Facebook will show the group’s content less often and lower in the feed, making it harder for people to engage with these communities and to be encouraged to break the rules themselves.

If a group has a considerable number of members breaking the rules, Facebook will require group admins and moderators to review and approve posts before they are published. If admins and moderators do not comply and continue to approve content that violates the rules, the group will be removed.

As for individual users, a person who continues to violate rules in groups will be blocked from posting or commenting in any group for a set period of time. Their ability to invite others to groups will also be removed, so they can’t reach more people with their harmful behavior.

Statistics show that more than half of the platform’s users belong to at least five groups. Facebook clearly wanted groups to be a big part of our social media activity when it started recommending content from groups that people are not members of but may be interested in. Now that this aspect of our social lives has been shown to cause real harm, it is Facebook’s duty to protect its users against it. With these rules, Facebook is trying to limit the spread of misinformation and minimize real-life consequences.
