Facebook changes privacy options for closed and secret groups

Facebook has long offered three privacy options for groups: public, closed and secret. Now, to make things easier, the social network will work with only two categories: public and private.


In a note signed by Facebook Groups product manager Jordan Davis, the company says the change was made to better match users’ expectations and to make the settings clearer.

Facebook’s decision does not affect public groups, whose posts will remain visible to anyone on the social network. Anyone will also still be able to see who their members are.

Meanwhile, closed and secret groups, which do not let just anyone join, will be reclassified as private. As before, outsiders cannot see what was shared or who the members are.

The difference between the two formats lies in search visibility. Closed groups could be found by anyone, who could then request to join. Secret groups, on the other hand, did not appear in search results, and new members could join only by invitation.

Both options remain available, just under different names. Administrators of private groups must now indicate whether the group appears in search. By default, closed groups will become visible private groups, and secret groups will become hidden private groups.

“Having two privacy settings (public and private) will help make it clearer who can find the group and see its members and posts,” said Facebook. The company said it would continue to remove inappropriate content from both public and private groups.

“In recent years, we have invested a lot of resources in people and technology,” said the company. “We hired more than 30,000 people for our security teams.” These teams identify and remove posts and groups that break the platform’s rules.

Facebook uses artificial intelligence and machine learning to detect potentially harmful content before anyone even reports it. When something is flagged, employees review the context; if the material violates the rules, it is taken down and used to further train the detection systems.

Source: Facebook.
