Facebook will begin removing all Groups content posted by people who have violated the platform's policies, a move intended to stop rule-breakers from reaching others in a community. It builds on existing policies that already bar such users from commenting or inviting others to groups.
Facebook also announced in a blog post a Flagged by Facebook feature that shows administrators content flagged for removal. Administrators can then either delete the content themselves or request a review, giving them a chance to act before Facebook issues a strike that could affect the whole group.
Facebook has paid closer attention to groups since the 2020 US presidential election, when some were used to spread false information about voting. The company is also facing increased scrutiny over extremism and other harmful content because of documents leaked by Frances Haugen, a former Facebook employee who recently testified before Congress. Earlier this week, Facebook said it expected a wave of new stories based on thousands of pages of leaked documents.
The company described the reports as a coordinated "gotcha" campaign, while promoting its efforts to reduce hateful and false content and to offer more transparency around moderation. A September update detailed how it handles problematic content that it demotes but does not remove entirely, including posts from accounts that had violated its rules.