Facebook’s New Penalty System Is Less Harsh but More Informative

Though extreme infractions will still receive bans

Facebook is making another adjustment to its penalty system for rule-violating posts, with the intention of giving users more detail about why something is being removed.

Social media platforms regularly play host to questionable content, whether in user posts or comments, which is why they have content rules and enforcement policies in the first place. And after some self-reflection (with input from its Oversight Board), Facebook is adjusting how it punishes those violations.


By Facebook's estimation, the new policy will result in fewer immediate account bans, a punishment the company says is often too aggressive for what may have been an honest mistake. So rather than sending someone to "Facebook jail" for a poor choice of words, the offending post or comment will simply be removed, and the user will receive a warning.

That warning will detail exactly what led to the removal decision, in the hope of coaching users and preventing future violations. Facebook still intends to come down hard on major violations, however, such as posts involving child exploitation or non-medical drug sales.

Facebook's illustration of the new warning and penalty system. Facebook

Lighter penalties don't mean repeat offenders are off the hook. Facebook will still escalate consequences for accounts that accumulate seven or more strikes, saying the higher threshold offers a better balance between unintentional and intentional violations. Facebook also hopes that holding off on immediate blocks will make it easier to discern patterns of policy violations and to identify and deal with problematic accounts faster.

These changes to the penalty system begin rolling out today, with Facebook stating that it will continue to analyze and adapt its enforcement policies over time. How consistently Facebook follows through on the new approach remains to be seen, though, as the platform doesn't have the best track record when it comes to moderation.
