In June, the Internet was ablaze with fiery rage when ProPublica published a story detailing Facebook’s leaked policies regarding how the company determines what is — and what isn’t — considered offensive, and therefore ban-worthy.
The logic behind Facebook’s use of “protected groups” and “subsets,” which Uproxx‘s Dan Seitz details here, didn’t do much to alleviate the general sense of anger at the time. What’s more, it does little to explain why, in the wake of the #MeToo movement, Facebook is still reportedly banning women for saying things like “men are scum” and “men continue to be the worst” while the trolls who harass them go unpunished. According to The Daily Beast, that is precisely what is happening on the platform — even when the troll accounts who respond in droves to such sentiments threaten physical violence. When Boston comedian Kayla Avery wrote “men continue to be the worst” because she “felt helpless to stop their hate,” Facebook banned her account for 30 days. Noting that it was her third such ban, Avery explained, “There was one guy who was threatening to find my house and beat me up. I got banned before I could even successfully report it.” The Daily Beast reached out to Facebook for comment:
When reached for comment a Facebook spokesperson said that the company is working hard to remedy any issues related to harassment on the platform and stipulated that all posts that violate community standards are removed.
When asked why a statement such as “men are scum” would violate community standards, a Facebook spokesperson said that the statement was a threat and hate speech toward a protected group and so it would rightfully be taken down.
In other words, the platform’s previously leaked policy that gives “men” status as a “protected group” still applies, though the spokesperson did “[clarify] that this is because all genders, races, and religions are all protected groups under Facebook’s current policy.” The women that reporter Taylor Lorenz interviewed suggested “internalized misogyny on the behalf of Facebook’s content moderation team” was to blame. But given that over one billion people use Facebook and its team of moderators numbers only 7,000, it’s more likely that “things slip through the cracks.”