
AUSTIN (KXAN) — Social media giant Facebook is working on a major revamp of the algorithms that monitor and police hate speech on its platform.

Known as the WoW Project, the effort aims to be faster and more proactive in removing slurs and demeaning comments from posts — while prioritizing speech against some communities over others, the Washington Post reports.

While speech against “white people” and “men” will still be considered hateful, it will not be among what’s considered “Worst of the Worst.”

The algorithm will be split into four quadrants, WaPo explains. Those four areas will be:

  • “People Disagree” — content that’s provocative but not agreed upon as harmful
  • “Men are Trash” — content Facebook says would undermine its efforts to police more harmful speech
  • “Fair Enough” — content that isn’t of the highest priority but should come down
  • “Worst of the Worst” — content consistently considered harmful

Facebook has always monitored hate speech, but it will no longer treat comments like “I hate white people” with the same seriousness as comments like “I hate Jewish people.”

“We know that hate speech targeted towards underrepresented groups can be the most harmful, which is why we have focused our technology on finding the hate speech that users and experts tell us is the most serious,” said Facebook spokeswoman Sally Aldous to WaPo. “Over the past year, we’ve also updated our policies to catch more implicit hate speech, such as content depicting Blackface, stereotypes about Jewish people controlling the world, and banned Holocaust denial.”

Facebook’s de-prioritization of anti-white hate speech is also an attempt to keep up with sweeping societal change, as protests and campaigns for racial justice took center stage in the wake of George Floyd’s in-custody death earlier this year.

Why prioritize hate against certain communities over others?

Facebook’s reasoning for de-prioritizing speech against white people lines up with what scholar Stanley Fish calls a “key distinction” in a 1993 article for The Atlantic.

“Someone will always say, ‘But two wrongs don’t make a right; if it was wrong to treat blacks unfairly, it is wrong to give blacks preference and thereby treat whites unfairly.’ This objection is just another version of the forgetting and rewriting of history,” Fish says.

Fish, who is white, explains that to consider anti-white speech the same as anti-black speech “would be bizarre.”

“… The hostility of one group stems not from any wrong done to it but from its wish to protect its ability to deprive citizens of their voting rights, to limit access to educational institutions, to prevent entry into the economy except at the lowest and most menial levels, and to force members of the stigmatized group to ride in the back of the bus.”

Meanwhile, Fish argues that hostility toward white people stems from wrongs actually done to the groups expressing it.

He says that while it’s wrong to treat anyone unfairly, Black people have suffered far worse than mere unfairness: “They have been bought, sold, killed, beaten, raped, excluded, exploited, shamed, and scorned for a very long time. The word ‘unfair’ is hardly an adequate description of their experience.”