Facebook Decides Holocaust Denial Content Is Bad, Actually

Facebook has, for years, intentionally looked the other way when users shared content that denied or distorted the Holocaust.

That may finally be changing.

Facebook’s vice president of content policy, Monika Bickert, said Monday the company is updating its hate speech policy to prohibit Holocaust denials and distortions. 

A “well-documented rise in anti-Semitism globally and the alarming level of ignorance about the Holocaust, especially among young people,” Bickert said, prompted the long-overdue change.

The announcement makes no mention of how Facebook itself contributed to that anti-Semitic rise. The social media platform has become a clearinghouse for misinformation concerning virtually every subject, including Holocaust denials and anti-Semitism in general.

It’s unclear how Facebook intends to enforce the expanded policy, or how it will define content that violates it.

“Enforcement of these policies cannot happen overnight,” Bickert acknowledged in the announcement. “There is a range of content that can violate these policies, and it will take some time to train our reviewers and systems on enforcement.”

The company told HuffPost it will apply the policy to all of its users, including politicians. Politicians have enjoyed lax enforcement of Facebook’s community standards, thanks to a loophole the social media company created that protects their posts as “newsworthy content.”

Facebook had resisted calls to take down Holocaust denial content going back to at least 2011, when 21 Holocaust survivors pleaded with the company to deny access to users who promoted the conspiracy theory that Nazis didn’t murder 6 million Jews during WWII.

“By allowing this hate propaganda on Facebook,” the group warned the company in a letter, “you are exposing the public and, in particular, youth to the anti-Semitism which fueled the Holocaust.”

At the time, Facebook nevertheless decided Holocaust denial didn’t violate its terms.

A Facebook spokesperson declined to address why the company suddenly reversed course, but it’s clear Facebook is coming to terms with the role it’s played in radicalizing groups, sometimes with violent ends.

After a teen vigilante was accused in the shooting deaths of two protesters in Kenosha, Wisconsin, over the summer, Facebook acknowledged it erred in taking no action against militia groups that used the platform to plot violence and encourage each other to “shoot to kill.”

In recent weeks, the company has also drastically increased its policing of QAnon content, and now bans all groups, pages and Instagram accounts that openly identify with the Trump-supporting conspiracy theory. The FBI has classified the group as a potential domestic terrorism threat.
