The U.S. has long exported its culture abroad — think Coca-Cola, Hollywood and hip-hop. Facebook was once praised for spreading free-speech values. But the world is pushing back with different values, which Facebook is importing to the U.S. with the company's new ban on white nationalist content.
The ban goes into effect worldwide. Experts and advocates who've long lobbied Facebook to pull potentially dangerous speech say the move — which is wrapped up in the U.S. culture wars — is the result of international pressure forcing the company's hand.
A Facebook spokesperson says that under the new rules, users can't post in a celebratory way on its News Feed or Instagram: "I'm a white nationalist!" But they can post: "I'm a black nationalist!"
John Spier, a Facebook user in Central California, says that's "ridiculous." Spier, a self-described libertarian, says everyone should have freedom of expression. "Even if they're an idiot. There's a lot of idiots in the world who say a lot of stupid things. We don't need to protect people from that," he says.
What racism is and who can be racist is a debate that's getting louder in the U.S. Spier says that while Facebook claims to be a neutral platform, the company is taking the liberal side.
"I know that the current popular mode of thought is that only white people can be racist," he says. "But I don't agree with that. I grew up as a minority white person in a largely Latino community and believe me, I know what racism feels like."
(Note: Facebook is among NPR's financial sponsors.)
According to Facebook leaders and civil rights advocates, this issue is not about speech but about safety. It's a well-documented fact: White extremists around the world are using social media to recruit young people, luring them into organized hate groups and promoting lone-wolf acts of terror.
Heidi Beirich, director of the Southern Poverty Law Center's Intelligence Project, has been tracking extremist groups online for years, and making this point to Facebook: "White supremacists are as much a global movement and interconnected — in other words sharing ideas, sharing money, sharing tactics, sharing propaganda, visiting each other ... just like you see with Islamic extremists."
Dylann Roof, who murdered nine people in a church basement in Charleston, S.C., in 2015, got radicalized online. The white nationalist rally in Charlottesville, Va., in 2017 that left three dead and dozens injured was organized on a Facebook page. The shooter charged with attacking two mosques in Christchurch, New Zealand, killing 50 people, live-streamed the massacre on Facebook — an incredibly powerful broadcast tool.
Beirich says Facebook's latest move is in reaction to the PR disaster in New Zealand, as well as pressure from law enforcement — particularly those in Europe, who are worried about white extremist gunmen.
"That realization is dawning on the intelligence communities worldwide, and Facebook is hearing it from them," she says.
A spokesperson tells NPR that the tech company made the unusual move of adding the Christchurch massacre footage to a terrorism database that had been focused on Islamic extremism. The spokesperson says the company will continue to add white extremist content to the shared industry database, which tech giants use to censor the most violent content.
Australia just approved legislation that threatens social media employees with prison time if they don't remove violent content "expeditiously." The United Kingdom is weighing similar rules. Germany has passed tough hate speech laws that carry fines of up to 50 million euros.
"The U.S. is behind the eight ball on this," Beirich says. "[President] Trump doesn't seem to be interested in these issues at all. And I think Facebook is reacting to that in a good way, I would argue."
About 90 percent of Facebook users are outside the U.S., and the largest single market is now India.
Last week, Facebook CEO Mark Zuckerberg called on governments and regulators around the world to create, in effect, a global standard for speech. That has never existed before. It's a long shot. But as Zuckerberg sees it, that's what needs to be engineered next.
Copyright 2020 NPR. To see more, visit https://www.npr.org.