Meta Ditches Fact-Checkers in Favor of X-Style Community Notes


Meta announced Tuesday that it is abandoning its third-party fact-checking programs on Facebook, Instagram, and Threads, replacing its army of paid moderators with a Community Notes model that mimics X’s much-maligned volunteer program, which allows users to publicly flag content they believe to be incorrect or misleading.

In a blog post announcing the news, Meta’s newly appointed chief global affairs officer Joel Kaplan said the decision was made to allow more topics to be openly discussed on the company’s platforms. The change will first affect the company’s moderation in the US.

“We will allow more speech by lifting restrictions on some topics that are part of mainstream discourse and focusing our enforcement on illegal and high-severity violations,” Kaplan said, though he did not detail what topics these new rules would cover.

In a video accompanying the blog post, Meta CEO Mark Zuckerberg said the new policies would see more political content returning to people’s feeds as well as posts on other issues that have inflamed the culture wars in the US in recent years.

“We’re going to simplify our content policies and get rid of a bunch of restrictions on topics like immigration and gender that are just out of touch with mainstream discourse,” Zuckerberg said.

Meta has now significantly rolled back the fact-checking and content moderation policies it had put in place in the wake of revelations in 2016 about influence operations conducted on its platforms, which were designed to sway elections and in some cases promote violence and even genocide.

Ahead of last year’s high-profile elections across the globe, Meta was criticized for taking a hands-off approach to content moderation related to those votes.

Echoing comments Mark Zuckerberg made last year, Kaplan said that Meta’s content moderation policies had been put in place not to protect users but “partly in response to societal and political pressure to moderate content.”

Kaplan also blasted fact-checking experts for their “biases and perspectives,” which he said led to over-moderation: “Over time we ended up with too much content being fact checked that people would understand to be legitimate political speech and debate,” Kaplan wrote.

However, WIRED reported last year that dangerous content like medical misinformation has flourished on the platform, while groups like anti-government militias have used Facebook to recruit new members.

Zuckerberg, meanwhile, blamed the “legacy media” for forcing Facebook to implement content moderation policies in the wake of the 2016 election. “After Trump first got elected in 2016 the legacy media wrote non-stop about how misinformation was a threat to democracy,” Zuckerberg said. “We tried, in good faith, to address those concerns without becoming arbiters of truth, but the fact checkers have just been too politically biased and have destroyed more trust than they’ve created.”

In what he attempted to frame as a bid to remove bias, Zuckerberg said Meta’s in-house trust and safety team would be moving from California to Texas, which is also now home to X’s headquarters. “As we work to promote free expression, I think that will help us build trust to do this work in places where there is less concern about the bias of our teams,” Zuckerberg said.
