Meta announced Tuesday that it will abandon its third-party fact-checking programs on Facebook, Instagram, and Threads, replacing its army of paid moderators with a Community Notes model that imitates X's much-maligned volunteer program, which allows users to publicly flag content they believe is inaccurate or misleading.
In a blog post announcing the news, Meta's newly appointed chief global affairs officer, Joel Kaplan, said the decision was made to allow more topics to be openly discussed on the company's platforms. The change will first affect the company's moderation in the US.
“We will allow more speech by lifting restrictions on certain topics that are part of mainstream discourse and focusing enforcement on illegal and high-severity violations,” Kaplan said, although he did not detail which topics the new rules would cover.
In a video accompanying the blog post, Meta CEO Mark Zuckerberg said the new policies will bring more political content back to people's feeds, along with posts on other issues that have sparked culture wars in America in recent years.
“We will simplify our content policies and remove a series of restrictions on topics like immigration and gender that are no longer relevant to mainstream discourse,” Zuckerberg said.
Meta is significantly rolling back fact-checking and scrapping the content moderation policies it introduced following revelations in 2016 of influence operations carried out on its platforms, designed to sway elections and in some cases promote violence and even genocide.
Ahead of last year's high-profile elections around the world, Meta was criticized for taking a hands-off approach to moderating content related to those votes.
Echoing comments Mark Zuckerberg made last year, Kaplan said that Meta's content moderation policies were put in place not to protect users but “partly in response to political and social pressure for content moderation.”
Kaplan also criticized fact-checking experts for their “biases and perspectives,” which he said led to excessive moderation: “Over time we ended up with too much content being fact-checked that people would understand to be legitimate political speech and debate,” Kaplan wrote.
However, WIRED reported last year that dangerous content such as medical misinformation has flourished on the platform, while groups such as anti-government militias have used Facebook to recruit new members.
Meanwhile, Zuckerberg blamed “legacy media” for forcing Facebook to implement content moderation policies after the 2016 election. “After Trump was first elected in 2016, the legacy media wrote nonstop about how misinformation was a threat to democracy,” Zuckerberg said. “We tried, in good faith, to address those concerns without becoming the arbiters of truth, but the fact-checkers have been too politically biased and have destroyed more trust than they created.”
In what he framed as an effort to eliminate bias, Zuckerberg said Meta's internal trust and safety team will move from California to Texas, which is also currently where X is headquartered. “As we work to promote free speech, I think it will help us build trust to do this work in places where there is less concern about the bias of our teams,” Zuckerberg said.