There was a time when Mark Zuckerberg did not consider the mainstream media an enemy. He even allowed me, a card-carrying member of the legacy press, into his home. In April 2018, I ventured there to hear his plan to do the right thing. The visit was part of my years-long embed with Facebook to write a book. Over the previous two years, Zuckerberg's company had been heavily criticized for its failure to rein in misinformation and hate speech. Now the young founder had a plan to solve the problem.
Part of the solution, he told me, was more content moderation. He would hire more people to review posts, even if it cost Facebook a significant amount of money. He would also ramp up efforts to use artificial intelligence to proactively remove harmful content. “Giving people tools to say what they want and then just having our community flag it and trying to respond after the fact is no longer enough,” he told me as we sat in his solarium. “We need to get more involved and play a more active role.” He admitted he had been slow to realize how harmful Facebook's toxic content was, but he was now determined to fix the problem, even if it took years. “I think we're doing the right thing,” he told me. “We just should have done it sooner.”
Seven years later, Zuckerberg no longer thinks more moderation is the right thing to do. In a five-minute video, he described his earlier embrace of it as a regrettable concession to government pressure over Covid and other topics. He announced a retreat from content moderation—proactively taking down and demoting misinformation and hate speech—and the end of a fact-checking program aimed at debunking lies going viral on his platforms. Fact-checking by trusted sources will be replaced by Community Notes, a crowdsourcing approach in which users weigh in on the veracity of posts. That is exactly the technique he told me in 2018 was “not enough.” While he admitted that his changes would allow “more bad stuff,” he said that in 2025 it was a worthwhile price for letting more “free expression” flourish.
The policy change is one of many moves indicating that, willingly or not, Meta is positioning itself in sync with the new Trump administration. You've heard the litany, which has become a meme in itself. Meta promoted its top lobbyist, former Republican operative Joel Kaplan, to head of global affairs; he immediately appeared on Fox News (and only Fox News) to introduce the new policies. Zuckerberg also announced that Meta would move its content moderation and review staff from California to Texas to “help remove the concern that biased employees are overly censoring content.” He disbanded Meta's DEI program. (Where is Sheryl Sandberg, once so proud of Meta's diversity efforts? Sheryl? Sheryl?) And Meta changed specific terms of service to allow users to demean LGBTQ people.
A week has now passed since Meta's changes—and since my first viewing of Zuckerberg's announcement—and I remain haunted by one aspect of it: He seemed to dismiss the basic practice of classic journalism, characterizing it as no better than the unreported observations of podcasters, influencers, and countless random people on his platforms. This was implicit in his Reel when he repeatedly used the term “legacy media” as a slur: a force that, in his view, promotes censorship and stifles free speech. All this time I thought it was the opposite!
A hint of his revised view of credibility comes from the move from fact-checkers to Community Notes. It's true that the fact-checking process hasn't worked well—in part because Zuckerberg failed to defend the checkers when bad-faith critics accused them of bias. It's also reasonable to expect Community Notes to provide a useful signal that a post may be erroneous. But the power of refutation fails when participants in a conversation reject the idea that disagreements can be resolved by convincing evidence. That's the core difference between the fact-checking Zuckerberg has eliminated and the Community Notes he's implementing. The fact-checking worldview holds that certain facts can be established—through research, talking to people, and sometimes even believing one's own eyes. The trick is to recognize the authorities who have earned the public's trust by pursuing the truth. Community Notes welcomes alternative viewpoints—but it's up to you to judge which viewpoints are trustworthy. There's something to the idea that the antidote to bad speech is more speech. But if verifiable facts cannot successfully refute easily disproved flapdoodle, then we're stuck in a Babelian quicksand.
That is the world Donald Trump, Zuckerberg's new role model, has deliberately worked to bring about. 60 Minutes reporter Lesley Stahl once asked Trump why he kept insulting reporters who were simply doing their jobs. “You know why I do it?” he replied. “I do it to discredit you all and demean you all so that when you write negative stories about me, no one will believe you.” In 2021, Trump revealed more about his intent to profit from attacking the truth. “If you say it enough and keep saying it, they'll start to believe you,” he said at a rally. A corollary: if social media promotes enough falsehoods, people will believe those too. Especially if previously recognized authorities have been discredited and demeaned.