‘Too Politically Biased’
Meta introduced the fact-checking program in 2016 after Donald Trump won that year’s presidential election. The organizations that were part of the program, including Snopes, were able to analyze posts. If they decided a post contained false information, it would either get flagged, with a fact-check note accompanying it, or removed.
“That’s not the way things played out, especially in the United States,” he wrote. “Experts, like everyone else, have their own biases and perspectives. This showed up in the choices some made about what to fact-check and how.”
Shift Toward X Model
Meta has been copying some functions of social media platform X, formerly known as Twitter, in recent years. Meta’s Threads, first introduced as a video messaging application, was later retooled to serve as an X-style platform for short bursts of thought. Zuckerberg also rolled out a premium version following Elon Musk’s rollout of Twitter Blue. Facebook’s latest move is replacing the fact-checking program with an X-style community notes feature drawn from users.
“We’re going to get rid of fact-checkers and replace them with community notes, similar to X,” Zuckerberg said.
Kaplan said X’s approach has worked.
“They empower their community to decide when posts are potentially misleading and need more context, and people across a diverse range of perspectives decide what sort of context is helpful for other users to see,” he said. “We think this could be a better way of achieving our original intention of providing people with information about what they’re seeing—and one that’s less prone to bias.”
Applying for Notes
Community Notes on X relies on contributors, who add assessments to posts they believe contain misleading information. Even some of Musk’s posts have received Community Notes.
X explicitly states that it doesn’t rely on professionals, but rather views regular people as valuable contributors.
Zuckerberg said in 2016, before the fact-checking program was introduced, that Facebook historically had relied on users to help the company understand what was fake and what was not, but that the problem had become so complex that working with fact-checking groups was necessary.
Praise Comes—With Some Opposition
Some panned the development. On the other hand, Musk was not the only person to praise the changes.
“The First Amendment protects social media companies’ editorial choices about the content on their platforms,” Ari Cohn, lead counsel for tech policy for the free speech group FIRE, told The Epoch Times in an emailed statement. “But it’s a good thing if platforms voluntarily try to reduce bias and arbitrariness when deciding what content to host—especially when they promise users a culture of free speech like Meta does.”