Meta Ditches Fact-Checking Program, Wants You to Pitch In With Community Notes
Meta is ending its third-party fact-checking program on Instagram and Facebook, the company said on Tuesday, and will instead use Community Notes, a user-driven moderation system similar to the one used by X. It is also removing restrictions on topics such as gender and immigration.
“It’s time to return to our roots of free expression on Facebook and Instagram,” Meta CEO Mark Zuckerberg said in a video announcing the change, citing a speech he gave at Georgetown University in 2019 advocating for free speech.
Over the past decade or so, including during Trump’s first administration, Meta built increasingly sophisticated content moderation systems in response to social and political pressure. Zuckerberg said in the video that those systems don’t always work as intended: “The problem with complex systems is that they make mistakes. Even if they accidentally censor just 1% of posts, that’s millions of people. We’ve reached a point where it’s just too many mistakes and too much censorship.”
Zuckerberg described the 2024 US election, in which Donald Trump won a second presidential term, as a “cultural tipping point” and said the company will once again prioritize speech by simplifying its policies and reducing errors.
The changes come two weeks before Trump’s inauguration, as Meta faces ongoing criticism over its handling of misinformation, accusations of political bias, and scrutiny of its platforms’ broader social impact.
One of the biggest challenges facing social media companies over the past decade has been deciding what content is allowed on their platforms and what is removed, including political and medical misinformation and hate speech. Critics have long accused social networks, particularly Facebook, Twitter (now X) and YouTube, of censoring speech. Others, however, say some form of oversight is crucial in the face of a deadly pandemic and a rising tide of disinformation.
Introducing Community Notes
In the US, Meta will now roll out Community Notes, which lets users write and rate notes that add context to potentially misleading posts. Joel Kaplan, Meta’s chief global affairs officer, highlighted the system’s safeguards in a blog post, noting that it requires consensus among people with different viewpoints to help prevent bias. Starting today, users can sign up to become contributors.
Meta also plans to adjust how it enforces its policies to reduce moderation errors. Serious violations, such as those involving terrorism and child exploitation, will still rely on automated systems, but less severe issues will require a user report before action is taken.
Kaplan echoed Zuckerberg’s sentiments, emphasizing a “more personalized approach to political content” that would allow users to control what they see.
“Meta’s platforms are built to be places where people can express themselves freely,” he wrote. “That can be messy … but that’s free expression.”
Additionally, Meta will personalize how users see political and civic content, reversing a 2021 decision to reduce the visibility of such posts. Kaplan called that earlier approach “blunt.” Content from the pages and people users follow will now be ranked on signals such as likes and views, like any other post.
Kaplan said the fact-checking program, launched in 2016 to combat misinformation, evolved into a tool that sometimes stifled legitimate political debate.
“Over time we ended up with too much content being fact-checked that people would understand to be legitimate political speech and debate,” he wrote. “Our system then attached real consequences in the form of intrusive labels and reduced distribution. A program intended to inform too often became a tool to censor.”
Zuckerberg said it will take time to get the new approach right, and that the company still needs to work out the best way to remove the large amount of genuinely illegal material on its platforms. “These are complex systems; they’re never going to be perfect,” he said.