Meta Cuts Fact-Checkers: What the Policy Shake-Up Means for Free Speech, Democracy, and Sex Workers

January 7, 2025

Mark Zuckerberg announced this morning that Meta will be eliminating fact-checkers and making significant changes to its content policies. In a video posted on Facebook, he outlined the reasoning behind these shifts, which he frames as an effort to combat censorship. Here’s the breakdown:

  • On censorship: “Government and legacy media have pushed to censor more and more. And a lot of this is clearly political,” he stated. He argued that Meta’s systems, while imperfect, could result in millions of voices being silenced due to even a 1% error rate.
  • Fact-checkers replaced: Meta will phase out fact-checkers, replacing them with a “community notes” system akin to X (formerly Twitter). Zuckerberg claimed that fact-checkers have become too politically biased, eroding user trust.
  • Simplified content policies: Restrictions on controversial topics such as gender and immigration will be removed, with policies simplified to reduce ambiguity.
  • New moderation approach: Low-priority policy violations will now require user reports before any action is taken. Additionally, filters for flagged content will require higher confidence levels before content is removed.
  • Political content returns: Civic and political content, previously restricted, will be reinstated to allow more open discourse.
  • Relocating moderation teams: Trust and safety teams, including content moderators, will be moved from California to Texas, which Zuckerberg claims will mitigate concerns about political bias.
  • Partnership with Trump: Meta plans to collaborate with President-elect Trump to resist government efforts “to go after American companies and push censorship.”
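Two of the claims above are quantitative: the “1% error rate” scale argument and the promise that filters will demand “higher confidence levels” before removal. Here is a minimal sketch of how those two ideas interact. Everything in it is invented for illustration — Meta has not published its decision volumes, classifier scores, or thresholds — it only shows the arithmetic of the argument.

```python
# Hypothetical numbers, for illustration only: Meta has not disclosed
# its real moderation volumes, scores, or thresholds.

def removals(scores, threshold):
    """Count posts whose (hypothetical) violation score clears the removal bar."""
    return sum(1 for score in scores if score >= threshold)

# Zuckerberg's scale point: even a small error rate is huge in absolute terms.
decisions = 1_000_000_000            # assumed number of moderation decisions
error_rate = 0.01                    # the 1% figure from the announcement
print(int(decisions * error_rate))   # prints 10000000 (ten million mistakes)

# Raising the confidence threshold actions fewer borderline posts --
# fewer wrongful removals, but more violating content left up.
scores = [0.55, 0.72, 0.88, 0.93, 0.97]
print(removals(scores, threshold=0.70))  # prints 4 (lower, old-style bar)
print(removals(scores, threshold=0.90))  # prints 2 (higher-confidence bar)
```

The trade-off the sketch makes visible is the crux of the policy change: a higher threshold shrinks false positives at the direct cost of false negatives, and the announcement only talks about the first half.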

The Implications

While these announcements might sound like a win for free speech advocates, the implications are far more complex. Meta’s platform, with over 3 billion users worldwide, has long struggled to moderate harmful content effectively (arguably by design). From misinformation and hate speech to harassment and exploitation, the platform has frequently been criticized for failing to ensure user safety. These policy changes may exacerbate these challenges rather than resolve them.

Removing fact-checkers in favor of a community-driven notes system raises obvious questions about credibility. Community notes rely on user contributions to flag and contextualize content, but such systems are prone to manipulation and coordinated attacks, particularly on contentious topics. The erosion of trust in fact-checkers that Zuckerberg describes doesn’t mean a decentralized approach will perform any better.

Moreover, rolling back restrictions on sensitive topics like gender and immigration could lead to a resurgence of harmful rhetoric, further polarizing users. Simplifying content policies may reduce ambiguity, but it also opens the door for more unchecked content that could harm vulnerable communities.

What does porn have to do with this?

For years, Meta has censored or outright deleted sex workers’ accounts, often under vague or arbitrary guideline violations. I’d argue that sex workers were among the first to be censored, long before the censorship the alt-right now complains about. It’s unclear whether these new policies will provide clarity or further marginalize this community. Historically, attempts to engage with Meta for more information or resolution have been frustratingly ineffective.

In an ideal world, these changes might mean that previously censored content is reinstated—I mean, big-tittied women are profitable, right? However, Zuckerberg’s focus on “bad” content, including child exploitation, gives me pause. Sex work is often unfairly and incorrectly conflated with trafficking and exploitation, making it likely that sex workers’ content will continue to fall under the “bad” umbrella Zuckerberg describes. With the Kids Online Safety Act (KOSA) gaining traction and support from figures like Trump, Elon Musk, and the Heritage Foundation, the future for sex workers on platforms like Meta remains uncertain.

Relocation and Bias Concerns

The decision to move trust and safety teams from California to Texas reflects a growing narrative within tech companies about regional bias. Zuckerberg’s move suggests an attempt to sidestep what he perceives as a liberal bias in California. But let’s not kid ourselves—bias isn’t solved by changing zip codes. Texas has its own entrenched political leanings that will undoubtedly shape how moderation is handled.

My Take

While Meta has faced legitimate concerns about censorship, these changes seem poised to do more harm than good. The company’s long history of inadequate moderation shows that less oversight and fewer restrictions are unlikely to create a safer or more inclusive platform. Instead, these policies appear designed to maximize profits by capitalizing on divisive, highly engaging content—a strategy that thrives on polarization, not safety.

Zuckerberg has had a real banner year, though not in the way anyone would envy. Meta’s challenges in 2024 have been significant: testifying before Congress, publicly apologizing to victims’ families over cyberbullying tragedies, and facing increased scrutiny from the EU. The European Union’s regulatory environment has undoubtedly fueled his frustration. Meta’s business model relies heavily on targeted advertising, which demands vast amounts of user data. With regulations like the General Data Protection Regulation (GDPR) and the Digital Services Act tightening the reins, Meta has been forced to adapt, paying hefty fines and implementing costly operational changes.

The $840 million antitrust fine didn’t help, nor did the disappointing performance of Threads, Meta’s would-be Twitter competitor. It’s clear that profitability has become the priority, and Zuck is feeling the burn.

But Meta’s challenges aren’t unique—just amplified by its size. With billions of users, the platform has an outsized responsibility to ensure safety, accountability, and fairness. Simplifying policies and removing safeguards might reduce operational headaches, but it risks creating a chaotic and even more harmful environment.

Zuckerberg’s announcement raises critical questions about the balance between free expression and responsible moderation. His push against censorship might resonate with some, but the broader implications—for user safety, public trust, and societal impact—are impossible to ignore.

As Meta forges ahead, it remains unclear whether the platform will thrive or unravel under its new direction. Billionaires like Zuckerberg and Elon Musk distort our worldview through platforms they control, while Jeff Bezos uses his influence to censor journalists. Together, they remind us that democracy is increasingly shaped by their whims and financial gains.

The only certainty? Money will be made. After all, the man needs another chain—or maybe it’s time to erect another statue of his wife. Billionaires: they’re just like us!

***

Help me grow: subscribe to my free email newsletter.