Where do we even begin with Meta? The harbingers of death? The false prophets of organizing for democracy? A Mr. Monopoly man with far less flamboyance or charm than the game's mascot? However you define Meta, in 2025, they're disingenuous at best and utterly diabolical at worst. But somewhere in there, there's incredible nuance—something Sarah Wynn-Williams seeks to shed light on in her new memoir, Careless People: A Cautionary Tale of Power, Greed, and Lost Idealism.
Facebook meant something to me — as I know it did to so many others. I spent high school, college, and the years shortly after community organizing, working on violence prevention in schools. I oversaw a group of youth organizers across Illinois, pushing for LGBTQ protections and helping students advocate for queer health, history, and policy changes in their districts. In 2011, I — along with a team of incredibly passionate and smart organizers — successfully lobbied for the Prevent School Violence Act, the first of its kind in Illinois to explicitly protect transgender students and include cyberbullying under its scope.
Facebook was the tool we didn't know we needed. Sure, there were BBSes and old-school forums for reaching community, but young people weren't really there. MySpace was cluttered with follower trains (former scene kid here, rawr xD) and wasn't made for organizing. But Facebook felt electric. It felt hopeful. For many of us, it felt like change.
And it was — just not in the way we had hoped.

The Tragedy of Nostalgia
Sarah Wynn-Williams, former director of global public policy at Facebook, worked at the company from 2011 to 2017 before abruptly exiting (not of her own accord). In Careless People, she recounts witnessing Facebook putting employees in dangerous situations, ignoring hate speech and sexual assault allegations, destabilizing elections, contributing to the decline of young girls' mental health, and perhaps most damning, offering white-glove surveillance services to the Chinese Communist Party.
The book doesn’t make Meta look good. In fact, it positions them as a central force in global unrest, digital harm, and real-world violence. And yet, none of it feels shocking anymore.
Wynn-Williams paints a tragically nostalgic picture of Facebook in the late 2000s and early 2010s, and honestly, I felt that. I had shut that part of myself off — the part that once believed in these platforms, that thought connection could be revolutionary. But her story cracked something open. It reminded me of just how much I believed in Facebook. I’ve been a cynic for so long it’s hard to remember those days.
Some of the most disturbing moments in the book feel like they've barely rippled through public consciousness — not because they're insignificant, but because we've grown so desensitized to this level of harm. Like in March 2016, when Facebook's Vice President for Latin America, Diego Dzodan, was arrested on his way to work in Brazil. The Brazilian government had requested WhatsApp data for a drug trafficking investigation, but Facebook claimed it didn't technically control WhatsApp — despite owning it. Zuckerberg messaged Dzodan after the arrest using Messenger, spinning it into a PR moment about the company's commitment to privacy. It became another chance to posture as a noble platform, rather than confront the reality: employees' lives and legal systems were collateral in a corporate game. And I know the current tech industry doesn't care much for its employees, but damn, Mark.
And then there’s China — a market Meta never officially entered, but actively pursued. Wynn-Williams recounts internal discussions around building out Points of Presence (PoPs) and dedicated servers in China, systems that could have given the Chinese government access to user data. Even users who weren’t located in China but connected with people who were might have had their data stored on those servers. These weren’t vague hypotheticals — they were active negotiations. Eventually, the deal didn’t go through.

Panera Bread Bowls and Burnout at the End of the Internet
Before my January 6 corporate brand crisis, the cracks in Meta were already showing for me. I wrote about that crisis separately, but the real shift came in 2018. I had felt so isolated for so long—I was the only digital person on my team, carrying the emotional weight of hate speech and burnout alone.
By 2018, I had been working in social media professionally for three years. I had transitioned out of community organizing in search of something more creative. I was running digital for a queer healthcare company, consulting for sex workers, and watching the platforms crumble in real time.
Our paid ads at the queer healthcare company were repeatedly rejected for frivolous reasons. We often highlighted primary care services, specialty care like diabetes management, and more politicized offerings like HIV care. Despite their relevance, our ads were consistently mislabeled as political. We had to sanitize them, often stripping out key parts of our services just to get approved.
At the same time, hate speech was piling up in our comment sections. Vile content flooded our DMs. Our Meta reps offered little to no support around moderation or escalation. The platforms themselves became nearly impossible to use.
I remember a moment at the Creating Change Conference in 2018, trapped in the fluorescent grip of the Detroit Marriott. There's a certain level of hell associated with attending a corporate conference surrounded by a sea of ill-fitting polyester suits. Never mind that I'd have to submit my expense reports and explain to finance how I stress-ate five Panera bread bowls in four days. I was sitting with 15–20 other social media managers working for LGBTQ centers and queer brands, and every single one of us had similar experiences. Meta's platforms had become unusable. We were all burned out, traumatized, and isolated in our work.
I’ll say this: the isolation wasn’t entirely Meta’s fault. It was also our employers’ consistent lack of nuance around what social media managers actually do. Thankfully, I’ve been able to transition into an equally toxic but far better-compensated industry: tech.

Two Truths at Once
Wynn-Williams presents herself as an observer in Careless People — someone trying to steer Facebook toward better choices, cautioning leadership, advising on policy. And I believe some of that. But I also think it’s more complicated. She wasn’t just watching; she was in the room. She witnessed it all. And while she shares how disturbed she was by much of what happened, I wanted more from her. More reflection. More clarity. This wasn’t just about profit margins — we’re talking about Meta offering surveillance infrastructure to an authoritarian regime. There is complicity here, and I wish that had been more directly acknowledged.
I’ve seen some of the criticism of the book — Kara Swisher and Scott Galloway on Pivot called it Joe Rogan–style sensationalism, which I found to be rather unfair and rude. Memoirs aren’t white papers. They’re lived experience. They’re shaped by memory, perspective, and emotion. That’s how she experienced it. That doesn’t make every detail immune to scrutiny, but it also doesn’t invalidate the story.
I think what I ultimately wanted was for Sarah to go even deeper. I imagine putting out something like this is incredibly difficult. You’re exposing yourself to the criticism of the world — and Meta, in particular, has made their feelings known. They’re furious she didn’t submit the book for internal fact-checking prior to publication. An interesting stance from a company that has done away with fact-checkers in favor of Benson Boone’s Grammy jumpsuit.
But for all the layers of conflict here, I still come back to this: what does it mean to have been part of something harmful? What does it look like to live with that? To reckon, not just report? I want that reflection. What has life looked like post-Meta?

The Human Cost of Community
At the heart of this book is community moderation — and the way it's so often overlooked, underfunded, and deprioritized.
One story that stuck with me was the situation in Myanmar. In 2014, hate speech targeting the Rohingya Muslim minority was spreading across Facebook. Wynn-Williams and her policy team flagged the issue early, only to be told that a single Burmese contractor had been hired to manage it — and that everything was fine.
Months later, a Facebook post went viral, falsely claiming that a Buddhist woman had been raped by a Muslim man. The story was fabricated, but it didn't matter. It had already been widely shared, including by Ashin Wirathu — a Buddhist monk often referred to as "the Burmese bin Laden" for his long history of inciting religious violence. A mob formed, violence erupted, and on July 4, 2014, the government blocked Facebook entirely.
The moderation team in Dublin insisted the posts didn’t violate Facebook’s community guidelines — not because they had reviewed and approved them, but because no one on the team could read Burmese. Eventually, the sole contractor was reached — but he was out to dinner and unable to access his laptop. Hours passed. In the end, Wynn-Williams bypassed the usual chain of command and escalated the issue to a team in California, who removed the content. Facebook was unblocked within minutes. But by then, the damage had already been done.
This is what happens when moderation is treated as an afterthought. When violence is filtered through a corporate spreadsheet. When community is reduced to “content.”

The Power of a 24-Hour Disappearing Story
People think platforms "make it" because of features. I think platforms win because of community. Sorry to all the product and engineering teams working their asses off on whatever fresh hell of a feature is coming next.
Twitter became unusable when it started platforming Nazis. Instagram lost its soul when it stopped feeling real. And Facebook? Facebook killed its own community when it abandoned the chronological feed, cannibalized every feature it could from its competitors — most notably Snapchat Stories — and handed full control over to the algorithm.
Community moderation isn’t sexy. It’s not a growth hack. Without thoughtful, human-centered moderation, there is no community. There is no safety. There is no truth. Just whatever the platform decides you should see — and whoever pays the most to be seen.

Gaslight, Gatekeep, Global Collapse
I think there’s a lot more Mark Zuckerberg could be doing—especially given the obscene amount of resources at his disposal. Careless People makes it clear: there were countless chances for him to make different choices. But he didn’t.
And that’s the thing about social media: it’s fake things pretending to be real, shaping our lives in deeply real—and often harmful—ways. In some cases, it’s violence. In others, it’s the slow erosion of how we see ourselves, our habits, our bodies, our worth. It’s all become one massive gaslight blur.
So I keep asking: how do we get out of this? And who’s responsible?
Of course, there’s personal responsibility. We need to start fact-checking more, reflecting on what we’re choosing to believe, remembering that social media is always curated—whether by creators selectively showcasing their lives, or by tech oligarchs deciding what’s relevant to us.
We’ve become way too reliant on platforms as arbiters of truth. Whether it’s TikTok, Twitter, Instagram, Facebook—whatever. None of them are built for truth. They’re built for engagement. And the two are not the same.
And then there’s institutional accountability. The U.S. government, in particular, needs to start treating this seriously. Mark and his peers know exactly how much influence they have—not just on American culture, but global democracy. And yet, so many elected officials still think a Finsta is a threat to national security. They need to care more. Learn more. Do more.
Not in a “let’s deplatform trans people for existing because we think they’re indoctrinating children” kind of way. (Looking directly at you, Kids Online Safety Act.) But in a “let’s actually regulate data collection, algorithmic bias, targeted ads, hate speech, and moderation infrastructure” kind of way.
But hey—what do I know? I’m just the girl behind some of your favorite global brand accounts.
What I do know is this: we all need to examine the role social media plays in our lives. How have we contributed to the community? To the chaos? To the toxicity? How does it live in us? And how do we take even a little bit of that space back?
In my own work, I've tried to shift how I operate in this space. I advise clients on reducing ad spend on platforms like Meta and X—though it's not easy. I push for diversification: decentralized platforms, creator partnerships, loyalty programming, even testing underused platforms like Pinterest or Nextdoor (which still have their own moderation messes, by the way).
No one has moderation figured out yet. Pinterest might be the closest.
I’m also trying to personally disengage more. Spend less time on these apps. Have more real conversations with people who, like me, are building their careers inside the machine.
I don’t know what’s next for social media.
But I’m heartbroken by what it’s become.
The thing I once loved—this chaotic, weird, connective digital world—has been twisted into something that causes harm. I don’t know how to fix it.
But I know we have to keep talking about it.