
Deplatforming Backfired


Social Media

Zach Weissmueller | 12.22.2025 11:15 AM

When Donald Trump was kicked off social media in 2021, liberal pundit Matthew Yglesias tweeted, "It's kinda weird that deplatforming Trump just like completely worked with no visible downside whatsoever." Two years later, Fox News fired Tucker Carlson, and Rep. Alexandria Ocasio-Cortez (D–N.Y.) celebrated that "deplatforming works," though she worried about the long-term implications.

"I also kind of feel like I'm waiting for the cut scene at the end of a Marvel movie after all the credits have rolled, and then you like see the villain's hand reemerge," said Ocasio-Cortez.

As it turns out, every major star gets a sequel. Trump is back in the White House, and Carlson has a bigger audience than ever before.

What we've learned is that deplatforming doesn't work.

In 2021, I published a video at Reason predicting this backfire effect, comparing the media ecosystem to Freud's theory of the unconscious:

Sigmund Freud theorized that when thoughts or experiences are repressed, they inevitably resurface in more deranged and damaging forms. When our dominant communication platforms seek to repress widely held beliefs and opinions, those beliefs and opinions aren't likely to simply disappear but rather reemerge elsewhere.

What we've learned since the Great Deplatforming of 2021, and the subsequent rise of extremist commentators like Nick Fuentes, is that the best way to exorcise our demons is to confront them head-on.

It's understandable why progressives thought they'd won the platform wars in the early 2020s. Remember when Amazon Web Services deplatformed Parler, the right-wing social media network? Or when Facebook suppressed and Twitter blocked a completely accurate New York Post story about Hunter Biden's laptop?

Remember when the New York Times' Kevin Roose wanted President Joe Biden to appoint a "reality czar," and for a moment it looked like that was actually going to happen?

Remember when scientists from elite institutions were shadow banned for expressing opinions about COVID-19 that turned out to be correct? Or when the Stanford economist and physician Jay Bhattacharya had his Twitter account secretly throttled for, among other things, saying that the lockdowns were counterproductive?

Back then, things were looking pretty bleak for those of us who still subscribe to John Perry Barlow's idealistic 1996 vision of the internet as a place "where anyone, anywhere may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity."

And then, three years ago, Elon Musk bought Twitter and everything changed.

Self-described free speech "absolutist" Musk invited journalists to examine the "Twitter Files," exposing how the federal government had coerced private companies to suppress critical speech. That's how we learned that Bhattacharya's Twitter account had been secretly throttled.

Francis Collins, then director of the National Institutes of Health (NIH), had called Bhattacharya a "fringe epidemiologist" and requested a "quick and devastating" takedown of his heretical views.

Bhattacharya ended up taking Collins' job as director of the NIH. And Martin Kulldorff, a Harvard professor fired for refusing the COVID-19 vaccine, is now chief science officer for U.S. Health and Human Services.

The "villain's hand" Ocasio-Cortez worried about not only reemerged, but has a full grip on the levers of power.

Musk replaced Twitter's third-party establishment sources serving as "fact checkers" with the community writ large. When Biden tweeted that "the 28th Amendment is the law of the land," a community note was appended observing that the 28th Amendment doesn't exist. The New York Post had its account restored after it was vindicated on the Hunter Biden laptop story, although it did get a community note for sharing a fake bigfoot video.

The success of this crowd-sourced approach to fact-checking inspired Mark Zuckerberg to adopt a similar system.

"I was really worried from the beginning about basically becoming this sort of decider of what is true in the world," Zuckerberg said on The Joe Rogan Experience in January. "That's, like, kind of a crazy position to be in for billions of people using your service."

Community Notes is a 21st-century instantiation of John Stuart Mill's concept of a marketplace of ideas, where thinkers clash, compete, and arrive at a consensus. It's a better method for getting at the truth than simply trusting institutional gatekeepers.

"When people who usually disagree on something agree, they get higher ranked," Zuckerberg explained. "You're showing more information, not less."

The vibe shift was real. Banned accounts were reinstated. A different type of content began to dominate X's "For You" algorithm.

And people started to share their real preferences.

In his 1995 book Private Truths, Public Lies, Duke University political scientist Timur Kuran introduced the concept of "preference falsification," where........

© Reason.com