Zuckerberg Explains to Joe Rogan Why Facebook Censored the Hunter Biden Laptop Story

(Patrick Carroll, Foundation for Economic Education) In a recent Joe Rogan podcast, Meta CEO Mark Zuckerberg explained the chain of events that led Facebook to censor the Hunter Biden laptop story and how that censorship took place.

The story, which many argue was a key factor in the 2020 election, was infamously banned by Twitter shortly after it came out. At the time, Facebook announced it was “reducing its distribution” on the platform, but offered minimal details as to how or why.

In late October of 2020, Zuckerberg explained in a Senate hearing that the FBI had warned Facebook to be on “heightened alert” about “hack and leak operations” leading up to the election, which is part of what led the company to temporarily throttle the story.

Though Zuckerberg did not divulge any particularly new information with Rogan, the conversation is causing quite the stir and has increased public attention on the federal government’s role in social media censorship.

“Basically the background here,” Zuckerberg said, “is the FBI I think basically came to us, some folks on our team, and was like ‘Hey, just so you know, you should be on high alert. We thought that there was a lot of Russian propaganda in the 2016 election. We have it on notice that basically there’s about to be some kind of dump similar to that, so just be vigilant.’”

Zuckerberg stressed that Facebook didn’t outright ban the story like Twitter did, but only reduced its distribution.

“For the…I think it was five or seven days when it was basically being determined whether it was false, the distribution on Facebook was decreased, but people were still allowed to share it. So you could still share it, you could still consume it,” Zuckerberg continued.

“So when you say ‘the distribution is decreased.’ How does that work?” Rogan asked.

“Basically the ranking in news feed was a little bit less. So fewer people saw it than would have otherwise,” Zuckerberg replied.

“By what percentage?”

“I don’t know off the top of my head. But it’s…it’s meaningful.”

After commenting on the “hyper-political” nature of the story, Zuckerberg reiterated how the FBI’s comments were a key factor in the decision.

“We just kind of thought, ‘hey look, if the FBI’—which I still view as a legitimate institution in this country, it’s very professional law enforcement—‘if they come to us and tell us that we need to be on guard about something, then I want to take that seriously.’”

“Did they specifically say you needed to be on guard about that story?” Rogan asked.

“No, I don’t remember if it was that specifically,” Zuckerberg replied, “but it basically fit the pattern.”

For years, content creators have wondered whether Big Tech platforms like Facebook have been silently censoring their content. Because these platforms are such black boxes, it’s impossible to say for certain why some content goes viral while other, equally interesting material seems to struggle. Rumors have long circulated that “the algorithm” favors some stories and accounts over others, but without access to the back end this has been difficult to prove.

Even with Zuckerberg’s recent comments, we still don’t know much about how Facebook engages in censorship or exactly what kind of content is being censored. But if we read between the lines of his comments, we can deduce this much: there is a group of people who control which content gets amplified, and if they are even suspicious of your content, that’s going to hurt its reach.

This alone should be chilling. The idea that a panel of “experts” is deciding which stories become national news and which ones never take off is deeply troubling. Yes, misinformation exists, but the dangers of allowing a small, likely biased group to control the national conversation are far greater than the dangers of allowing “wrong” or “misleading” information to spread.