CEI - Competitive Enterprise Institute

01/08/2025 | News release

Hayek on Facebook’s community notes

Meta is going to stop using professional fact-checkers for Facebook posts. My colleague Jessica Melugin is relieved that Meta is finally publicly acknowledging that jawboning, or informal government pressure, has influenced its content moderation decisions.

Other commentators seem to believe that removing professional fact-checkers means removing fact-checking itself from Facebook. This post is for them.

As Jess points out, every content moderation system has tradeoffs, and none are perfect. This includes Twitter-style community notes, which Facebook will soon adopt. But there is reason to believe that community notes will be a better overall fact-checking system than hired fact-checkers. The economist F.A. Hayek had some insights into why that might be. He believed that decentralized, competing solutions are often better than a single centralized plan.

A peer-to-peer system like community notes has three advantages over centralized fact-checkers.

One, professional fact-checkers lack credibility because people think they are biased. This limits their ability to persuade skeptics. People whose posts get fact-checked often falsely cry censorship (only governments can censor). They would rather play the grievance card than admit they were wrong about something.

Two, professionals have limited reach, because there are only so many of them. Facebook users put up 293,000 status updates every minute. The vast majority of posts with misinformation never get fact-checked, because nobody can possibly keep up with them all. Automating the process with algorithms has its own problems, such as flagging posts that break no rules.

Three, professionals might have expertise in a few areas, but a billion-person community like Facebook will have experts in all kinds of niche issues that a small band of hired fact-checkers will know nothing about. Moreover, these niche experts are usually delighted to find opportunities to talk about their interests and will often offer their services for free.

That is similar to how Wikipedia works. In fact, Wikipedia founder Jimmy Wales was directly inspired by Hayek's emphasis on decentralization.

Brookings Institution scholar Jonathan Rauch's book The Constitution of Knowledge: A Defense of Truth gives a brilliant explanation, starting on page 139, of how social norms and institutional design work together to make Wikipedia as accurate as Encyclopedia Britannica, while defusing ideological conflict.

While Wikipedia isn't perfect, it's a good sight better than Twitter or Facebook. On page 144, writing in 2021, Rauch even foresees the community notes model:

Social media do not need to be like Wikipedia, but they can learn from Wikipedia. They are already doing so. Twitter, for example, debuted a new community feedback tool called Birdwatch, modeled on Wikipedia, allowing qualified users to identify and annotate tweets which they think are false or misleading, and also to rate the quality of other participants' annotations.

Social media will likely always fall short of the civility norms we expect in face-to-face conversations, so let's not expect that type of improvement. And plenty of community notes will contain false information. But community notes have fewer bias and credibility problems than professional fact-checkers, and bad ones are easier to fix than a centralized fact-checker's final decision.

Everyone is biased, whether they are professional fact-checkers or community notes writers. That's not the problem. The problem is that most professional fact-checkers are biased in the same direction.

Individual Facebook, Twitter, and Wikipedia users' biases are all over the map. Those biases can check and balance each other in an ongoing, decentralized back-and-forth. Free speech isn't a thing or a place; it's a process, similar to the way Hayek viewed markets. Free speech is a conversation that never ends. Community notes are one way to have that conversation. Despite their flaws, they are likely a better moderating tool than centralized fact-checkers.
