How Monopolies Harm Content Moderation Online

Industrial organization and content moderation online are inextricably linked, with just a few corporations owning and operating the most important online communications platforms. In a recent paper, I show that those corporations’ market power is at the core of dysfunction in the digital public sphere.

Monopolized but unregulated communications infrastructure erodes public discourse. First, market concentration and platform size increase the stakes of individual content moderation decisions and, with them, the potential costs of an error. A misjudgment by Mark Zuckerberg, who has the final say on high-stakes content moderation decisions for 240 million American social media users, may have irreparable consequences for democracy. Second, market concentration creates a technological and governance monoculture with single points of failure. Design flaws in the centralized architecture can easily have systemic impacts; Facebook’s outage in the fall of 2021 and foreign interference with elections fall into that category. Third, the fewer platforms there are, the easier it becomes for governments to enlist them for censorship. More often than not, the economic interests of digital platforms are compatible with governments’ suppression of dissent. Instead of using their power to push back against authoritarian overreach, digital monopolists have embraced cozy relationships with whoever holds political power.

Monopolized communication infrastructure also harms individual stakeholders. It delivers worse content moderation than more pluralistic arrangements would. Bad content moderation is a classic form of monopoly rent extraction in a barter exchange. Wittingly or not, users trade their data, attention, engagement, and content for platform services, which include, among other things, content moderation. While keeping the monetary price of those services at zero, digital platforms can extract monopoly rents by effectively overcharging: they lower the quality of content moderation and, with it, their own costs. At the individual level, this worsens users’ experiences online; at the systemic level, it harms discourse.

Furthermore, platforms can get away with discriminatory behavior because individuals lack alternatives. It comes as little surprise that content posted by marginalized groups gets taken down at disproportionate rates. The absence of credible exit options, in turn, limits the potential for internal change through voice.

Stricter antitrust enforcement is imperative, but contemporary antitrust doctrine alone cannot provide sufficient relief in the digital public sphere. First, a narrowly understood consumer welfare standard overemphasizes easily quantifiable, short-term price effects. Harm to public discourse is all but impossible to articulate as consumer welfare loss in ways that would satisfy the courts. Second, the levels of concentration necessary to trigger antitrust scrutiny of unilateral behavior far exceed those of a market conducive to pluralistic discourse. Third, current doctrine’s focus on specific anticompetitive conduct ignores the structural dysfunction that bottlenecks create in public discourse.

Three types of remedies may help to address the market power problem behind dysfunction in the digital public sphere. First, mandating active interoperability between platforms would reduce lock-in effects, which enable market monopolization. Second, scaling back property-like exclusivity online might spur follow-on innovation, contributing to more pluralism at the content moderation level. Third, no-fault liability and broader objectives in antitrust doctrine would establish more effective counterweights to concentrating effects in the digital public sphere. While these pro-competitive measures cannot solve all online woes, they would lower the stakes of content moderation decisions, provide incentives for investments in better decision-making processes, and contribute to more pluralistic discourse.

To paraphrase an older line of case law, the current market conditions among digital communication platforms fall short of “providing an environment conducive to the preservation of our democratic political and social institutions.” To improve content moderation and fix public discourse, we must first change the underlying industrial organization of the digital public sphere.

This post comes to us from Professor Nikolas Guggenberger at the University of Houston Law Center. It is based on his recent paper, “Moderating Monopolies,” available here.