DCN - Digital Content Next

03/19/2026 | News release

Social media laws should focus on social media

In California, jurors heard testimony that echoes far beyond the courtroom as a warning for the digital age. In a major social media liability case that concluded last week, families described how their children slipped into patterns of compulsive use that preceded a serious mental health crisis. In one case, a teenage girl spent hours each night scrolling algorithmically curated feeds, pulled back again and again by notifications and social validation. Over time, her parents said, that behavior led to isolation, anxiety, and depression: outcomes that mirror a growing body of research on social media's impact on adolescents. The plaintiffs argued that these outcomes are not accidental. They are the predictable result of features designed to maximize engagement.

These stories are increasingly common. They are the reason dozens of lawsuits, investigations, and public health warnings have converged on a troubling conclusion: social media companies have built platforms deliberately designed to capture and hold the attention of young users, whatever the cost, in order to maximize profit.

Engineered for attention, but at what cost?

Features such as likes, notifications, and algorithmic feeds create feedback loops that keep users coming back. They deliver small bursts of social validation that can make it difficult, especially for younger users, to step away. As Sean Parker, Facebook's founding president, famously acknowledged, these platforms were engineered to provide users with "a little dopamine hit every once in a while" so they keep coming back.

For teenagers whose brains and social identities are still developing, these design choices can have profound consequences. Studies increasingly link heavy social media use among adolescents to anxiety, depression, and body image concerns. Filters, curated images, and constant comparisons can intensify feelings of inadequacy, while endless scrolling and late-night notifications disrupt sleep and emotional well-being. Many of these lawsuits allege that Meta, owner of Facebook and Instagram, knew the most about these impacts but chose to suppress that knowledge. That's one of the reasons courtroom arguments draw a compelling comparison to Big Tobacco.

Lawmakers need to act, and to get social media regulation right

It's no surprise lawmakers are looking for ways to respond. Across the country, proposals in California, Texas, Utah, and Alabama, such as the Age-Appropriate Design Code and the App Store Accountability Act, aim to address harms to children online. While the motivation is understandable, many of these bills cast too wide a net and, as a result, risk creating new problems while trying to solve existing ones.

Instead of narrowly targeting the design features and business models driving these harms, they often sweep in the broader digital ecosystem. As a result, the regulation affects not just social media platforms but also news organizations, educational services, and nonprofits. That raises serious First Amendment concerns. Laws that affect speech must be narrowly tailored, and courts have already shown skepticism toward broad, vague attempts to regulate online content.

Measures that require broad content restrictions or impose vague compliance obligations on publishers are also particularly vulnerable to legal challenges. And, if struck down, these efforts could further delay meaningful progress in addressing the real harms associated with social media.

A complex ecosystem requires precise solutions

There's a better path: Instead of regulating the entire internet, lawmakers should focus their attention on large social media platforms whose business models depend heavily on algorithmic amplification of user-generated content. These companies derive enormous revenue from engagement-driven advertising models that reward keeping users on their platforms for as long as possible.

Policies aimed at limiting manipulative design features for minors, increasing transparency around algorithms, and establishing reasonable duties of care could address these issues without sweeping in good-faith actors.

Congress has already begun exploring this more targeted approach. The Kids Online Safety Act (KOSA), which has attracted overwhelming bipartisan support, focuses specifically on platforms that rely predominantly on user-generated content and algorithmic recommendation systems. By concentrating on the companies whose products create the greatest risks for young users, KOSA offers a more precise model for addressing online safety concerns.

Overbroad regulation and unintended consequences

That narrower focus is critical not only for constitutional durability but also for avoiding unintended consequences.

If new legislation imposes sweeping compliance obligations, such as complex age verification systems, extensive data governance requirements, or new liability frameworks, many news organizations could struggle to meet them. Smaller publishers in particular lack the legal and technical resources required to implement costly regulatory regimes designed with massive social media companies in mind.

Possibly even more troubling, some proposals could inadvertently restrict teenagers' access to credible, fact-checked journalism. If platforms respond to regulatory risk by broadly limiting content for minors, young people could find themselves cut off from some of the most reliable sources of information available online. Teenagers (and everyone else) need more access to reliable journalism from publishers who take responsibility for it. Policies designed to protect young people should not inadvertently make it harder for them to find credible reporting.

Mounting evidence of social media's impact on youth mental health demands a serious policy response. But effective regulation must be precise. Broad legislation may seem decisive, but it risks violating constitutional protections, burdening responsible publishers, and limiting access to reliable information.

A more focused approach that targets the design practices and business incentives of the largest platforms offers a better path forward. It should hold the most powerful platforms accountable for the environment they create and the choices they make about how young users interact with their services. If policymakers maintain that focus, they can address a generational public health challenge while preserving an open and diverse online ecosystem.
