Better by Design: What the Tech Liability Rulings Could Mean for All of Us
Recent landmark rulings found major tech platforms legally responsible for harms to young users, specifically harms tied to how their products were designed. Courts determined that the architecture of these platforms, not just the content on them, contributed to harm.
This is significant. And it's worth sitting with before jumping to conclusions about what it means.
The rulings focus on harms to young users, and that's where the concern is most acute. But the design practices the courts identified are not age-specific.
What the courts identified is a design problem. Design problems have design solutions.
The design practices at issue — systems engineered to maximize engagement, features that amplify emotional content, algorithmic structures that keep users scrolling — don't stop working on us when we turn 16, or 18, or any other age. Adults are not immune to these forces. We get pulled in, nudged, distracted, and shaped by the same systems. Young people may be more susceptible to some of these effects — developmentally, that's plausible and worth taking seriously. But they are not the only ones affected, and framing this exclusively as a children's safety issue obscures something important about what these platforms actually do.
Age bans and access restrictions address the question of who can use these platforms. They don't address how the platforms work. A 15-year-old blocked from a social media app and an adult with full access are both living in an information environment shaped by the same engagement-maximizing logic: in their news feeds, their video recommendations, their messaging apps. The problem doesn't disappear at a particular age, or by moving to a different platform built on the same model.
This is why I think the more interesting question isn't who should be allowed on these platforms, but what we should expect the platforms to do differently. Algorithmic transparency. Limits on manipulative features. Design choices that prioritize wellbeing over engagement time. These changes would protect young people — and the rest of us too.
The rulings establish that companies can be held responsible for the environments they build. That's a meaningful shift. But accountability only leads somewhere useful if we're clear about what we're asking for. Protecting kids by restricting access is one answer. Demanding that the digital environment itself be built better is a different one — and a more durable one.
We spend a lot of energy telling kids how to navigate the internet. Less energy insisting that the internet be worth navigating.
This might be the moment to change that.