Personal responsibility is failing. A youth-led coalition, Design It For Us, is abandoning "digital minimalism" in favor of structural platform regulation.
The personal responsibility approach (quitting cold turkey, digital minimalism) does not work in practice. The social cost of leaving is too high; the resulting ostracism means kids inevitably redownload the apps. The burden currently falls on teenagers and parents to fight supercomputers and thousands of engineers whose sole job is to capture their attention.
The last federal law governing kids and the internet was COPPA, passed in 1998, before modern social media existed. That law is the only reason 13 is the standard minimum age: COPPA requires parental consent to collect data from children under 13. The legal framework is fundamentally pre-algorithmic.
We don't tell a parent to figure out whether a car is safe before letting their kid drive it; the government establishes baseline safety standards for the manufacturer. The coalition argues that the same duty of care should apply to digital environments, rather than pushing safety down to the end user.
There is a stark contrast between what tech executives build and what they let their own families consume. Founders who place strict limits on their own children's screen time, or keep their kids' faces off the internet, nonetheless refuse to implement the same baseline protections for the public infrastructure they operate.
There is an experience inversion: older lawmakers and millennial tech founders did not grow up with algorithmic feeds in their pockets during their most vulnerable developmental years. Gen Z, which did, holds the actual domain expertise on the psychological damage these products cause.
The coalition explicitly opposes bills that mandate age verification or hand parents more surveillance tools. Instead, it supports structural, architecture-level changes for minors: