It’s Kids All the Way Down
Newly unsealed court filings, as reported by TIME, say the quiet part out loud: Meta knew. Not suspected—knew.
Knew sex trafficking flourished on its platforms—and tolerated a “17×” strike policy that let accounts rack up sixteen violations before deletion. Knew child sexual abuse reports were triaged below spam or IP disputes. Knew employees flagged harms and executives chose growth. This isn’t a company failing to protect kids. It’s a company protecting its business model from accountability.
This pattern doesn’t stop at Meta. YouTube. TikTok. Snapchat. The playbook is stable: deny the under-13 audience exists, avoid the obligations that follow, harvest the attention.
A small story that isn’t small
This morning I was texting my brother about Raul Malo’s death. He’d heard the news but didn’t really know The Mavericks. I sent him Paolo Nutini’s 2006 Jools Holland clip, still electric after all these years.
I wasn’t signed in. No history. Before I could close the tab, the next ad was for erectile-dysfunction pills.
A twelve-year-old discovering the same clip would have seen the same ad. That kid is not an edge case. He’s the median user the recommendation stack is tuned for—and the one the legal strategy pretends isn’t there.
The ecosystem pattern
In No One Planned This, I argue that platforms publicly insist they serve adults while quietly optimizing into the kids who over-index on time spent. The labels are legal. The behavior is economic.
YouTube says it’s 13-and-up. That line isn’t a safety promise; it’s an operating model. Keep the 13+ label to dodge COPPA obligations, enjoy Section 230’s shield over what the recommendation engine serves, and never say out loud who’s actually there. The product optimizes into the traffic it reads like weather radar: after-school TV spikes, weekend mornings, charts full of under-10 creators, comment sections littered with “I’m 7” and “I’m 9.” The company keeps a straight face. The machine keeps paving the path.
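None of that traffic reading requires asking anyone’s age. As a deliberately crude illustration, and emphatically not any platform’s actual model, here is the kind of heuristic a ranking pipeline could learn from behavior alone; the `Session` fields and `likely_under_13` function are invented for this sketch:

```python
from dataclasses import dataclass

@dataclass
class Session:
    start_hour: int            # local time, 0-23
    weekday: bool
    kids_content_share: float  # fraction of watch time on kid-skewing videos
    avg_session_minutes: float

def likely_under_13(s: Session) -> bool:
    """Crude stand-in for signals a real pipeline would learn.

    No birthday required: an after-school weekday spike, a library
    dominated by kid-skewing videos, and long forgiving sessions are
    already a fingerprint.
    """
    after_school = s.weekday and 15 <= s.start_hour <= 18
    weekend_morning = (not s.weekday) and 6 <= s.start_hour <= 10
    return ((after_school or weekend_morning)
            and s.kids_content_share > 0.7
            and s.avg_session_minutes > 45)
```

A real system wouldn’t use an if-statement; it would learn the same fingerprint from millions of sessions. The point stands either way: the signal is in the behavior, not the birthday.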
The tell
If YouTube truly believed it served only 13-and-up, three realities would already exist:
- Video-level classification at upload, not a channel toggle that treats Bluey and Deadpool the same.
- Contextual identity on shared TVs: default to the youngest viewer present (both of these are sketched after this list).
- Quarterly safety audits districts can request, not PR PDFs nobody reads.
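To make the first two concrete, here is a minimal sketch under loud assumptions: `AudienceLabel`, `label_at_upload`, `effective_audience`, and `can_recommend` are invented names, not YouTube’s API. The design point is only that the label lives on the video and the session defaults down, not up:

```python
from dataclasses import dataclass
from enum import Enum

class AudienceLabel(Enum):
    MADE_FOR_KIDS = 0
    GENERAL = 1
    MATURE = 2

@dataclass
class Video:
    video_id: str
    channel_id: str
    label: AudienceLabel  # lives on the video, not the channel

def label_at_upload(video_id: str, channel_id: str,
                    label: AudienceLabel) -> Video:
    """Attach an audience label to this specific upload, so one channel
    can host both Bluey and Deadpool without lying about either."""
    return Video(video_id, channel_id, label)

def effective_audience(viewer_ages: list[int]) -> AudienceLabel:
    """Shared-TV rule: the session defaults to the youngest viewer present."""
    youngest = min(viewer_ages)
    if youngest < 13:
        return AudienceLabel.MADE_FOR_KIDS
    if youngest < 18:
        return AudienceLabel.GENERAL
    return AudienceLabel.MATURE

def can_recommend(video: Video, session: AudienceLabel) -> bool:
    """A video is eligible only if its own label fits the session tier."""
    return video.label.value <= session.value
```

In this sketch, a living-room session with a 7-year-old and a 41-year-old resolves to the kids tier, so a mature trailer from the same channel is simply ineligible. Nothing about that is hard; it’s just more expensive than one toggle.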
After the 2019 COPPA settlement, in which YouTube paid a record $170 million, the company added a channel-level “Made for Kids” switch that pushed mixed accounts (think Disney uploading preschool cartoons alongside Marvel trailers) to pick one label for everything. In 2025, Disney paid $10 million to the FTC over mislabeled kid-directed videos on its YouTube channels: a downstream receipt for a design choice every parent feels when a “safe” queue suddenly isn’t.
Those three absences aren’t technical. They’re the business model doing its job: minimize cost and friction, maximize inventory and plausible deniability.
Why parents and educators feel it
Parents don’t experience policy. They experience drift: autoplay that slips; Shorts that don’t end; a session that starts with a cartoon and lands somewhere no adult in the room would choose. Educators experience the finger hovering over the mute button because the sidebar suggested something they can’t endorse. Libraries feel procurement risk: public screens meant for learning, with no guarantee the next suggestion stays inside the fence.
Recommendation systems don’t care why a path works—only that it works. Eight-to-twelve-year-olds are reliable: after school, long sessions, forgiving tastes, relentless curiosity. Those paths get paved. Everyone downstream pays for that efficiency: attention dysregulation at home, classroom chaos, constant negotiation with a product that refuses to admit who it serves.
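To see how paths get paved without anyone deciding to pave them, consider a toy sketch, not any platform’s production system: an epsilon-greedy bandit whose only reward is watch time. Every name here (`WatchTimeBandit`, `recommend`, `record`) is invented for illustration:

```python
import random
from collections import defaultdict

class WatchTimeBandit:
    """Epsilon-greedy recommender whose only reward is watch time.

    There is no age field anywhere in here. If one audience reliably
    watches longest, its favorites climb the ranking on their own.
    """

    def __init__(self, items, epsilon=0.1):
        self.items = list(items)
        self.epsilon = epsilon
        self.pulls = defaultdict(int)   # times each item was shown
        self.watch = defaultdict(float) # total seconds watched per item

    def recommend(self):
        # Explore occasionally; otherwise exploit the best average watch time.
        if random.random() < self.epsilon:
            return random.choice(self.items)
        def avg(i):
            return self.watch[i] / self.pulls[i] if self.pulls[i] else float("inf")
        return max(self.items, key=avg)

    def record(self, item, seconds):
        self.pulls[item] += 1
        self.watch[item] += seconds

bandit = WatchTimeBandit(["news_clip", "homework_help", "toy_unboxing"])
for _ in range(2000):
    item = bandit.recommend()
    # After-school traffic: the under-13 crowd watches unboxings longest.
    seconds = {"news_clip": 60, "homework_help": 90, "toy_unboxing": 300}[item]
    bandit.record(item, seconds + random.gauss(0, 10))

print(bandit.recommend())  # almost always "toy_unboxing"
```

Nothing in the model mentions age. If the under-13 crowd delivers the longest sessions, their favorites win the ranking automatically; that is the whole mechanism.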
Minimum viable honesty
Who benefits, who pays. Platforms monetize the predictability of kids’ watch time. Families and schools absorb the drift. Gains are concentrated. Harms are diffuse.
Why it persists. The shields are strong (Section 230’s liability immunity, COPPA’s bright line at 13). The 13+ label is convenient. Doing it right would add cost, add friction, shrink inventory, and, above all, require admitting under-13 usage at meaningful scale.
What this piece does. Names the pattern and stops there. The solutions live in the work that comes next.
The machine already knows exactly how old your child is.
It just isn’t allowed to admit it.
No One Planned This: How Platforms Rewired Entertainment—out February 3, 2026.
