Public Policy & Mental Health

Who Protects Children in Algorithm-Driven Platforms?

By Charu Bigamudra · January 16, 2026

2 a.m. A child's phone buzzes again. The alert is not from a friend, and it is not about a family crisis; it comes from the algorithm. Another clip, another notification, another "I think you'll like this." By morning, the child has lost a night's sleep to anxious scrolling.

This is not merely a matter of screen time; it is a matter of systems engineered to maximize engagement with no regard for child development. Legal professionals now have a name for the phenomenon: "algorithmic parenting."

The Mechanism of Engagement

Social media platforms do not merely display content; they decide what children see, when they see it, and how often. Endless scrolling feeds, dopamine-triggering notifications, and algorithmic recommendations continually push content about eating disorders, self-harm, and anxiety to children who already struggle with these conditions.

Profit Over Profiling: No platform's design should allow a corporation to profit from profiling children.

A Crisis of Responsibility

Governments cannot agree on who is responsible for safety. A WHO study spanning 42 countries found that many governments treat online safety as a family matter rather than a corporate one. Meanwhile, young people are typically left out of decisions about their own well-being.

"We need policies that actually fit young people’s lives. Without including input from young people, regulations will be ineffective."
— Dr. Natasha Azzopardi-Muscat, WHO/Europe.
[Image showing the current global regulatory landscape for online child safety and age verification laws]

A New Regulatory Standard

WHO has called upon governments to reverse the burden of proof. Rather than waiting for evidence of harm before regulations are put into place, governments should require platforms to demonstrate that they provide safe environments for young users.

The Vehicle Analogy

“Social media is like a car,” said Catalina Popoviciu, a Romanian youth advocate. “It can bring individuals together, but driving a car comes with seatbelts and speed limits.”

When governments fail to act, children continue to grow up in environments that track them around the clock with little concern for their safety or welfare. This isn’t just a tech issue; it is a shared failure of responsibility.