Reclaiming Childhood in the Digital Age: Why a Public Health Approach to Social Media Matters
March 12, 2026
A new report from the Centre for Young Lives has issued a clear warning to policymakers: the debate about protecting children online risks becoming stuck in the wrong argument.
With MPs considering whether access to harmful social media platforms should be restricted to over 16s, the report argues that the national conversation has been framed as a false choice between raising age limits and improving platform design. In reality, both are urgently needed.
The report, Reclaiming Childhood in the Digital Age: A framework for regulating social media platforms, calls for a comprehensive package of protections that treats digital harms as a public health issue rather than a matter of individual responsibility.
With 96% of 13–17-year-olds now using social media, the scale of exposure is unprecedented. At the same time, concerns about the impact on young people’s wellbeing, sleep, attention, and mental health are increasingly raised by parents, teachers, clinicians, and young people themselves.
The question facing policymakers is no longer whether harms exist, but how society should respond.
A debate stuck in the wrong place
Much of the current discussion has focused on whether raising the minimum age for social media platforms to 16 is the right approach.
The Centre for Young Lives argues that this framing risks missing the bigger picture.
An age limit alone cannot solve every problem. But nor can improvements to platform design on their own address the scale of exposure that currently exists.
Instead, the report argues that age limits and stronger platform regulation should be seen as complementary measures within a broader framework of protection.
In this context, raising the age limit to 16 is described as a “harm-pausing” intervention. It would not eliminate risk entirely, nor prevent every child from finding ways around restrictions. But it could significantly reduce exposure while deeper reforms to platform design and regulation are implemented.
A rapidly evolving digital environment
One of the central challenges highlighted in the report is the speed at which social media platforms evolve compared with the pace of research and policy.
Digital environments change rapidly, while the evidence base on harms takes time to develop. Meanwhile, the most detailed data on user behaviour and platform design remains largely in the hands of technology companies themselves.
This creates an uneven playing field for policymakers and researchers attempting to assess risk.
The report describes this situation as a “burning platform”, where harm is being experienced in real time while the systems designed to evaluate and regulate those harms struggle to keep pace.
Waiting for perfect evidence in this context, the authors argue, risks embedding delay and exposing another generation of children to avoidable harm.
A public health approach to digital harms
To address this imbalance, the report calls for a shift in how society understands and regulates social media.
Rather than treating online harms as issues that can be solved through parental monitoring, digital literacy, or individual behaviour change alone, the report recommends adopting a public health model.
This approach recognises that when exposure is widespread across a population, responsibility cannot rest solely with families.
Parents are navigating an environment where social media platforms are designed to capture attention, maximise engagement, and retain users for as long as possible. In such an environment, opting out is often neither realistic nor socially easy for young people.
A public health approach therefore places greater responsibility on the systems themselves. The digital environment must meet minimum safety standards so that families are not left to compensate for unsafe defaults.
Shifting the burden of proof
A central proposal in the report is the introduction of a precautionary regulatory approach.
In many areas of consumer safety, companies must demonstrate that their products are safe before they can be widely marketed. Toys, medicines, and vehicles all operate within regulatory frameworks that place safety obligations on manufacturers.
The report argues that social media platforms should be treated in a similar way.
Instead of researchers having to prove that platforms are harmful, the burden of proof should shift to technology companies to demonstrate that their products are safe for children to use.
This would represent a significant change in how digital technologies are regulated, recognising the scale of influence these platforms now have in children’s lives.
A wider vision for childhood
Importantly, the report does not focus solely on restrictions.
It also highlights the need to rebuild the wider environments that support children’s development and wellbeing.
This includes investment in safe outdoor spaces, youth services, arts and sports opportunities, and child-friendly neighbourhood design. The report proposes a Play and Recreation Levy on major social media platforms to help fund these alternatives.
The aim is not simply to limit exposure to harmful digital environments, but to ensure that children have meaningful opportunities for connection, play, creativity, and development both offline and online.
A moment for policy leadership
The report was published ahead of the recent vote in Parliament on proposals to raise the minimum age for access to harmful social media platforms to 16. While debate continues about the most effective regulatory approach, the report highlights an important point: meaningful protection for children is unlikely to come from any single measure. Age limits, stronger platform regulation, and investment in children's environments should be seen as complementary parts of a wider framework.
For health professionals, educators, and those working with children every day, the patterns are increasingly familiar. Concerns about sleep, attention, social development, and mental wellbeing are appearing across different settings and age groups.
These experiences do not provide all the answers, but they offer important signals about where risks may lie.
The decisions now facing policymakers are therefore not only about technology regulation. They are about how society chooses to protect childhood in an environment where digital platforms have become central to daily life.
Listening to those who work most closely with children, including clinicians, teachers, and families themselves, will be essential in shaping solutions that are both effective and realistic.
Reclaiming childhood in the digital age will require thoughtful regulation, responsible innovation, and a shared commitment to putting children’s wellbeing at the centre of policy decisions.