Holding Big Tech to Account to Keep Children Safe Online
Over recent weeks, I’ve been contacted by hundreds of parents, carers, and teachers across Newbury and West Berkshire, all raising the same concern: social media is failing our children.
People have written to me about harmful content, relentless algorithms, lost sleep, anxiety, and the constant pressure on young people to stay online. I want to thank everyone who has taken the time to share their experiences with me. Your voices are shaping this conversation, and they matter.
As a parent myself, I share these concerns. For far too long, big tech companies have treated children as data to be mined rather than young people to be protected. Platforms have been deliberately designed to be addictive, encouraging endless doom-scrolling at the expense of children’s mental health, wellbeing, and development.
Many parents in Newbury and West Berkshire have also told me they worry that blanket bans would simply push teenagers onto less secure platforms, beyond the reach of safeguards and regulators. That's not protecting children; it's driving the problem underground. That's why a harm-based age-rating system is so important: it tackles the risk at its source.
The Liberal Democrats are calling for a film-style age rating system for social media platforms, mirroring the protections we already trust offline. Platforms would default to a 16+ rating, move to 13+ where genuine safety reforms are made, or face an 18+ rating where extreme content is allowed to spread.
This isn’t about policing every post. It’s about holding platforms to account for how they are designed and how seriously they take child safety. Crucially, our approach would give Ofcom the powers it currently lacks, including the ability to issue “business disruption orders” so it becomes more costly for tech giants to break the rules than to follow them.
This approach also avoids sweeping up vital resources. Children would still be able to access services like Wikipedia for schoolwork, crisis support such as Childline, and safe ways to stay connected with family and friends.
It's also why I've made the decision to step away from X. I can no longer in good conscience use or promote a platform that has repeatedly failed to act decisively on deeply harmful content involving children. Stepping back isn't about leaving the debate; it's about refusing to be complicit in a system that prioritises profit over safety.
The Liberal Democrats were the only party to call for the immediate suspension of X in the UK while Ofcom investigates, and for the National Crime Agency to urgently examine potential criminal activity. While action has finally been taken under pressure, accountability must go further.
The message from Newbury and West Berkshire is clear, and it’s one I fully share: if a platform spreads harmful content or relies on addictive, damaging algorithms, it should not be allowed anywhere near our children.
Just as we set clear standards to protect children offline, we must now do the same online, and make sure those rules are properly enforced. I will continue to press the government to act, and to listen to families locally who are rightly demanding better.
