Social media executives from Meta, Snap, YouTube, TikTok and X have been called to Downing Street on Thursday for a crucial meeting with Prime Minister Sir Keir Starmer and Technology Secretary Liz Kendall over online safety for children. The tech bosses will face questioning about the steps they are taking to safeguard young people and respond to parents’ concerns, as the government continues its consultation on whether to impose a complete ban on social media for under-16s, in line with Australia’s approach. Sir Keir has emphasised that the meeting will centre on ensuring “social media companies step up and take responsibility”, warning that “the consequences of failing to act are severe” and that the government has a duty to parents and the next generation to put children’s safety first.
The Number 10 Confrontation
Thursday’s gathering constitutes a critical moment in the government’s push to hold tech giants to account for their role in protecting vulnerable young users. The meeting comes at a pivotal juncture, with Parliament having rejected calls for an outright ban on social media for under-16s just hours earlier, despite backing from the House of Lords. Rather than implement a blanket prohibition, MPs chose to grant ministers powers to introduce their own restrictions, signalling the government’s preference for a more tailored regulatory approach over a sweeping legislative ban.
The timing of the Downing Street summit highlights the government’s determination to appear decisive on online safety whilst navigating complex political and commercial pressures. Professor Gina Neff of the University of Cambridge’s Minderoo Centre for Technology and Democracy noted that the meeting allows the government to demonstrate it is taking action on digital harms. Downing Street has previously acknowledged that some services have made progress, introducing measures such as disabling autoplay for children by default and giving parents greater oversight of device usage, though critics maintain considerably more must be done.
- Tech leaders questioned about protections for children and responses to parental concerns
- Government consulting on restricting social media for under-16s, drawing on Australia’s example
- MPs rejected an outright ban but gave ministers powers to introduce restrictions
- Some companies have already introduced protections, such as turning off autoplay for children
Parliamentary Rejection and the Wider Debate
Wednesday evening’s Commons vote proved a blow to supporters of a comprehensive social media ban for under-16s, marking the second time MPs have rejected such measures despite strong support from the House of Lords. The government’s decision to favour ministerial flexibility over legislative action reflects a more cautious strategy, with ministers arguing that an outright ban would be premature given ongoing policy discussions. The approach gives the government room for manoeuvre in designing tailored controls rather than a sweeping ban that some fear would prove difficult to enforce consistently across platforms.
The rejection has intensified debate over whether the UK is doing enough to protect its young people from online harms. Whilst the government contends that granting ministers powers to implement bespoke rules represents a more sensible solution, critics argue this approach lacks the decisive action the situation demands. Recent research from Australia, where a ban on social media for under-16s came into force in December 2025, shows that more than 60 per cent of underage users continue to access platforms regardless, raising serious questions about the efficacy of legal prohibitions and suggesting the challenge goes well beyond simple restrictions.
Cross-Party Criticism
The parliamentary decision has drawn sharp criticism from opposition benches. Conservative shadow education secretary Laura Trott accused Labour MPs of letting down parents and children by rejecting the ban, arguing that other nations are acknowledging social media’s harms whilst the UK falls behind under the current government. Liberal Democrat education spokeswoman Munira Wilson echoed these concerns, asserting that “the time for half-measures is over” and calling for immediate action to restrict the most destructive platforms for young users rather than incremental regulatory adjustments.
Australia’s Cautionary Tale
Australia’s experience with platform restrictions offers a cautionary case study for policymakers considering similar measures in the UK. When the country introduced its ban on social media for under-16s in December 2025, it was hailed as a landmark step in safeguarding young users from digital risks. However, emerging research from the Molly Rose Foundation paints a troubling picture: more than 60 per cent of young Australians continue using social media platforms despite the legal ban. This substantial rate of non-compliance suggests that legislative bans alone may be insufficient to stop determined young users from reaching the platforms they want to use.
The Australian findings carry significant implications for the UK’s ongoing policy deliberations. If a similar ban were implemented in Britain, the evidence suggests enforcement would present formidable challenges, with young people likely to circumvent age-verification systems and restrictions through various technical means. The data undercuts arguments that a simple legislative prohibition is a silver-bullet solution to online safety concerns, instead highlighting the need for a more holistic approach combining regulatory frameworks, platform accountability, parental oversight tools, and digital literacy education to meaningfully address the risks young people face online.
| Key Finding | Implication |
|---|---|
| Over 60% of underage Australians still access social media despite ban | Legislative prohibitions alone cannot effectively prevent determined young users from accessing platforms |
| Ban introduced in December 2025 has failed to achieve widespread compliance | Enforcement mechanisms remain weak and young people find workarounds to restrictions |
| Blanket bans do not address underlying appeal of social media to young people | Multi-faceted approach combining regulation, platform accountability, and education is necessary |
Leading Specialists Call for Concrete Steps
Child safety advocates and online protection specialists have stepped up demands for tech companies to take meaningful action beyond self-regulation. The Molly Rose Foundation, set up in memory of 14-year-old Molly Russell, who took her own life after viewing harmful material online, has been particularly vocal in demanding systemic change. Rather than sweeping prohibitions that prove difficult to enforce, campaigners argue the priority should be making companies accountable for the algorithms that push harmful content to at-risk individuals.
Andy Burrows, head of the Molly Rose Foundation, has stressed that Thursday’s Downing Street meeting represents a pivotal moment for government action. The charity has repeatedly maintained that platforms possess the technological means to implement robust safeguards, yet often prioritise engagement metrics over user welfare. Experts emphasise that genuine protection requires platforms to overhaul their algorithmic recommendations, strengthen content moderation, and provide parents with meaningful tools to monitor their children’s online activity effectively.
The Algorithm Issue
At the centre of these concerns are the algorithmic systems that determine what content young users see. These algorithms are engineered to maximise engagement, often pushing sensational, harmful, or addictive content to at-risk groups. Overhauling these mechanisms is one of the most pressing challenges in digital safety, requiring platforms to be transparent about how their recommendation engines operate and what protective measures are in place.
- Algorithms prioritise engagement over user wellbeing and safety
- Platforms must increase disclosure of algorithmic recommendation processes
- External reviews of algorithmic harm are crucial for accountability
What Happens Next
Thursday’s summit at Downing Street will set the tone for the government’s approach to online child safety in the months ahead. Following the meeting, Sir Keir Starmer and Liz Kendall are expected to set out their conclusions and decide whether existing voluntary commitments from tech companies are sufficient or whether stronger statutory intervention is needed. The government remains midway through its consultation on whether to introduce an Australia-style ban on social media for under-16s, with the outcome of this week’s talks likely to shape the final policy direction.
Ministers have signalled a preference for giving themselves powers to introduce restrictions rather than enacting an outright ban, citing concerns over practical implementation and effectiveness. However, mounting pressure from opposition parties, child safety advocates, and parents suggests the government will face sustained calls for more decisive action. The weeks ahead will be pivotal in determining whether digital platforms can demonstrate real commitment to keeping young users safe or whether Westminster will legislate to enforce tougher safety requirements.