Musk, Restore Britain, and the Fight Over the Online Safety Act: Who Really Benefits From Repeal?

In January 2026, Elon Musk posted a message to his 200-million-plus following on X: "Join Rupert Lowe in Restore Britain, because he is the only one who will actually do it!" [1]. The "it" in question is an ambitious legislative agenda that includes repealing the UK's Online Safety Act 2023 — the most significant piece of internet regulation in British history. Musk's endorsement gave oxygen to a movement that has since gathered over 500,000 petition signatures and grown from a grassroots pressure group into a registered political party [2].

But the campaign to scrap the law sits uneasily with the evidence. Polling shows the British public overwhelmingly supports the Act. Child safety organisations say the harms it targets are worsening. And Musk's own platform, X, is under formal investigation by Ofcom for alleged failures to comply with the very law he wants abolished [3].

This is a story about who gets to define online safety — and who pays the price when the rules disappear.

What the Online Safety Act Actually Does

The Online Safety Act received Royal Assent on 26 October 2023 after years of parliamentary debate [4]. It imposes legal duties on platforms to protect users — particularly children — from illegal content including child sexual abuse material (CSAM), terrorist content, and non-consensual intimate imagery. It also requires platforms to conduct risk assessments, implement age verification, and take proactive steps to prevent harm [5].

Ofcom, the UK's communications regulator, was given enforcement powers including fines of up to £18 million or 10% of a company's qualifying worldwide revenue, whichever is greater [3]. The first enforcement codes became legally binding in March 2025, and by October that year Ofcom had launched five enforcement programmes and opened 21 investigations [6].
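The penalty cap works as a floor-and-percentage rule, which can be sketched in a few lines. This is a back-of-envelope illustration only; the revenue figures below are hypothetical, and the Act's actual penalty provisions contain further detail.

```python
# Back-of-envelope sketch of the Online Safety Act's penalty cap:
# the greater of £18 million or 10% of qualifying worldwide revenue.
# The revenue figures used below are hypothetical illustrations.

FLOOR = 18_000_000  # £18m statutory floor

def max_osa_penalty(qwr: int) -> int:
    """Statutory maximum fine in pounds for a given qualifying worldwide revenue."""
    return max(FLOOR, qwr // 10)  # 10% of revenue, never below the floor

print(max_osa_penalty(50_000_000))     # floor applies: £18m (10% would only be £5m)
print(max_osa_penalty(2_000_000_000))  # percentage applies: £200m
```

The practical effect is that the £18 million floor bites hardest on smaller platforms, while for a company the size of X the 10% figure dominates.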

The Act's first financial penalty came against 4chan — a £20,000 fine plus £100 per day for continued non-compliance [7]. Small by Big Tech standards, but a signal of intent.

The Cost of Building — and Potentially Dismantling — the Regulatory Machine

Ofcom's online safety budget has grown rapidly. From £22 million in the 2021/22 financial year — when the regulator was preparing for a law still being debated in Parliament — spending rose to £92 million in 2025/26 [8].

[Chart: Ofcom Online Safety Budget (£ millions). Source: Ofcom FOI / Annual Reports. Data as of 1 March 2026.]

Under the Act's design, this cost is borne not by taxpayers but by the tech industry itself. Companies with qualifying worldwide revenue of £250 million or more pay fees set at roughly 0.02% to 0.03% of that revenue — calibrated to cover Ofcom's costs without exceeding them [9]. The government has described the regime as "cost neutral to the Exchequer" [10].
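The fee regime described above amounts to a simple threshold-plus-rate formula, sketched below. The rates are expressed in basis points (2 bp = 0.02%) to keep the arithmetic exact; the revenue figures are hypothetical, and Ofcom's actual fee schedule may differ in detail.

```python
# Rough sketch of the Act's industry fee regime: companies with qualifying
# worldwide revenue (QWR) of £250m or more pay roughly 0.02%-0.03% of that
# revenue. Revenue figures are hypothetical illustrations of the published
# range, not Ofcom's actual schedule.

QWR_THRESHOLD = 250_000_000  # £250m qualifying worldwide revenue

def annual_fee(qwr: int, rate_bp: int = 2) -> int:
    """Annual fee in pounds at a rate given in basis points (2 bp = 0.02%)."""
    if qwr < QWR_THRESHOLD:
        return 0  # below the threshold, no fee is due
    return qwr * rate_bp // 10_000

print(annual_fee(100_000_000))       # 0: below the £250m threshold
print(annual_fee(1_000_000_000))     # £200,000 at 0.02%
print(annual_fee(1_000_000_000, 3))  # £300,000 at 0.03%
```

Even at the top of the range, a £1 billion-revenue company pays £300,000 a year, which helps explain why the regime is described as cost-recovery rather than punitive.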

Unwinding this infrastructure would not be straightforward. Ofcom has hired specialist staff, built compliance frameworks, issued codes of practice, and entered into enforcement proceedings with dozens of platforms. Any repeal legislation would need to address live investigations, existing compliance obligations already imposed on companies, and the question of whether fees already collected would be refunded. No detailed cost estimate for repeal has been published by any party advocating it.

Musk's Platform Under the Spotlight

Musk's opposition to the Online Safety Act is not purely ideological. His platform, X, faces a formal Ofcom investigation opened in early 2026 over whether it complied with its legal duties following the Grok AI chatbot scandal [3]. Ofcom's probe centres on reports that X's AI tool was used to generate sexualised imagery — including potential CSAM and non-consensual intimate images — raising questions about whether X conducted adequate risk assessments before deploying the feature and whether it took sufficient steps to prevent UK users from encountering illegal content [11].

X has been required to age-restrict certain content to comply with the new rules, and the company has publicly criticised what it calls Ofcom's "heavy-handed approach" and "layers of bureaucratic oversight" [12]. X issued a formal statement arguing the Act risks "seriously infringing" free speech rights, though it acknowledged the law's "laudable" goal of protecting children [13].

If found in breach, X faces potential fines of up to 10% of its global revenue. Musk therefore has a direct financial interest in the law's repeal — a fact that child safety advocates have been quick to highlight [3].

Restore Britain: From Pressure Group to Political Party

Restore Britain was launched on 30 June 2025 as a political movement by Rupert Lowe, the MP for Great Yarmouth [2]. Lowe was elected for Reform UK in 2024 but was suspended by the party in March 2025 after a public dispute with its leadership, amid allegations — later dropped by the Crown Prosecution Service for insufficient evidence — relating to a confrontation with party chairman Zia Yusuf [14].

By February 2026, Restore Britain had registered as a political party and claimed 70,000 members [2]. Its platform extends well beyond tech regulation: it calls for net-negative immigration, a referendum on the death penalty, withdrawal of BBC public funding, and what it terms a "Great Repeal Act" by 2029 to roll back laws its leadership considers overreaching [15].

On funding, Restore Britain describes itself as "funded by its membership and other donations" [16]. It operates a high-donor tier called the Cromwell Club. However, detailed donor disclosures are not publicly available, and no ties to US tech-industry lobbying groups have been publicly documented. HOPE not hate, the anti-extremism campaign group, has flagged concerns about the party's funding transparency and its positioning on the far right of British politics [17]. Various journalists and commentators have described the party as right-wing or hard-right [18].

Musk's endorsement — while significant in amplifying the movement — does not appear to constitute a formal financial contribution. Whether his backing represents a personal conviction about free speech or a strategic interest in reducing regulatory exposure for X is a question the available evidence does not definitively resolve.

The Child Safety Case for the Act

The organisations that helped shape the Online Safety Act include the NSPCC, the Internet Watch Foundation (IWF), and various domestic abuse and counter-terrorism bodies that participated in consultations during the Bill's passage through Parliament [19].

Their central argument is that the harms the Act targets are growing, not receding. The IWF acted on 291,270 webpages containing child sexual abuse imagery in 2024 — a 5% increase on the previous year and the highest figure the organisation has recorded [20]. AI-generated CSAM reached a record high in 2025, with over 8,000 cases identified by the IWF [21]. Nearly 1,900 UK children reported sexual imagery concerns in 2025, a 66% increase, with more than 1,100 confirmed cases involving abuse material [22].

[Chart: IWF Child Sexual Abuse Webpages Actioned. Source: Internet Watch Foundation Annual Reports. Data as of 1 March 2026.]

The NSPCC has argued that children's online safety is not just a moral imperative but an economic one, with research by the charity and Baringa estimating that UK businesses could unlock up to £3 billion in revenue by prioritising children's safety online [19]. The IWF's chair wrote to the Prime Minister in January 2025 urging the government to strengthen, not weaken, online safety regulation [20].

For these organisations, repeal is not a theoretical policy debate — it is a direct threat to children. They point to the Act's specific requirements around risk assessment and proactive content removal as precisely the mechanisms needed to force platforms to act rather than react.

The Free Speech Case Against the Act

The civil liberties case against the Online Safety Act is substantive, and it predates Musk's involvement by years. Organisations including Big Brother Watch, the Electronic Frontier Foundation (EFF), Open Rights Group, and Index on Censorship jointly called on the UK government in December 2025 to reform or repeal the Act [23].

Their concerns centre on several specific provisions:

Clause 122 and encrypted messaging. The Act grants the government power to require platforms to scan encrypted messages for CSAM using "client-side scanning" technology. The government has acknowledged that the necessary technology does not yet exist in a form that preserves privacy, but reserves the right to mandate its use once developed [24]. Privacy advocates argue this amounts to a backdoor into encrypted communications — a position supported by Signal and WhatsApp, which have threatened to withdraw from the UK market rather than comply [24].

Section 179 and "false communications." The Act creates a criminal offence for sending messages containing false information intended to cause "non-trivial psychological or physical harm." Legal scholars at the Constitution Society have argued this provision could "prohibit the public from candidly discussing some of the most pressing policy issues" and has a chilling effect on lawful speech [24]. News organisations are statutorily exempt; individual social media users are not.

Platform categorisation by size, not risk. The Act categorises platforms by user numbers rather than harm potential, meaning smaller but high-risk sites — such as forums promoting suicide or self-harm — may fall into less restrictive tiers [24].

Academic analysis published in the Journal of Media Law has identified a "disconnect between the legislation and the legal and theoretical principles underpinning free speech" under both UK law and Article 10 of the European Convention on Human Rights [25]. The government did remove provisions targeting "legal but harmful" content in November 2022 after parliamentary pressure, but critics argue the remaining framework still gives the state excessive power to define and police online speech [26].

Crucially, many of these critics do not call for outright repeal. The EFF and its allies have called for "reform or repeal" — suggesting that targeted amendments to the most problematic provisions could address civil liberties concerns without abandoning child protection [23].

How the UK Compares to Other Jurisdictions

The Online Safety Act sits alongside two other major platform safety regimes: the EU's Digital Services Act (DSA) and Australia's Online Safety Act, enforced by the eSafety Commissioner.

The UK approach is more prescriptive than the EU's. Where the DSA takes a "layered, risk-based approach" calibrated to platform scale and focuses on notice-and-takedown procedures, the OSA emphasises proactive monitoring and imposes specific duties around illegal content categories [27]. The UK's maximum penalty — 10% of global turnover — exceeds the EU's 6% cap [28].

Australia's regime grants its eSafety Commissioner direct takedown powers and mandatory safety expectations, but with a narrower scope focused on cyberbullying, image-based abuse, and specific harmful content categories [27].

Musk and X have clashed with regulators in all three jurisdictions. The European Commission opened proceedings against X under the DSA in late 2023, and Australia's eSafety Commissioner has pursued enforcement actions against the platform [27]. X has not publicly endorsed any of these regulatory frameworks.

No democratic government has successfully repealed a major platform safety law after enactment. The closest analogue is the US experience with the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) of 2018, which has been widely criticised for unintended consequences — including increased censorship of legal speech and reduced ability for law enforcement to locate trafficking victims — but has not been repealed [29]. The absence of precedent means there is no empirical basis for predicting the effects of repealing the Online Safety Act on harmful content prevalence.

What the Public Actually Thinks

Polling conducted by Opinium in late November and early December 2025 found that 70% of UK respondents supported the Online Safety Act, compared to 12% who opposed it [30]. Support was consistent across all major political parties — including, notably, among Reform UK supporters, who backed the Act by a margin of 60% to 22% [30].

[Chart: UK Public Opinion on the Online Safety Act]

Separate research by More in Common found parental support running even higher: 78% among Labour voters, 76% among Liberal Democrats, 72% among Conservatives, 71% among Greens, and 61% among Reform UK voters [30].

Where the public does express frustration, it is directed at the pace of implementation, not the principle. Forty-nine percent of respondents said the Act was being implemented too slowly, compared to 10% who thought it was too fast. Sixty-two percent were concerned about Ofcom's decision to delay user verification requirements [30].

The 500,000-plus petition signatures calling for repeal — while significant — represent a fraction of the UK electorate and, according to the polling, are "significantly out of step with public opinion" [30]. When a parliamentary debate on the petition took place in December 2025, MPs rejected the call for repeal [31].

The Reform-Not-Repeal Middle Ground

The debate has increasingly coalesced around a third option: neither full repeal nor the status quo, but targeted reform.

Civil liberties groups want Clause 122's encryption provisions removed or substantially constrained. Legal scholars want the false communications offence narrowed to prevent chilling effects on political speech. Platform companies want clearer, more proportionate compliance requirements [23] [24].

Child safety organisations, for their part, want the Act strengthened — not weakened. They point to Ofcom's decision to delay user verification duties and the slow rollout of children's safety codes as evidence that the current framework is, if anything, too permissive [30].

The UK government has signalled it intends to expand the Act's scope to cover AI-generated harms, including requiring AI chatbot providers to comply with illegal content duties — a direct response to the Grok controversy on X [32].

What Is Really at Stake

The campaign to repeal the Online Safety Act brings together genuinely held free speech concerns with the commercial interests of platform companies facing new compliance costs and enforcement risk. That these interests sometimes align does not mean they are identical.

Musk's endorsement of Restore Britain amplifies a movement whose policy agenda extends far beyond tech regulation. Whether his involvement helps or hinders the cause of genuine reform depends on whether the debate remains focused on specific, fixable provisions of the law — or collapses into a binary choice between total repeal and the status quo.

The evidence suggests the British public has already made a more nuanced judgment: they want the law to work, they want it to work faster, and they are not persuaded that scrapping it serves their interests. The challenge for policymakers is to listen to both the civil liberties concerns and the child safety evidence — and to resist the temptation to let the loudest voices on either side define the terms of a debate that affects everyone who uses the internet in the UK.

Sources (32)

[1] Elon Musk endorses Restore Britain on X (x.com)
    Musk posted: 'Join Rupert Lowe in Restore Britain, because he is the only one who will actually do it!'

[2] Restore Britain - Wikipedia (en.wikipedia.org)
    Restore Britain launched 30 June 2025 as a movement, registered as a political party 13 February 2026, led by Rupert Lowe MP, claiming 70,000 members.

[3] Ofcom investigation into X under the Online Safety Act (ofcom.org.uk)
    Ofcom launched a formal investigation into X over Grok AI-generated sexualised imagery and potential CSAM, examining compliance with illegal content and child safety duties.

[4] 2025 UK Online Safety Act: Key Milestones and Future Steps (cms-lawnow.com)
    The Online Safety Act 2023 received Royal Assent on 26 October 2023; 2025 marked the first year of enforcement with illegal content and child safety codes becoming law.

[5] The Online Safety Act Enters Phase 2 (mayerbrown.com)
    Overview of OSA Phase 2 requirements including risk assessments, age verification, and proactive content removal duties for platforms.

[6] Online Safety Act: Ofcom publishes enforcement update (dlapiper.com)
    By October 2025, Ofcom had launched 5 enforcement programmes and opened 21 investigations under the Online Safety Act.

[7] Enforcing the Online Safety Act: Ofcom's £20,000 Fine (preiskel.com)
    Ofcom issued its first OSA financial penalty — £20,000 against 4chan plus £100/day for non-compliance — signalling a broader enforcement approach.

[8] Ofcom and the Online Safety Act: Funding and Contracts (medium.com)
    Ofcom's online safety budget reached £92m in FY 2025/26, up from £71m in FY 2024/25 and £22m in FY 2021/22.

[9] Paying the Price of Online Safety – Ofcom's Fee Regime (natlawreview.com)
    Ofcom fees set at 0.02%-0.03% of qualifying worldwide revenue for companies exceeding the £250m QWR threshold.

[10] Implementation of the Online Safety Act: Fees threshold - GOV.UK (gov.uk)
    Government guidance states companies raising revenue from online services should cover regulation costs, making the regime 'cost neutral to the Exchequer.'

[11] Ofcom launches investigation into X over Grok sexualised imagery (ofcom.org.uk)
    Investigation examines whether X conducted adequate risk assessments and took steps to prevent UK users encountering illegal content including CSAM.

[12] Elon Musk's X Fires Shots at the UK's New Online Safety Act (technologymagazine.com)
    X criticised Ofcom's 'heavy-handed approach' and 'layers of bureaucratic oversight' while being required to age-restrict certain content.

[13] Elon Musk's X warns free speech under threat due to UK's Online Safety Act (gbnews.com)
    X issued a statement saying the Act risks 'seriously infringing' free speech rights while acknowledging the 'laudable' goal of protecting children.

[14] Rupert Lowe Launches Restore Britain Party After Reform UK Split (evrimagaci.org)
    Lowe was suspended from Reform UK in March 2025; CPS dropped allegations in May 2025 for insufficient evidence.

[15] Policies - Restore Britain (restorebritain.org.uk)
    Platform includes net-negative immigration, death penalty referendum, BBC defunding, and a 'Great Repeal Act' by 2029.

[16] Donate - Restore Britain (restorebritain.org.uk)
    Restore Britain describes itself as funded by membership and donations; operates a high-donor Cromwell Club tier.

[17] Rupert Lowe and Restore Britain: What You Need To Know – HOPE not hate (hopenothate.org.uk)
    HOPE not hate has flagged concerns about Restore Britain's funding transparency and far-right positioning in British politics.

[18] Rupert Lowe launches new hard-right political party (leftfootforward.org)
    Journalists and commentators have described Restore Britain as right-wing or hard-right in the context of British politics.

[19] Children's online safety could provide £3 billion to UK businesses - NSPCC (nspcc.org.uk)
    NSPCC and Baringa research estimates UK businesses could unlock £3bn in revenue by prioritising children's online safety.

[20] UK data reveals alarming growth in online child abuse cases (dig.watch)
    IWF acted on 291,270 CSAM webpages in 2024, a 5% increase; AI-generated CSAM hit record highs in 2025.

[21] Online AI-generated child sexual abuse material increased in 2025 (care.org.uk)
    IWF reported over 8,000 cases of AI-generated CSAM in 2025, a record high.

[22] UK data reveals alarming growth in online child abuse cases (dig.watch)
    Nearly 1,900 UK children reported sexual imagery concerns in 2025, a 66% rise, with over 1,100 confirmed abuse cases.

[23] EFF, Open Rights Group, Big Brother Watch, and Index on Censorship Call on UK Government to Reform or Repeal Online Safety Act (eff.org)
    Four civil liberties organisations jointly called for reform or repeal of the Online Safety Act in December 2025.

[24] Online Safety Act: Privacy Threats and Free Speech Risks - The Constitution Society (consoc.org.uk)
    Legal analysis identifying Clause 122 (client-side scanning), Section 179 (false communications), and platform categorisation as key threats to civil liberties.

[25] Tackling online false information in the UK: The Online Safety Act 2023 and its disconnection from free speech law (tandfonline.com)
    Academic analysis identifying disconnect between the Act and legal/theoretical principles underpinning free speech under UK law and ECHR Article 10.

[26] Five things you need to know about the Online Safety Act – Big Brother Watch (bigbrotherwatch.org.uk)
    Government removed 'legal but harmful' content provisions in November 2022 after parliamentary pressure over free speech concerns.

[27] The Differences Between the Online Safety Act & the Digital Services Act (trustlab.com)
    The OSA emphasises proactive monitoring while the DSA takes a layered, risk-based notice-and-takedown approach; UK penalties cap at 10% vs EU's 6%.

[28] Effective enforcement of the OSA and DSA: unpacking compliance regimes (tandfonline.com)
    Comparative academic analysis of UK and EU enforcement approaches to platform safety regulation.

[29] Section 230 and FOSTA: Unintended Consequences of Platform Safety Law (publicknowledge.org)
    FOSTA (2018) led to increased censorship of legal speech and reduced law enforcement ability to locate trafficking victims — a cautionary precedent.

[30] New polling reveals strong public support for the Online Safety Act (onlinesafetyact.net)
    Opinium polling (Nov-Dec 2025): 70% support the OSA vs 12% opposed; 49% say implementation too slow; support spans all parties including Reform UK (60%).

[31] UK MPs debate evolution of Online Safety Act, reject repeal petition (biometricupdate.com)
    MPs debated the 500,000+ signature repeal petition in December 2025 and rejected the call for repeal.

[32] UK seeks more powers to tackle AI harms in Online Safety Act (opendemocracy.net)
    The UK government is moving to update the OSA to require AI chatbot providers to comply with illegal content duties, responding to the Grok controversy.