UK Regulator Opens Investigation Into Telegram Over Child Abuse Material
TL;DR
The UK's communications regulator Ofcom has opened a formal investigation into Telegram over evidence of child sexual abuse material on the platform, testing the full power of the Online Safety Act for the first time against a major encrypted messaging service. The probe, triggered by intelligence from the Canadian Centre for Child Protection, carries penalties of up to 10% of Telegram's global revenue and could ultimately lead to the app being blocked in Britain.
On April 21, 2026, the UK's Office of Communications (Ofcom) announced a formal investigation into Telegram Messenger Inc. under Section 10 of the Online Safety Act 2023. The probe examines whether Telegram has failed to comply with its legal duties to prevent the sharing of child sexual abuse material (CSAM) on the platform — duties that have been enforceable since March 17, 2025. Alongside Telegram, Ofcom opened parallel investigations into two smaller services, Teen Chat and Chat Avenue, over concerns that predators were using their chatrooms to groom children.
The Telegram investigation is the most consequential test yet of the Online Safety Act's enforcement regime against a major encrypted messaging platform. Telegram has over one billion monthly active users worldwide, and the outcome will set precedents for how the UK regulates private communications at scale.
The Evidence and How It Surfaced
Ofcom's decision to open the investigation followed intelligence provided by the Canadian Centre for Child Protection (C3P), which flagged evidence of CSAM being shared on Telegram. Ofcom then conducted its own assessment of the platform before concluding that a formal investigation was warranted.
The regulator has not disclosed the specific volume of CSAM reports it has documented on Telegram. This lack of granularity stands in contrast to the reporting volumes available for other platforms. Meta's Facebook, Instagram, and Threads alone submitted over 5.2 million reports to the U.S. National Center for Missing & Exploited Children (NCMEC) CyberTipline in Q1 2024 and over 2 million in Q2 2025. Across all platforms, NCMEC received 21.3 million reports containing more than 61.8 million files related to suspected child sexual exploitation in 2025.
This reporting disparity does not necessarily reflect the actual prevalence of CSAM on each platform. Meta uses automated hash-matching systems — technology that compares uploaded images against databases of known CSAM — to detect and report material at scale. Telegram, until recently, lacked comparable detection infrastructure for much of its service. A platform that reports fewer cases may simply be detecting fewer, not hosting fewer.
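The hash-matching approach described above can be illustrated with a minimal sketch. Real deployments (such as Microsoft's PhotoDNA or the IWF hash list) use perceptual hashes that survive resizing and re-encoding; the cryptographic SHA-256 hash used here matches only byte-identical files and is chosen purely for simplicity. All names and the database entry are hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known illegal images.
# In production this would be a vendor-supplied perceptual-hash list,
# not SHA-256 digests of placeholder bytes.
KNOWN_HASHES = {
    hashlib.sha256(b"placeholder-known-image-bytes").hexdigest(),
}

def matches_known_database(upload: bytes, known_hashes: set[str] = KNOWN_HASHES) -> bool:
    """Compare an upload's fingerprint against the database of known material.

    Only the hash is compared, never the image itself — which is why this
    style of detection can run server-side on traffic the platform can read.
    """
    return hashlib.sha256(upload).hexdigest() in known_hashes
```

Under this scheme, a platform that never computes fingerprints reports nothing, regardless of how much known material it hosts — the asymmetry the paragraph above describes.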
The Internet Watch Foundation (IWF), the UK's primary CSAM watchdog, actioned 291,273 reports for removal in 2024, assessing a new report every 74 seconds. Of those reports where the sex of children was recorded, 97% depicted girls. The IWF has also flagged a surge in AI-generated CSAM, with 8,029 AI-generated images and videos assessed as showing realistic child sexual abuse in 2025 — including 3,443 AI-generated videos, a 26,385% increase over the 13 recorded in 2024.
What the Law Requires — and What Penalties Apply
The Online Safety Act 2023, which received Royal Assent in October 2023, imposes a system of "illegal content duties" on platforms accessible to UK users. These duties, which came into force on March 17, 2025, require services to assess and mitigate the risk of users encountering priority illegal content — a category that includes CSAM — and to take rapid action to remove such material when it appears.
Ofcom's investigation specifically examines whether Telegram has breached these illegal content duties under Section 10 of the Act. The enforcement process begins with evidence gathering, followed by a provisional decision that gives the company an opportunity to respond before Ofcom issues a final determination.
The financial penalties are substantial. Ofcom can impose fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater. Telegram reported revenue of $1.4 billion in 2024 — its first profitable year, with net profit of $540 million. H1 2025 revenue reached $870 million, a 65% increase year-over-year, with the company projecting $2 billion in total revenue for 2025. A 10% penalty based on 2024 revenue would amount to approximately $140 million — far exceeding the £18 million floor and dwarfing the £1.875 million fine Ofcom levied on TikTok in July 2024 for providing inaccurate data in response to a statutory information request.
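The "whichever is greater" rule can be made concrete with a short worked calculation. The exchange rate below is an illustrative assumption, not a figure from the article or the Act, and the function name is invented for this sketch.

```python
def osa_max_fine(qualifying_revenue_gbp: float) -> float:
    """Ceiling under the Online Safety Act: the greater of the
    18 million pound statutory floor or 10% of qualifying worldwide revenue."""
    FLOOR_GBP = 18_000_000
    return max(FLOOR_GBP, 0.10 * qualifying_revenue_gbp)

# Telegram's 2024 revenue was $1.4bn; converting at an assumed rate of
# $1.27 per pound (illustrative only) gives roughly £1.1bn.
USD_PER_GBP = 1.27
revenue_gbp = 1.4e9 / USD_PER_GBP
print(f"Ceiling: £{osa_max_fine(revenue_gbp):,.0f}")  # roughly £110m, i.e. ~$140m
```

For a small service with, say, £50 million in revenue, 10% is only £5 million, so the £18 million floor would apply instead.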
The Encryption Question
Telegram's architecture is central to this investigation. The platform supports channels with up to 200,000 members and group chats that can host thousands of participants. A common assumption is that end-to-end encryption (E2EE) — where only the sender and recipient can read messages — prevents any form of content detection. But Telegram's actual encryption model is more nuanced.
Standard Telegram messages, including those in groups and channels, are not end-to-end encrypted. They use server-client encryption via Telegram's proprietary MTProto protocol, meaning Telegram's servers can, in principle, access message content. Only "Secret Chats," a feature limited to one-on-one conversations on mobile devices, use E2EE. This distinction matters: the bulk of CSAM sharing alleged by Ofcom likely occurs in groups and channels where Telegram does have technical access to content.
Security researchers have identified cryptographic weaknesses in MTProto, though Telegram has addressed specific vulnerabilities in updated versions. The broader academic consensus, reflected in a growing body of research — over 13,800 papers published on CSAM online detection since 2011, peaking at 2,131 in 2023 — is that hash-matching against databases of known CSAM remains the most reliable detection method. This technique does not require breaking encryption, as it works by comparing fingerprints of known illegal images against uploaded content.
Cryptographers and digital rights organizations have warned that alternative approaches, such as client-side scanning (where detection software runs on users' devices before encryption), risk undermining encryption guarantees and producing false positives. But for Telegram's non-E2EE messages — the majority of its traffic — server-side detection is technically feasible.
The NSPCC has argued that "there should be no part of the service where perpetrators can act without detection". Telegram's position is that its IWF partnership addresses public content, while privacy protections in private messaging serve legitimate users including journalists and activists. Independent security researchers have not published a consensus assessment of whether Telegram's current measures exhaust what is technically possible without compromising encryption in Secret Chats.
After Durov's Arrest: What Changed
This investigation arrives less than two years after Telegram CEO Pavel Durov was arrested at Le Bourget Airport in Paris on August 24, 2024. French authorities indicted him on twelve charges, including complicity in the distribution of child exploitation material and facilitating drug trafficking.
The arrest prompted measurable changes. Within days, Telegram updated its FAQ to explicitly allow users to report private chats to moderators — a feature the company had previously declined to offer. Telegram subsequently removed millions of groups and channels, began sharing user IP addresses and phone numbers in response to court orders, and in December 2024 joined the IWF, giving the charity's hash-matching technology access to detect known CSAM on the platform's public-facing content.
The IWF partnership marked a reversal. The charity had previously stated that Telegram "ignored outreach" from child safety watchdogs prior to Durov's arrest. But critics note the partnership's scope is limited: it covers only public portions of the platform, leaving activity in private groups — where much of the most harmful content circulates — unaddressed.
France lifted all travel restrictions on Durov in November 2025 after he complied with one year of judicial supervision, though the criminal investigation remains ongoing. No regulator, UK or otherwise, has published an independent verification that the post-arrest changes have measurably reduced CSAM on Telegram.
Telegram itself has contested the premise of the Ofcom investigation. In a statement responding to the probe, the company said: "Telegram categorically denies Ofcom's accusations. Since 2018, Telegram has virtually eliminated the public spread of CSAM on its platform through world-class detection algorithms and cooperation with NGOs".
UK Reach and User Base
Approximately 10% of the UK population regularly uses Telegram, a figure that places it well below adoption rates in countries like India (45%), Brazil (38%), and Mexico (34%). With the UK's population at roughly 67 million, that implies around 6.7 million UK users — approximately 0.67% of Telegram's global billion-user base.
Ofcom has not published demographic breakdowns of which UK user groups face the greatest CSAM exposure on Telegram. The IWF's 2024 data shows that 97% of actioned CSAM reports depicted girls, but this figure reflects content across all platforms, not Telegram specifically. Child protection charities including the NSPCC have called for platform-specific transparency on which age groups and demographics encounter harmful content, but such granular data remains unavailable for Telegram.
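The user-base arithmetic above follows directly from the article's figures; a quick calculation confirms it:

```python
uk_population = 67_000_000              # rough UK population cited in the article
uk_adoption_rate = 0.10                 # ~10% of the population uses Telegram
global_monthly_active_users = 1_000_000_000

uk_users = uk_population * uk_adoption_rate
uk_share_of_global = uk_users / global_monthly_active_users

print(f"{uk_users:,.0f} UK users")           # 6,700,000
print(f"{uk_share_of_global:.2%} of global") # 0.67%
```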
The Displacement Question
A recurring argument against aggressive enforcement is that shutting down or restricting a platform simply pushes harmful activity to less visible corners of the internet. After law enforcement takedowns of dark web marketplaces, new sites have reliably emerged. Operation RapTor in May 2025 seized $200 million and arrested 270 suspects, yet marketplaces quickly re-formed.
However, child safety organizations counter that this argument, while directionally accurate for some types of illicit commerce, overstates the ease of migration for CSAM networks. Dark web platforms require technical sophistication that many offenders lack. The IWF's work suggests that removal and disruption, even when imperfect, raises barriers to access and reduces the speed at which material circulates. Academic research on CSAM detection has expanded significantly over the past decade, with publications rising from 207 in 2011 to over 2,000 annually by 2023, reflecting sustained investment in countermeasures.
The empirical evidence is mixed. No peer-reviewed study has conclusively demonstrated that enforcement against mainstream platforms causes a proportional increase in dark web CSAM activity. But neither has any study shown that platform-level enforcement alone eliminates the underlying market. The most effective approaches documented in the literature combine platform enforcement with law enforcement operations targeting producers and distributors.
How Long Investigations Take — and What Else Ofcom Can Do
Ofcom's track record provides some guidance on timelines. Its investigation into TikTok over a statutory information request opened in December 2023 and resulted in a final decision with a £1.875 million fine in July 2024 — approximately seven months. But that case involved a relatively straightforward compliance failure. The Telegram investigation, which requires Ofcom to assess the adequacy of a platform's entire approach to illegal content, is likely to be more complex.
Since the Online Safety Act's illegal content duties took effect in March 2025, Ofcom has opened investigations into nearly 100 services and issued nearly a dozen fines. In January 2026, the regulator opened an investigation into X (formerly Twitter) after reports that its Grok AI chatbot was generating sexually explicit images of children. In March 2026, Ofcom wrote directly to six of the largest platforms — Facebook, Instagram, Roblox, Snapchat, TikTok, and YouTube — demanding evidence of child safety improvements by April 30, 2026.
The Online Safety Act also grants Ofcom a toolkit that extends well beyond financial penalties. If direct enforcement fails, the regulator can apply to a court for "service restriction orders" — legal instruments that compel third parties to act against a non-compliant platform. These orders can require internet service providers to block user access, direct app stores to remove the application, prevent domain name registrars from resolving the service's address, and bar payment processors from facilitating transactions with the platform.
The legal standard is high: Ofcom must demonstrate that the platform has failed to comply with its duties, that direct enforcement has proven ineffective, and that the proposed blocking action is proportionate and consistent with human rights obligations. But the powers exist, and they represent a credible threat of total UK market exclusion.
How UK Powers Compare to the EU
The EU's Digital Services Act (DSA), which took full effect in February 2024, offers a parallel but structurally different enforcement model. The DSA imposes fines of up to 6% of global annual turnover — compared to the UK's 10%. For repeated violations, the European Commission can impose periodic penalty payments of up to 5% of average daily worldwide turnover.
The DSA's strictest obligations apply to Very Large Online Platforms (VLOPs) — those with more than 45 million monthly active EU users. Telegram has not been designated as a VLOP, likely because its EU user base falls below this threshold. This means Telegram faces national-level enforcement from individual EU member states' Digital Services Coordinators rather than direct EU Commission oversight.
Both regimes allow for platform blocking as a last resort. Under the DSA, Digital Services Coordinators can request temporary service restrictions when infringements persist and cause serious harm, including criminal offences involving threats to persons' life or safety. The UK's service restriction orders operate through court applications and can target ISPs, app stores, and payment providers.
A key difference is institutional: the UK channels all enforcement through a single regulator (Ofcom), while the EU distributes responsibility across 27 national coordinators plus the Commission for VLOPs. Neither regime currently provides for criminal referrals of platform executives as a standard enforcement tool, though individual member states — as France demonstrated with Durov's arrest — can pursue criminal prosecution under domestic law.
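For a rough sense of how the two fine ceilings diverge, the percentages can be compared for the same hypothetical turnover. This sketch ignores currency differences and uses invented function names; the percentages and the £18 million floor come from the article.

```python
def uk_osa_ceiling(annual_turnover: float) -> float:
    # UK Online Safety Act: greater of £18m or 10% of qualifying worldwide revenue.
    return max(18_000_000, 0.10 * annual_turnover)

def eu_dsa_ceiling(annual_turnover: float) -> float:
    # EU DSA: up to 6% of global annual turnover (no £18m-style floor).
    return 0.06 * annual_turnover

def eu_dsa_periodic_daily(avg_daily_turnover: float) -> float:
    # DSA periodic penalty for repeated violations: up to 5% of average
    # daily worldwide turnover, per day.
    return 0.05 * avg_daily_turnover
```

For a platform with £1 billion in annual turnover, the UK ceiling would be £100 million against the DSA's £60 million; for small platforms, the UK's £18 million floor can exceed 6% of turnover outright.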
Competing Perspectives
The investigation crystallizes a tension between two positions, each backed by legitimate concerns.
The case for enforcement: Child safety organizations, including the NSPCC and IWF, argue that Telegram has had years of warnings and has moved only under extreme pressure — notably Durov's arrest. The platform's architecture allows groups of up to 200,000 members to share content with minimal moderation. The Online Safety Act was designed precisely for this scenario: platforms that profit from user engagement while failing to prevent the most serious categories of illegal content. The IWF's record of 291,273 actioned reports in 2024 alone demonstrates the scale of the problem across the internet, and Telegram's late adoption of basic detection tools like IWF hash-matching suggests it was not treating child safety as a priority.
The case for caution: Telegram and digital rights advocates argue that aggressive regulation of messaging platforms risks undermining privacy protections relied upon by journalists, dissidents, and human rights workers worldwide. Telegram's statement that it has "virtually eliminated the public spread of CSAM" since 2018 suggests the company views the remaining problem as concentrated in private spaces where detection inherently conflicts with user privacy. The platform's post-arrest reforms — joining the IWF, enabling private chat reporting, sharing data with courts — represent genuine steps. Overly punitive action could drive users to platforms with no moderation at all, or to dark web services where law enforcement has even less visibility.
What Comes Next
Ofcom's investigation is in its early stages. The regulator will gather and analyze evidence before issuing a provisional decision, to which Telegram will have the opportunity to respond. If Ofcom finds a compliance failure, it can require specific remedial steps, impose fines, or escalate to court-ordered business disruption measures.
The investigation's scope and outcome will be shaped by questions that remain open: how much CSAM is present on Telegram's non-encrypted channels, whether Telegram's post-arrest reforms have been effective, and whether the platform can do more within the technical constraints of its architecture. The answers will matter not only for Telegram but for every messaging service operating in the UK under the Online Safety Act's expanding enforcement regime.
Related Stories
UK PM Starmer Warns Tech Executives That Current Online Safety Approach Is Unsustainable
Elon Musk Backs UK Campaign to Repeal Online Safety Act
UK Government Summons Social Media Executives to Discuss Children's Online Safety
Meta Ends End-to-End Encryption in Instagram Direct Messages
Greece Announces Social Media Ban for Children Under 15
Sources (25)
- [1] Ofcom investigates Telegram and teen chat sites (ofcom.org.uk)
Ofcom has opened formal investigations into Telegram, Teen Chat, and Chat Avenue over concerns they are failing to prevent the spread of CSAM and protect minors from online grooming.
- [2] Online Safety Act: Illegal content duties are now in force (cms-lawnow.com)
Platforms must start tackling illegal material from 17 March 2025 under the Online Safety Act's illegal content duties.
- [3] Telegram Users Statistics 2026 (demandsage.com)
Telegram crossed 1 billion monthly active users in March 2025. Approximately 10% of the UK population regularly uses Telegram.
- [4] After X and Grok, Ofcom opens child safety investigation into Telegram (thenextweb.com)
Ofcom has opened investigations into nearly 100 services since the Online Safety Act came into force, issued nearly a dozen fines, and in March 2026 wrote to six major platforms demanding child safety improvements.
- [5] Meta Integrity Reports, First Quarter 2024 (transparency.meta.com)
In Q1 2024, Facebook and Instagram sent over 5.2 million NCMEC CyberTip reports for child sexual exploitation.
- [6] The Work Never Stops: A First Look at NCMEC's 2025 Data (missingkids.org)
In 2025, the CyberTipline received 21.3 million reports that included more than 61.8 million images, videos and other files related to suspected child sexual exploitation.
- [7] IWF Annual Data and Insights Report 2024 (iwf.org.uk)
In 2024, IWF analysts assessed 424,047 reports and actioned 291,273 for removal. 97% of reports where sex was recorded showed the sexual abuse of girls.
- [8] AI CSAM Report 2026: Harm Without Limits (iwf.org.uk)
In 2025, the IWF identified 3,443 AI-generated child sexual abuse videos, a 26,385% increase compared to 2024.
- [9] Online Safety Act in force: platforms must start tackling illegal material from 17 March 2025 (scl.org)
Online platforms must implement measures to remove illegal content quickly and reduce the risk of priority criminal content from appearing.
- [10] UK probes Telegram and other chat apps over child safety failures (cyberinsider.com)
Ofcom can impose fines of up to £18 million or 10% of qualifying worldwide revenue. The enforcement process begins with evidence gathering, followed by provisional and final decisions.
- [11] Telegram Tracks for First Profitable Year With $1 Billion Revenue (finance.yahoo.com)
Telegram achieved profitability for the first time in 2024, reporting net profit of $540 million on $1.4 billion revenue.
- [12] Telegram Revenue Surges 65% to $870M in H1 2025 (finance.yahoo.com)
Telegram earned revenue of $870 million during H1 2025, a 65% increase year-over-year, with projections of $2 billion for full-year 2025.
- [13] TikTok fined £1.875m for providing inaccurate data on safety controls (ofcom.org.uk)
Ofcom issued a final decision to TikTok on 23 July 2024 imposing a penalty of £1,875,000 after the company provided inaccurate data.
- [14] Do messaging apps with end-to-end encryption reduce CSAM detection? (factually.co)
Standard Telegram messages are not end-to-end encrypted. Only Secret Chats use E2EE. Hash-matching remains the most reliable CSAM detection method.
- [15] OpenAlex: Research Publications on CSAM Online Detection (openalex.org)
Over 13,800 academic papers published on child sexual abuse material online detection since 2011, peaking at 2,131 papers in 2023.
- [16] Telegram Child Safety Measures: Platform Partners with Internet Watch Foundation (survivorsrights.com)
The NSPCC argued there should be no part of the service where perpetrators can act without detection.
- [17] Arrest and indictment of Pavel Durov (en.wikipedia.org)
Pavel Durov was arrested on August 24, 2024 and indicted on twelve charges including complicity in the distribution of child exploitation material.
- [18] Telegram U-turns and joins child safety scheme (iwf.org.uk)
In December 2024, Telegram joined the Internet Watch Foundation for hash-matching CSAM detection on public content.
- [19] Telegram has now teamed up with UK child safety group (siliconrepublic.com)
Telegram partnered with the IWF, but the partnership covers only public portions of the platform, leaving private groups unaddressed.
- [20] Telegram ignored outreach from child safety watchdogs before CEO's arrest (iwf.org.uk)
The IWF stated that Telegram ignored outreach from child safety watchdogs before Durov's arrest in August 2024.
- [21] France fully lifts travel ban on Telegram founder Durov (france24.com)
France lifted all travel restrictions on Durov on November 13, 2025. The criminal investigation remains ongoing.
- [22] Dark Web Statistics 2025 (deepstrike.io)
Operation RapTor in May 2025 seized $200 million and arrested 270 suspects, yet dark web marketplaces quickly re-emerge.
- [23] Ofcom's enforcement powers under the Online Safety Act (onlinesafetyact.net)
Ofcom can apply for service restriction orders compelling ISPs to block access, app stores to remove apps, and payment providers to withdraw services.
- [24] Enforcement under the Online Safety Act (bristows.com)
Courts can issue service restriction orders against ISPs, domain registrars, DNS providers, search engines, and app stores.
- [25] Enforcement and Penalties under the EU Digital Services Act (edaa.eu)
DSA fines of up to 6% of global annual turnover. Telegram is not designated as a VLOP under the DSA.