Ofcom Opens Formal Investigation into Telegram Over Child Sexual Abuse Concerns
April 21, 2026 - 9:50 am
The UK’s online safety regulator, Ofcom, has opened a formal investigation into Telegram under the Online Safety Act 2023. The inquiry focuses on whether the messaging platform has fulfilled its legal obligations to protect UK users from child sexual abuse material (CSAM). The move marks a significant step in Ofcom’s enforcement of the Act against major messaging services.
The investigation follows Ofcom’s standard procedure under the Online Safety Act, which requires user-to-user and search services to assess and mitigate the risks posed by illegal content, including CSAM, and to remove such content promptly.
Ofcom can impose substantial fines for non-compliance, of up to £18 million or 10% of a company’s qualifying worldwide revenue, whichever is greater. In cases of persistent non-compliance, it can also apply to the courts for business disruption measures, which could ultimately see the platform blocked in the UK.
The opening of a formal investigation does not imply a finding of wrongdoing; Ofcom first gathers and analyses evidence before reaching any conclusion. If the regulator issues a provisional decision, the company then has the opportunity to respond in full before any final determination is made. The process typically takes several months.
Telegram’s relationship with UK regulators has shifted in recent years. In December 2024, the platform partnered with the Internet Watch Foundation (IWF), committing to deploy the IWF’s detection tools across public areas of the service. Despite these efforts, Ofcom believes there are reasonable grounds to investigate whether Telegram is complying with its CSAM-related duties under the Act.
The new investigation stands in contrast to Ofcom’s 2026 annual review, which acknowledged improvements by platforms including Telegram, X, Discord, and Reddit in the age controls they introduced in response to the Online Safety Act.