
IMDA issues Letters of Caution to X and TikTok for CSEM and terrorism

The Edge Singapore • 3 min read
IMDA has issued Letters of Caution to X and TikTok for serious weaknesses in their measures to proactively detect and remove child sexual exploitation and abuse material and terrorism respectively.

The Infocomm Media Development Authority (IMDA) has issued Letters of Caution to X and TikTok for serious weaknesses in their measures to proactively detect and remove child sexual exploitation and abuse material (CSEM) and terrorism content respectively. They have also been placed under enhanced supervision.

IMDA found a 120% increase in cases of CSEM on X originating from or targeting Singapore users, up from 33 cases in 2024 to 73 cases in 2025.

On TikTok, IMDA found 17 cases of terrorism content shared by Singapore-based accounts for the first time in 2025.

CSEM and terrorism content are egregious harms, and the Code of Practice for Online Safety – Social Media Services requires designated Social Media Services (DSMSs) to proactively detect and swiftly remove CSEM and terrorism content, through the use of technologies and processes, before users encounter such content, IMDA says.

As part of its assessment for the report, IMDA conducted tests for CSEM and terrorism content to obtain an indicative sample of such content across the DSMSs. The 73 CSEM cases detected on X all had a Singapore nexus, and involved content sharing or linking to CSEM, as well as self-generated CSEM. This occurred despite IMDA sharing its analysis of the CSEM cases and their indicators with X in 2024. All 73 cases also violated X’s own policies against CSEM and were only removed by X when IMDA flagged the cases to X.

IMDA detected 17 cases of terrorism content shared by Singapore-based accounts on TikTok. These cases primarily comprised videos with edited footage or audio related to known transnational terrorist organisations. In addition, when some of these cases were reported to TikTok via its in-app user reporting mechanism, TikTok found that the content did not violate its community guidelines, demonstrating that TikTok did not accurately assess terrorism content when it was reported by users. TikTok removed the content only after IMDA flagged the cases to it.


Both X and TikTok have accepted IMDA’s findings and committed to putting in place specific measures to rectify these serious weaknesses, in particular enhancing their automated detection systems through the use of AI and the incorporation of additional signals to improve their proactive detection of CSEM and terrorism content respectively.

To ensure accountability for the effective implementation of these rectification measures, IMDA has issued Letters of Caution to X and TikTok, placing both services under enhanced supervision.

Both X and TikTok are to provide regular updates to IMDA on their progress in implementing the rectification measures they have committed to, until IMDA is satisfied that the issues are adequately resolved. Both companies must also provide supporting data and information to IMDA in their next annual online safety report, due on June 30, to demonstrate the effectiveness of their implementation of the rectification measures.

IMDA has warned that should X or TikTok fail to demonstrate that they have improved the effectiveness of their measures to address the specific types of CSEM and terrorism content that IMDA has detected, it will not hesitate to explore further options, including potential regulatory action under the Broadcasting Act.
