
Content Moderators Employed in the UK Face Uncertainty Due to Shift in TikTok's AI Moderation Strategy

TikTok's UK trust and safety roles may shrink as AI moderation expands, heightening concerns about user safety under the latest requirements of the Online Safety Act.

TikTok Alters AI System, Potentially Endangering UK Content Moderators


The UK's Information Commissioner's Office has launched a "major investigation" into TikTok's data practices, as the popular social media platform announces significant job cuts from its UK trust and safety division.

The Communication Workers Union (CWU) has criticized the job cuts, accusing TikTok of prioritizing corporate greed over worker and public safety. Union representatives argue that human moderators are uniquely equipped to catch subtleties that algorithms may miss.

The CWU also notes the troubling timing of the announcement, which came as TikTok workers were preparing to vote on union recognition. The cuts put several hundred UK jobs at risk and follow the layoff of 150 content moderators in Berlin.

TikTok's restructuring aligns with a broader industry trend in which tech giants have been shrinking their human moderation teams in favor of automated systems. More than 85% of harmful or policy-violating content on TikTok is already removed automatically by AI.

However, the greater question is whether automation alone can meet the rising bar of safety, cultural sensitivity, and accountability demanded by regulators and users alike. The UK's Online Safety Act, enforced from July 2025, requires platforms to implement robust age checks and actively remove harmful material.

The UK's Online Safety Act subjects TikTok's data practices, algorithms, and moderation to regulatory oversight to ensure compliance with its safety standards. Against this backdrop, TikTok has reduced its London moderation team and shifted moderation work to regional hubs and automated systems.

Failure to demonstrate compliance with the Online Safety Act could expose TikTok to financial penalties and reputational damage; breaches of the Act carry fines of up to 10% of global turnover.

The job cuts are part of a global restructuring that has affected TikTok operations in Berlin, the Netherlands, and Malaysia. Geopolitical scrutiny of TikTok magnifies concerns about its Chinese parent company, ByteDance, particularly regarding data governance and content manipulation.

Any misstep in moderation risks feeding a larger narrative about whether TikTok can, or cannot, safeguard democratic and social norms. Workers are concerned that users, especially minors, will face greater risks if AI becomes the first and last line of defense.

The UK's Office of Communications (Ofcom) will play a key role in regulating TikTok's compliance with the Online Safety Act. The ongoing investigation into TikTok's data practices will undoubtedly shed light on the platform's commitment to user safety and privacy.