According to data presented by the Atlas VPN team based on TikTok’s Community Guidelines Enforcement Report, TikTok removed 106,476,032 videos for guideline violations in Q2 2023.
The platform also removed a total of 107,917,818 accounts in Q2 2023. Notably, most of the removed accounts were suspected to belong to users under 13, which is below the minimum age required to create an account on the platform.
The rise in removals comes amid concerns over TikTok’s ability to protect its users from harmful content and exploitation. Ireland’s Data Protection Commission recently found that in the latter half of 2020, TikTok’s default settings did not do enough to protect children’s accounts, resulting in a €345 million fine.
The results of Q2 2023 show a noticeable 17% uptick from the previous quarter (91,003,510 videos removed) and a 24% increase compared to Q4 2022 (85,860,819 videos removed).
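As a quick check, those growth rates follow directly from the removal counts quoted above: (106,476,032 − 91,003,510) / 91,003,510 ≈ 17%, and (106,476,032 − 85,860,819) / 85,860,819 ≈ 24%.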
The increase in removals could be connected to several revisions of TikTok’s Community Guidelines since April 2023, which followed discussions about banning the platform in the United States over national security concerns. Subsequent updates to the policy were released in May and August.
Mature themes — biggest offender
Of the nearly 107 million videos removed in Q2 2023, 39.1% contained sensitive and mature themes, such as nudity, body exposure, or graphic imagery. Thankfully, moderators deleted around 83.1% of these videos before they received a single view.
Regulated goods and commercial activities were the second-largest deletion category, comprising 28% of all removals. This category covers everything from the consumption and promotion of drugs, alcohol, and tobacco to scams and fraud.
Safety and civility violations, such as bullying, hate speech, and youth exploitation, round out the top three, accounting for 14.5% of all cases. They are closely followed by the mental and behavioral health category, which was the main reason for removal 10.1% of the time.
Privacy and security violations were slightly less common, with content featuring personal information warranting removal in only 7.1% of all cases. The remaining 1.2% fell under integrity and authenticity violations, such as spreading misinformation or posting paid political content.
While TikTok has taken steps to address safety concerns, ensuring a safe environment for its vast user base will require more substantial and consistent effort. The ongoing issues with content removal and user protection suggest the platform may need to invest further in proactive moderation measures.