TikTok has taken tough action against violators of its terms of service in Kenya, deleting over 450,000 videos and banning 43,000 accounts in the first quarter of 2025. This is part of TikTok’s ongoing efforts to uphold its community guidelines and keep the online space safe for its users.
In its first-ever content moderation compliance report, TikTok revealed that these videos and accounts were removed for various reasons, all under its community guidelines. These guidelines cover a range of issues including harmful content, harassment, nudity, misinformation and concerns about the safety of minors.
What Kind of Content Was Removed?
TikTok’s guidelines are designed to ensure the platform is a safe and enjoyable space for all users. In Kenya, the deleted videos were flagged for violating some of these specific rules:
Harmful content that promotes violence or encourages dangerous behavior
Harassment and bullying, including content that targets individuals or groups
Nudity and sexual content that violates the platform’s standards for appropriate material
Misinformation that could mislead users or spread false narratives
Safety concerns related to minors, as TikTok requires users to be at least 13 years old to use the platform
Fast Moderation: Videos Removed Before Being Viewed
One of the key findings in TikTok’s report is how quickly the platform removes content that violates its rules. In Kenya, 92.1% of the flagged videos were taken down before they received a single view.
This proactive approach is part of TikTok’s effort to stop harmful content from spreading before it gains traction. In addition, 94.3% of violating content was removed within 24 hours of being uploaded, reflecting the company’s commitment to addressing problematic content quickly and keeping the community safe.
Account Bans and Automated Moderation
Alongside the video removals, TikTok banned over 43,000 accounts in Kenya in the first quarter of 2025 for violating its rules, signaling that the company enforces its policies against both individual videos and the accounts behind them. TikTok’s ability to detect and act on violations is powered by automated moderation tools, which have helped the platform achieve a 99% detection rate globally. These tools analyze millions of pieces of user-generated content to quickly identify videos or accounts that breach community standards, allowing TikTok to enforce its rules more effectively.
TikTok’s Global Content Moderation Efforts
The numbers from Kenya mirror a global trend, and TikTok’s report reflects the company’s broader investment in content moderation and community management. The platform continues to refine both its automated tools and its human review processes to detect more violations and remove inappropriate content before it spreads.
As the platform grows, TikTok’s focus on content moderation and community safety will create a healthier digital space for all users.