Chinese-owned social media platform TikTok removed more than 450,000 videos in Kenya between January and March 2025 for violating its Community Guidelines. It also banned more than 43,000 local accounts over the same period.
TikTok’s takedown figures show the scale of harmful or non-compliant content on the app, and how aggressively the platform is trying to control it. With millions of Kenyans using TikTok daily, the way it moderates content has real implications for public discourse, misinformation, online harm, and mental health.
Of the flagged videos, 92.1% were removed before they received any views, and 94.3% were taken down within 24 hours of being posted. Globally, TikTok says it now detects 99% of harmful content proactively, crediting a combination of automated systems and human review.
LIVE streams remain a key focus: more than 19 million LIVE sessions were shut down globally in Q1 2025, a 50% increase. TikTok says its moderation accuracy has improved even as appeal rates have held steady.
In Kenya, TikTok is also stepping up mental health efforts. It now offers in-app access to Childline Kenya for users who report content related to suicide, self-harm, hate, or harassment. A partnership with Mental360, launched in June, brings local, evidence-based mental health content to the platform.
TikTok also named Dr. Claire Kinuthia as one of its African Mental Health Ambassadors to help improve access to credible information.
All of this comes as Kenyan authorities and civil society increase scrutiny of TikTok’s influence, especially among young users.
