TikTok took down more than half a million videos in Kenya between April and June 2025 as part of efforts to enforce its Community Guidelines and maintain a safe digital environment for users, the company has revealed in its latest transparency report.
According to the TikTok Community Guidelines Enforcement Report for Quarter 2 (April–June 2025), 592,037 videos in Kenya were removed for violating content policies, most of them before they could reach audiences.
Of these takedowns, 92.9% were removed before they received any views, while 96.3% were taken down within 24 hours, showing an increasingly proactive moderation approach by the platform.
Globally, TikTok removed over 189 million videos during the same period, representing only 0.7% of total uploads.
A digital pop-art collage of a creator holding a smartphone and a mini tripod, symbolising the modern hustle of TikTok content creation.
The report notes that 99.1% of global removals were detected proactively, and 94.4% happened within 24 hours.
In addition, 163.9 million videos were automatically flagged and removed by TikTok’s AI systems.
The platform also removed 76.9 million fake accounts globally and an extra 25.9 million accounts suspected to belong to users under 13, reinforcing its efforts to combat inauthentic behaviour and protect young users.
Moderation boosted by AI and human reviewers
TikTok says its focus on combining AI tools with human reviewers continues to improve safety and reduce harmful content exposure.
The company reports that more than 90% of violative content is now removed before receiving a single view, and that moderators saw a 76% drop in exposure to graphic content over the past year, thanks to improved automated detection systems.
The company emphasised that thousands of trust and safety professionals work alongside its technology to uphold community standards and protect users from threats such as hate speech, misinformation, and harmful behaviour.
Crackdown on LIVE violations
For the first time, TikTok also shared enforcement data related to LIVE streams.
The platform issued warnings to, and suspended monetisation for, 2.32 million LIVE sessions and 1.04 million LIVE creators that violated its LIVE monetisation guidelines.
“These guidelines help reward creators who stream safe, authentic, and high-quality content,” TikTok said, adding that most actions begin with warnings to educate creators on compliance.
Encouraging user reporting
Beyond automated systems, TikTok urged users to continue flagging harmful content through in-app reporting tools as part of a shared responsibility for online safety.
Continued focus on safety
The company maintains that safety remains at the core of its content policies and enforcement approach, with ongoing improvements to ensure a positive experience on the platform.
“We prioritise safety, well-being, and integrity so that our community can feel free to create, make connections, and be entertained,” TikTok said in the report.
Kenya has seen a surge in TikTok usage, especially among young people, prompting national discussions around safety, misinformation and parental oversight.
The platform’s new numbers show content moderation remains a major challenge, but also that enforcement is becoming faster and more proactive.

