TikTok removed more than 25.4 million videos in Pakistan between April and June 2025 for violating its Community Guidelines, according to the platform’s Q2 2025 Community Guidelines Enforcement Report.
The short-form video app said 99.7% of removed content was identified proactively and 96.2% was removed within 24 hours of being posted.
Globally, TikTok removed 189.5 million videos during the quarter, representing about 0.7% of all uploads. Of those, 163.9 million were removed using automated detection tools, while 7.4 million were later reinstated after additional review.
The company also removed 76.9 million fake accounts and 25.9 million accounts suspected of belonging to users under 13 years of age.
According to the report, 30.6% of the removed videos contained sensitive or adult themes, 14% violated safety and civility standards, and 6.1% violated privacy and security policies. Additionally, 45% of content was flagged for misinformation, while 23.8% included edited or AI-generated media.
TikTok said the quarterly report underscores its continued efforts to ensure a safe digital environment and maintain transparency. “The regular publication of compliance reports reflects our commitment to transparency and community safety,” the company said.
Similarly, during the first quarter of 2025, TikTok removed nearly 25 million videos in Pakistan, according to its Q1 2025 Community Guidelines Enforcement Report, which covers activity from January to March.
According to the report, a total of 24,954,128 videos were removed in Pakistan for violating the platform’s community guidelines. The proactive removal rate in the country remained exceptionally high at 99.4%, with 95.8% of flagged videos removed within 24 hours of being posted.
The report further revealed that 30.1% of all videos removed worldwide contained sensitive or adult themes, making it the most common reason for enforcement.
Other removals were attributed to breaches of privacy and security guidelines (15.6%), safety and civility standards (11.5%), misinformation (45.5%), and the use of edited or AI-generated media (13.8%).
TikTok said its quarterly compliance reports are part of its ongoing commitment to transparency and accountability. The company noted that the reports are designed to help users, regulators and the general public better understand how content moderation is carried out at scale and what types of violations are most frequently addressed.