TikTok, the popular short-video platform owned by ByteDance, reported removing 2.1 million videos posted by Nigerian users in the third quarter of 2024 for violating its content policies.
This placed Nigeria among the top 50 countries where policy-violating content originated during the period. Globally, the platform took down 147.8 million videos as part of its community guidelines enforcement.
TikTok’s Q3 2024 Community Guidelines Enforcement report highlights that the top 50 markets accounted for approximately 90% of all content removals.
Violations spanned categories such as Integrity and Authenticity, Privacy and Security, Mental and Behavioral Health, and Safety and Civility.
“Content moderation is an ongoing effort to maintain a safe environment for all users,” the company stated. “We remain committed to removing harmful or inappropriate content promptly.”
Beyond video removals, TikTok reported removing a staggering 214.8 million accounts during the quarter, with fake accounts leading the tally at 187.3 million.
Additionally, 24.3 million accounts suspected of belonging to users under the age of 13 were deleted, reflecting the platform’s ongoing effort to comply with child protection standards. Another 3.2 million accounts were removed for unspecified reasons.
Engagement metrics were also scrutinized. TikTok removed 1.3 billion comments, 1.1 billion video likes, and 57.2 million fake followers, all linked to inauthentic or automated activity. In addition, 12.2 million live sessions were suspended for guideline violations.
“Protecting the integrity of our platform is a top priority,” the report noted. “We continue to refine our systems to detect and remove accounts, content, and engagement that exploit automated or inauthentic mechanisms.”
TikTok reported a reduction in the number of advertisements removed for policy violations in Q3 2024, with 1.9 million ads taken down compared to 2.2 million in the previous quarter. These removals included actions taken at the account level or directly targeting ad content.
“We continually strengthen our review systems to identify patterns and ensure compliance with our Advertising Policies, Community Guidelines, and Terms of Service,” TikTok stated.
Despite its enforcement efforts, TikTok continues to face scrutiny worldwide over concerns about user safety, especially for younger audiences.
In October 2024, 13 U.S. states and Washington, D.C., filed lawsuits against the company, alleging it exploits children’s vulnerabilities to boost profits.
The lawsuits accuse TikTok of designing its platform to be addictive, prioritizing engagement over user well-being.
The lawsuits seek financial penalties and demand stricter content moderation and greater accountability from the ByteDance-owned platform. Critics argue that TikTok’s design fosters excessive screen time among children and raises significant mental health concerns.
In Nigeria, reactions to TikTok’s enforcement efforts have been mixed. Some users applaud the platform for taking steps to uphold its policies, while others feel the crackdown has disproportionately affected their content.
“I understand the need for moderation, but sometimes it feels like genuine content gets flagged,” says Adeola Ogun, a Nigerian TikTok creator. “It’s important to distinguish between harmful and harmless content.”
Experts in digital policy emphasize the importance of transparency in content moderation. “Platforms like TikTok must strike a balance between enforcement and fairness,” says Dr. Chinedu Adebayo, a tech policy analyst. “Clear communication with users about policy violations is essential.”