ByteDance said to part ways with key leader for Douyin content

Tech in Asia·2025-06-09 20:00

Chinese tech giant ByteDance has reportedly parted ways with Li Tong, a senior executive overseeing content moderation and data labeling for its China-focused apps.

According to media reports citing multiple sources, Li is no longer listed in the company’s internal employee system.

Li led the Content Quality and Data Service (CQC) team, established in 2017 under Douyin Group.

The CQC team is responsible for monitoring content quality and user experience across more than ten products, including Douyin, the Chinese version of TikTok, and the news aggregator Jinri Toutiao.

Additionally, Li served as deputy editor-in-chief for Toutiao, as reported by the Chinese media outlet The Paper.

ByteDance has not issued a statement regarding Li’s departure.

In China, platform operators like ByteDance are legally required to censor content deemed illegal.

The CQC team plays a vital role in ensuring compliance with these regulations.


🔗 Source: South China Morning Post

🧠 Food for thought

1️⃣ Content moderation executives hold critical compliance roles in China’s tech ecosystem

Li Tong’s departure highlights the crucial role senior content moderation executives play in China’s tightly regulated tech industry.

With Chinese platform operators legally responsible for censoring “illegal” content, these executives occupy high-stakes positions that directly impact regulatory compliance.

The Chinese government has spent approximately $6.6 billion since 2018 on monitoring and controlling online content, creating an environment where content moderation failures can trigger severe legal consequences for companies 1.

ByteDance’s Content Quality and Data Service team was specifically established to handle content that “cannot be spotted by machine,” reflecting the sophisticated human judgment required to navigate China’s complex censorship requirements.

This human-led moderation approach is necessary in a country consistently ranked by Freedom House as “Not Free” in terms of internet freedom, with some of the world’s most extensive censorship infrastructure 2.

2️⃣ Scale of content moderation creates immense operational challenges

ByteDance faces extraordinary content moderation challenges given the massive scale of its platforms, with Toutiao alone reporting 700 million users by 2018 3.

The company’s CQC team must monitor sensitive content across more than ten products simultaneously, requiring significant human and technological resources.

China’s regulatory environment demands comprehensive content screening systems that can identify and remove sensitive information about political issues, COVID-19 discussions, and other topics deemed problematic by authorities 2.

This operational burden creates a tension between business growth and compliance requirements, as platforms must invest heavily in content moderation systems while maintaining user engagement.

Recent ByteDance developments

……

Read full article on Tech in Asia
