Chinese Firm Uses Employee Data to Build AI Worker, Stoking Job Security Debate

A gaming and media company in East China’s Shandong province has sparked heated discussions about skill distillation after using a former employee’s chat logs, work documents and decision-making habits to train an artificial intelligence avatar to do his job.
Skill distillation is a specialized machine learning method that transfers specific functional behaviors, decision-making procedures or structured reasoning strategies from a large, complex “teacher” model into a smaller, more efficient “student” model.
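The teacher-to-student transfer described above is typically trained with a loss that pushes the student's output distribution toward the teacher's temperature-softened "soft targets." The following is a minimal, self-contained sketch of that objective; the function names, logit values, and temperature are illustrative assumptions, not details from the report:

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits to probabilities, optionally softened by a temperature.
    Higher temperatures flatten the distribution, exposing the teacher's
    relative preferences among non-top choices."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between the teacher's and student's softened
    distributions -- the core training signal in knowledge distillation."""
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# A student whose logits track the teacher's incurs a lower loss
# than one whose preferences diverge.
teacher = [3.0, 1.0, 0.2]
close_student = [2.8, 1.1, 0.3]
far_student = [0.1, 2.5, 1.0]
assert distillation_loss(teacher, close_student) < distillation_loss(teacher, far_student)
```

In practice this loss is minimized over many training examples, so the smaller student gradually reproduces the teacher's decision behavior rather than just its final answers.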
- DIGEST HUB
- A Shandong gaming company trained an AI avatar on an ex-employee's chat logs, documents, and work habits via skill distillation, sparking controversy.
- The trend mirrors tools such as GitHub's colleague.skill; AI outperforms RPA through contextual reasoning, automating reports and decisions.
- The case raises privacy and data-consent concerns, along with job displacement risks across clerical, managerial, and executive roles.
- **53AI** founder Yang Fangxian told Caixin that AI-trained skills surpass RPA by evaluating context, such as identifying a meeting's type to choose the right template, and by reasoning like humans. They can generate memos informed by business knowledge, spot issues, and propose solutions, shifting human value toward complex decision-making.
- A Goldman Sachs report indicated that entry-level clerical and administrative roles are most susceptible to AI automation.
- A McKinsey report argued that AI is flattening organizational structures by taking over communication and oversight functions of middle-level managers.
- You Yunting, a senior partner at Shanghai-based DeBund Law Offices, highlighted the privacy risks of skill distillation. He noted that while companies can claim ownership of work products, they cannot use personal chat logs, emails, or behavioral data for AI training without consent, as these are protected by portrait, voice, and personal-data rights.





