Chinese-owned social media app TikTok has named a group of external technology and safety experts as the founding members of its content moderation committee, the latest in a series of moves to ease U.S. concerns over its data security and the potential for content to be blocked or deleted at Beijing’s request.
The committee, officially known as the Content Advisory Council, will be tasked with advising on and shaping the app’s content policies related to child safety, hate speech, misinformation, bullying and other potential issues, TikTok said in a statement on Wednesday.
The council will be chaired by Dawn Nunziato, a professor at George Washington University Law School and co-director of the Global Internet Freedom Project. Nunziato specializes in areas of free speech and content regulation, according to the statement.
Other members include Rob Atkinson, a technology policy expert; Hany Farid, who has insights into digital images, video forensics and deep fakes; Dan Schnur, a political communications expert; and Vicki Harrison, a social worker who specializes in child safety issues and holistic youth needs.
The council members will hold their first meeting with TikTok’s U.S. leadership at the end of March to discuss topics around platform integrity, including policies against misinformation and election interference, the company said.
ByteDance-owned TikTok, whose Chinese counterpart is known as Douyin, has recently made concerted efforts to boost transparency and improve its content review mechanisms as some U.S. lawmakers have voiced concerns that the app deletes content at the behest of the Chinese government. TikTok has long denied such allegations, arguing that Beijing has no jurisdiction over content published on its platform outside of China.
Earlier this month, TikTok announced plans to open a content moderation transparency center in its Los Angeles office to show external experts how it reviews content and processes concerns from users and content creators.
Contact reporter Ding Yi (firstname.lastname@example.org)