TikTok executives know about app’s effect on teens, lawsuit documents allege
For the first time, internal TikTok communications have been made public that show a company unconcerned with the harms the app poses for American teenagers.

In a statement, a TikTok spokesman said: “Unfortunately, this complaint cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.” He continued: “We have robust safeguards, which include proactively removing suspected underage users, and we have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16.”

Kentucky AG: TikTok users can become ‘addicted’ in 35 minutes

As TikTok’s 170 million U.S. users can attest, the platform’s hyper-personalized algorithm can be so engaging it becomes difficult to close the app.

TikTok exec: algorithm could deprive kids of opportunities like ‘looking at someone in the eyes’

Publicly, TikTok has stated that one of its “most important commitments is supporting the safety and well-being of teens.” Yet internal documents paint a very different picture, citing statements from top company executives who appear well aware of the app’s harmful effects but have not taken significant steps to address them.

One employee wrote: “The intensive density of negative content makes me lower down mood and increase my sadness feelings though I am in a high spirit in my recent life.” Another employee said, “there are a lot of videos mentioning suicide,” including one asking, “If you could kill yourself without hurting anybody would you?”

In another document, TikTok’s research found that content promoting eating disorders, often called “thinspiration,” is associated with issues such as body dissatisfaction, disordered eating, low self-esteem and depression.

Despite these warnings, TikTok’s algorithm still puts users into filter bubbles. One internal document states that users are “placed into ‘filter bubbles’ after 30 minutes of use in one sitting.” The company wrote that having more human moderators to label content is possible, but “requires large human efforts.”

TikTok’s content moderation missing self-harm, eating disorder content

TikTok has several layers of content moderation to weed out videos that violate its Community Guidelines.