Today, representatives from TikTok, Snapchat, and YouTube testified before the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security about how to keep children safe online. The hearing comes in the wake of Facebook whistleblower Frances Haugen's document disclosures to the Wall Street Journal, which revealed, among other things, that Facebook was aware Instagram is harmful to teenage girls.
According to Facebook's own research, 32 percent of teen girls reported that when they felt bad about their bodies, Instagram made them feel worse. But as the Senate works to hold Facebook accountable for its effect on underage girls, senators recognize that the problem is bigger than Mark Zuckerberg. Senators shared accounts from constituents whose teens, while using these platforms, have suffered from eating disorders such as anorexia and bulimia, even though all of the companies testifying today have policies prohibiting content that promotes eating disorders.
In his opening remarks, Senator Blumenthal (D-CT), the subcommittee's chair, said his office had created a YouTube account posing as a teenager and watched a few videos about eating disorders and extreme diets. "They were simple to discover," he said. According to Blumenthal, the account was subsequently fed eating disorder-related content in its recommendations. "It's impossible to get out of this rabbit hole."
TikTok was also a source of concern for Blumenthal's team. The Wall Street Journal ran a similar experiment on that platform, creating 31 bot accounts registered as users between the ages of 13 and 15. Despite TikTok's prohibition on content promoting eating disorders, the accounts were nonetheless served multiple such videos, according to the paper.
Senator Amy Klobuchar (D-MN) questioned TikTok's head of Public Policy for the Americas, Michael Beckerman, asking whether TikTok had stopped pushing content to minors that glorifies eating disorders, drugs, and violence. Beckerman said that while he disagrees with the Wall Street Journal's methodology — the accounts were bots instructed to search for and linger on specific material — TikTok has improved how users can manage the algorithm and see age-appropriate content.
According to Beckerman, content relating to drugs violates TikTok's community guidelines, and 97 percent of content that violates its minor safety policies is removed proactively. These figures line up with a newly released transparency report, which details content removed from the platform between April and June 2021.
According to the report, 97.6 percent of content that violated minor safety policies was removed proactively, before users reported it, and 93.9 percent of those videos were removed with zero views. In the "suicide, self-harm, and dangerous acts" category — which includes content glorifying eating disorders — 94.2 percent of videos were proactively removed, and 81.8 percent were removed with zero views.
Senator Klobuchar went on to ask Beckerman whether TikTok had conducted any research into how the platform may push content promoting eating disorders to teens, and whether Beckerman had requested any internal studies on eating disorders before testifying. He answered no to both questions but said that TikTok consults with independent experts on these issues. Senator Tammy Baldwin (D-WI) asked each company to detail the steps it is taking to "delete information that promotes harmful body image and eating disorders and instead connect consumers to supportive services." Baldwin's inquiry focused specifically on how these companies are addressing these challenges among younger users.