Google announced a slew of online safety measures for kids, including a private setting for teen-uploaded videos and a safeguard for ads shown to users under the age of 18. The new features, which come amid increased concerns about online child exploitation and safety during the global pandemic, affect Google’s YouTube video platform as well as its online services such as search and Google Assistant.
“As children and teenagers spend more time online, parents, educators, child safety and privacy experts, and policymakers are rightly concerned about how to keep them safe,” said Mindy Brooks, Google product and user experience director. “We interact with these groups on a regular basis and share their concerns.”
Google’s SafeSearch, which filters out sensitive or mature content, will become the default setting for users under the age of 18; previously it was the default only for users under 13. According to the tech giant, uploads from 13-to-17-year-olds will also be private by default on the massively popular YouTube platform.
“With private uploads, content can only be seen by the user and whomever they choose,” wrote James Beser, YouTube Kids and Family’s head of product management, in a blog post. “We want to assist younger users in making informed decisions about their digital footprint and privacy… If the user wants to make their content public, they can change the default upload visibility setting, and we’ll send them reminders about who can see it.”
Google will also make it easier for families, or the children themselves, to request that images of a minor be removed from Google Images results. “Of course, removing an image from search does not remove it from the web,” Brooks explained, “but we believe this change will help give young people more control over their images online.”
The new policies, which will roll out over the next few months, will prohibit ad targeting on Google of users under the age of 18 based on their age, gender, or interests, and will block “age-sensitive” ad categories for these users. YouTube will also begin removing excessively commercial content from its YouTube Kids feature, such as videos that focus on product packaging or directly encourage children to buy products.
Google is also scrapping Location History for all users under the age of 18. While the feature is already turned off by default for all accounts, people under 18 will no longer be able to turn it on.
YouTube does not permit content that endangers minors’ emotional or physical well-being. A minor is anyone under the legal age of majority, which in most countries and regions means under 18. If you come across content that violates this policy, please report it. If you believe a child is in danger, contact your local law enforcement immediately and report the situation.
“Take a Break” and bedtime reminders will be enabled by default. As part of the platform’s digital wellbeing effort to keep children from becoming engrossed in their screens, autoplay will also be disabled for these young users. These settings can be toggled on or off, though doing so may require parental consent.
Alphabet’s decision to prioritize the safety and privacy of its young users comes amid a storm of controversy around child safety, with Apple recently announcing controversial new technology aimed at detecting child sexual abuse material.