Weeks after Instagram rolled out increased protections for minors using its app, Google is now doing the same for its suite of services, including Google search, YouTube, YouTube Kids, Google Assistant, and others. The company this morning announced a series of product and policy changes, some that will give younger people more privacy and protection online and others that will limit ad targeting.
The changes in Google’s case are even more expansive than those Instagram announced, as they span across an array of Google’s products, instead of being limited to a single app.
Though Congress has been pressing Google and other tech companies on the negative impacts their services may have on children, not all of the changes are required by law, Google says.
“While some of these updates directly address upcoming regulations, we’ve gone beyond what’s required by law to protect teens on Google and YouTube,” a Google spokesperson told TechCrunch. “Many of these changes also extend beyond any single current or upcoming regulation. We’re looking at ways to develop consistent product experiences and user controls for kids and teens globally,” they added.
In other words, Google is building in some changes based on where it believes the industry is going, rather than where it is right now.
On YouTube, Google says it will “gradually” start adjusting the default upload setting to the most private option for users ages 13 to 17, which will limit the visibility of videos to only the users themselves and those they directly share with, not the wider public. These younger teen users won’t be prevented from changing the setting back to “public,” but they will now have to make an explicit and intentional choice to do so. YouTube will then provide reminders indicating who can see their video, the company notes.
YouTube will also turn on its “take a break” and bedtime reminders by default for all users ages 13 to 17 and will turn off autoplay. Again, these changes are related to the default settings — users can disable the digital well-being features if they choose.
On YouTube’s platform for younger children, YouTube Kids, the company will also add an autoplay option, which will be turned off by default, so parents will have to decide whether or not they want to use autoplay with their children. The change puts the choice directly in parents’ hands, after complaints from child safety advocates and some members of Congress suggested such an algorithmic feature was problematic. Later, parents will also be able to “lock” their default selection.
YouTube will also remove “overly commercial content” from YouTube Kids, in a move that also follows increased pressure from consumer advocacy groups and childhood experts, who have long argued that YouTube encourages kids to spend money (or rather, beg their parents to do so). How YouTube will draw the line between acceptable and “overly commercial” content is less clear, but the company says it will, for example, remove videos that focus on product packaging — like the popular “unboxing” videos. This could impact some of YouTube’s larger creators of videos for kids, like multi-millionaire Ryan’s Toy Review.
Elsewhere on Google, other changes impacting minors will also begin rolling out.
In the weeks ahead, Google will introduce a new policy that will allow anyone under the age of 18, or a parent or guardian, to request the removal of their images from Google Image search results. This expands upon the existing “right to be forgotten” privacy policies already live in the E.U., introducing new controls for both kids and teenagers globally.
The company will make a number of adjustments to user accounts for people under the age of 18, as well.
In addition to the changes to YouTube, Google will restrict access to adult content by enabling its SafeSearch filtering technology by default for all users under 13 managed by its Google Family Link service. It will also enable SafeSearch for all existing users under 18 and make it the default for teens who set up new accounts. Google Assistant will enable SafeSearch protections by default on shared devices, like smart screens and their web browsers. In school settings where Google Workspace for Education is used, SafeSearch will be the default, and switching to Guest Mode and Incognito Mode web browsing will be turned off by default, too, as was recently announced.
Meanwhile, location history is already off by default on all Google accounts, but children with supervised accounts now won’t be able to enable it. This change will be extended to all users under 18 globally, meaning location history can’t be enabled at all until the children are legal adults.
On Google Play, the company will launch a new section that will inform parents about which apps follow its Families policies, and app developers will have to disclose how their apps collect and use data. These features — which were partially inspired by Apple’s App Store Privacy Labels — had already been detailed for Android developers before today.
Google’s parental control tools are also being expanded. Parents and guardians who are Family Link users will gain new abilities to filter and block news, podcasts, and access to webpages on Assistant-enabled smart devices.
For advertisers, there are significant changes in store, too.
Google says it will expand safeguards to prevent age-sensitive ad categories from being shown to teens, and it will block ad targeting based on factors like age, gender, or interests for users under 18. These changes are somewhat similar to the advertising changes Instagram introduced, as ads will no longer leverage “interests” data to target young teens and kids — but Instagram still allows targeting by age and gender, while Google will not. The advertising changes will roll out globally in the “coming months,” the company says.
All the changes across Google and YouTube will roll out globally.