Google has introduced a number of safety measures for young users, including the removal of targeted advertising and location tracking for children.
Alphabet-owned Google and YouTube introduced new policies yesterday (10 August) aimed at limiting the digital footprints of their young users while also adding additional privacy measures.
Ads targeting users under the age of 18 based on their age, gender, or interests will be blocked under the new restrictions, which will roll out over the coming months. YouTube also said it would start removing overtly commercial content from its YouTube Kids service, such as videos that focus on product packaging or actively encourage children to buy things.
Google will also disable location history for all users under the age of 18. Although location history is already turned off by default for all accounts, users under 18 will no longer be able to switch it on.
Google will allow a parent or guardian of a child under the age of 18 to flag photos of the child for removal from Google Images. As the company noted: “Of course, removing an image from search doesn’t remove it from the web, but we believe this change will help give young people more control of their images online.”
The SafeSearch feature will be expanded as well. This filter is designed to block mature content and is already enabled by default for Google Family Link users under the age of 13. Once these policies are implemented, the filter will be enabled by default for all users under the age of 18.
On YouTube, any video uploaded by a user under the age of 18 will automatically be set to private. If this setting is changed, users will be reminded about who might be able to see the video.
“Take a break” and bedtime reminders will be activated as default settings, and autoplay will be turned off for these young users, as part of YouTube’s digital wellbeing effort to prevent children from being glued to their devices. Again, these options can be switched on or off, though doing so may require parental permission.