CHICAGO (NewsNation Now) — Social media platforms are ramping up protections for minors. This week, Google and YouTube announced policy and product changes that will let teens stay more private online and will limit ad targeting aimed at them.
“We’re committed to building products that are secure by default, private by design, and that put people in control,” Google said in a blog post. “As kids and teens spend more time online, parents, educators, child safety and privacy experts, and policymakers are rightly concerned about how to keep them safe. We engage with these groups regularly and share these concerns.”
Google says it will allow anyone under 18, or their parent or guardian, to request the removal of their images from Google Image search results. Other protections for young users include turning off location history and blocking ad targeting based on minors’ age, gender, or interests.
The tech giant will also automatically enable “SafeSearch” to filter out explicit results. In addition, applications will be required to disclose how they use data as part of a new safety section for Google Play, highlighting which apps follow Google’s family policies.
YouTube will remove “overly commercial” videos from YouTube Kids, which YouTube says could include content that focuses solely on product packaging or “directly encourages” kids to spend money. Further protections include “take a break” and default bedtime reminders for users ages 13 to 17. In addition, the company is adding an autoplay option to YouTube Kids.
The company is also changing the default upload setting to “the most private option” for users between 13 and 17, meaning uploads can be seen only by the user and the audience they choose.
TikTok and Instagram recently implemented similar privacy protections for teens to safeguard their online experience.