Google’s legislative proposal for keeping kids safe online

Everyone wants to protect kids and teens online and make sure they engage with age-appropriate content, but how it’s done matters. A variety of fast-moving legislative proposals are being pushed by Meta and other companies in an effort to offload their own responsibility for keeping kids safe onto app stores. These proposals introduce new risks to the privacy of minors without actually addressing the harms that are inspiring lawmakers to act. Google is proposing a more comprehensive legislative framework that shares responsibility between app stores and developers, and that protects children’s privacy and the decision rights of parents.

Where current legislative proposals fall short

One example of concerning legislation is Utah’s App Store Accountability Act. The bill would require app stores to share whether a user is a kid or teenager with all app developers (effectively millions of individual companies), without parental consent or rules on how the information is used. That raises real privacy and safety risks, like the potential for bad actors to sell the data or use it for other nefarious purposes.

This level of data sharing isn’t necessary — a weather app doesn’t need to know if a user is a kid. By contrast, a social media app does need to make significant decisions about age-appropriate content and features. As written, however, the bill helps social media companies avoid that responsibility despite the fact that apps are just one of many ways that kids can access these platforms. And by requiring app stores to obtain parental consent for every single app download, it dictates how parents supervise their kids and potentially cuts teens off from digital services like educational or navigation apps.

A legislative framework that better protects kids

By contrast, we are focused on solutions that require appropriate user consent and minimize data exposure. Our legislative framework, which we’ll share with lawmakers as we continue to engage on this issue, has app stores securely provide industry-standard age assurances only to developers who actually need them, and ensures that the information is used responsibly. Here are more details:

  • Privacy-preserving age signal shared only with consent: Some legislation, including the Utah bill, requires app stores to send age information to all developers without permission from the user or their parents. In our proposal, only developers who create apps that may be risky for minors would request industry-standard age signals from app stores, and the information is only shared with permission from a user (or their parent). Sharing only with developers who need the information to deliver age-appropriate experiences, and sharing only the minimum amount of data needed to provide an age signal, reduces the risk of sensitive information being shared broadly.
  • Appropriate safety measures within apps: Under our proposal, an age signal helps a developer understand whether a user is an adult or a minor; the developer is then responsible for applying the appropriate safety and privacy protections. For example, an app developer might filter out certain types of content, introduce “take a break” reminders, or offer different privacy settings when they know a user might be a minor. Because developers know their apps best, they are best positioned to determine when and where an age gate might benefit their users, and that may evolve over time, which is another reason why a one-size-fits-all approach won’t adequately protect kids.
  • Responsible use of age signals: Some legislative proposals create new child safety risks because they establish no guardrails against developers misusing an age signal. Our proposal helps to ensure that any age signals are used responsibly, with clear consequences for developers who violate users’ trust. For example, it protects against a developer improperly accessing or sharing the age signal.
  • No ads personalization to minors: Alongside any age assurance proposal, we support banning personalized advertisements targeting users under 18 as an industry standard. At Google, this is a practice we’ve long disallowed. It’s time for other companies to follow suit.
  • Centralized parental controls: Recognizing that parents sometimes feel overwhelmed by parental controls spread across different apps, our proposal would provide a centralized dashboard where parents can manage their children’s online activities across apps in one place, and which developers can easily integrate with.
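To make the first bullet’s data flow concrete, here is a minimal sketch in Python of how a consent-gated, minimized age signal could work. All names (`AppStoreAgeService`, `AgeBracket`, etc.) are hypothetical illustrations of the framework described above, not a real API: the signal is only a coarse adult/minor bracket, it is only released to apps flagged as potentially risky for minors, and only when the user (or their parent) has granted consent.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class AgeBracket(Enum):
    """Coarse signal only: no birth date, no exact age."""
    ADULT = "adult"
    MINOR = "minor"


@dataclass
class AgeSignalRequest:
    developer_id: str
    app_id: str


class AppStoreAgeService:
    """Hypothetical app-store service enforcing the framework's three rules."""

    def __init__(self, risky_apps, consents, brackets):
        self._risky_apps = risky_apps  # app ids that may be risky for minors
        self._consents = consents      # user id -> consent granted (by user or parent)
        self._brackets = brackets      # user id -> coarse AgeBracket

    def request_age_signal(self, req: AgeSignalRequest,
                           user_id: str) -> Optional[AgeBracket]:
        # Rule 1: only apps that need the signal may request it
        # (e.g. a weather app gets nothing back).
        if req.app_id not in self._risky_apps:
            return None
        # Rule 2: the user (or their parent) must have granted consent.
        if not self._consents.get(user_id, False):
            return None
        # Rule 3: share the minimum -- a coarse bracket, not a birth date.
        return self._brackets.get(user_id)


svc = AppStoreAgeService(
    risky_apps={"social.example"},
    consents={"u1": True, "u2": False},
    brackets={"u1": AgeBracket.MINOR, "u2": AgeBracket.MINOR},
)

# A weather app gets no signal, even with consent on file.
print(svc.request_age_signal(AgeSignalRequest("dev1", "weather.example"), "u1"))  # None
# No consent, no signal.
print(svc.request_age_signal(AgeSignalRequest("dev1", "social.example"), "u2"))   # None
# Risky app + consent: only the coarse bracket is released.
print(svc.request_age_signal(AgeSignalRequest("dev1", "social.example"), "u1"))   # AgeBracket.MINOR
```

The design choice worth noting is that denial and consent checks happen on the app-store side, so a developer never sees raw age data, only the minimal bracket it is entitled to.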

We’ve demonstrated our commitment to doing our part to keep kids safe online. We’re ready to build on this work and will continue engaging with lawmakers and developers on how to move this legislative framework for age assurance forward.
