New York Maps Out Youth Online Safety Rules

Pew Research Center finds that 46% of U.S. teens say they are online almost constantly, and majorities use TikTok, Snapchat, and Instagram. Against that backdrop, New York’s attorney general has released draft rules that spell out how the state’s 2024 youth online safety measures will work. The proposal centers on age checks, limits on algorithmic recommendation feeds and late-night push alerts for minors, and a built-in parental permission system.

What the proposal would require

The rules implement the SAFE for Kids Act alongside the New York Child Data Protection Act, both enacted in June 2024. Services would need to determine a user’s age before offering algorithmic recommendation feeds or sending push notifications during nighttime hours. Companies can choose how to verify age, as long as the method reliably distinguishes minors from adults and safeguards privacy and security. The draft makes clear that government ID cannot be the sole or default mechanism.

If a minor wants an algorithmic feed or after-hours alerts, the platform must seek permission from a parent or guardian. If permission is not granted, the service cannot deny the young person general access to the app or its content. Either the minor or the parent can withdraw consent at any time, and the platform must honor that choice without delay. The attorney general’s office says the approach aims to give families meaningful control without forcing teenagers off services entirely.

What this means for your apps

For users identified as minors who do not have parental permission, chronological or otherwise non-curated feeds are likely to become the default. Nighttime push notifications would be switched off by default for those accounts, which could reduce sleep disruptions for teens who keep their phones nearby. Expect new prompts and dashboards that help parents and guardians approve or reject requests for recommendation feeds and after-hours alerts. The draft envisions a clear path to modify or revoke choices, which means services will need to build consent workflows that are easy to find and use.
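To make those defaults concrete, here is a minimal TypeScript sketch of how a service might gate the recommendation feed and overnight pushes on age status and parental consent. The type names, the consent flags, and the midnight-to-6 a.m. window are assumptions for illustration, not requirements quoted from the draft.

```typescript
// Hypothetical sketch of consent-gated defaults for accounts flagged as minors.
// Type names, consent flags, and the overnight window are illustrative assumptions.

type FeedMode = "algorithmic" | "chronological";

interface AccountState {
  isMinor: boolean; // outcome of whatever age-assurance method the platform adopts
  parentalConsent: {
    algorithmicFeed: boolean;        // guardian approved the recommendation feed
    overnightNotifications: boolean; // guardian approved late-night push alerts
  };
}

// Non-curated feed is the default for minors without guardian approval.
function resolveFeedMode(account: AccountState): FeedMode {
  if (!account.isMinor) return "algorithmic";
  return account.parentalConsent.algorithmicFeed ? "algorithmic" : "chronological";
}

// Overnight pushes are suppressed for minors unless a guardian has opted in.
function canSendPush(account: AccountState, localHour: number): boolean {
  const isOvernight = localHour >= 0 && localHour < 6; // assumed window
  if (!account.isMinor || !isOvernight) return true;
  return account.parentalConsent.overnightNotifications;
}

// Withdrawal of consent takes effect immediately: the next feed request and
// push check simply see the updated flags.
function revokeConsent(account: AccountState): AccountState {
  return {
    ...account,
    parentalConsent: { algorithmicFeed: false, overnightNotifications: false },
  };
}

// Example: a teen account with no guardian approvals.
const teen: AccountState = {
  isMinor: true,
  parentalConsent: { algorithmicFeed: false, overnightNotifications: false },
};
console.log(resolveFeedMode(teen)); // "chronological"
console.log(canSendPush(teen, 1));  // false: 1 a.m. falls in the assumed window
```

A real implementation would also need to record who granted or revoked consent and when, since the draft expects changes to take effect without delay.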

Age checks will need to be accurate without becoming intrusive. The proposal encourages privacy-preserving approaches and discourages any solution that relies on a single, sensitive document. Platforms should be thinking about methods that minimize data collection and limit exposure of personal information, while still meeting a high bar for reliability. That may require partnerships with third-party age assurance providers, changes to onboarding, and new internal audits to document how systems work.
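As a rough sketch of that data-minimization principle, the example below keeps only the outcome of an age check, which method produced it, and when it ran. The method names, storage shape, and helper are hypothetical; the underlying evidence is assumed to stay with the third-party verifier rather than the platform.

```typescript
// Hypothetical sketch of data-minimizing age assurance. Method names are
// assumptions for illustration and do not come from the proposed rules.

type AssuranceMethod =
  | "facial-age-estimation"
  | "parent-attestation"
  | "payment-card-check"
  | "id-document";

interface AssuranceResult {
  isMinor: boolean;        // the only age signal the platform keeps
  method: AssuranceMethod; // retained so audits can show how the check was done
  checkedAt: string;       // ISO timestamp, useful for re-verification policies
}

// In-memory store standing in for the platform's user database.
const assuranceStore = new Map<string, AssuranceResult>();

// Record only the outcome of the check. The underlying evidence (selfie,
// ID scan, date of birth) stays with the verifier and is never persisted here.
function recordAssurance(
  userId: string,
  isMinor: boolean,
  method: AssuranceMethod
): AssuranceResult {
  const result: AssuranceResult = {
    isMinor,
    method,
    checkedAt: new Date().toISOString(),
  };
  assuranceStore.set(userId, result);
  return result;
}

// Example: a third-party provider returns only a yes/no signal, which we record.
recordAssurance("user-123", true, "facial-age-estimation");
console.log(assuranceStore.get("user-123"));
```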

Why New York is doing this

State officials are linking the rules to wider concerns about youth mental health and the role of engagement-driven design. The U.S. Surgeon General has warned about potential harms when social media shapes sleep, attention, and body image. CDC survey data show that 42% of U.S. high school students reported persistent feelings of sadness or hopelessness in 2021, including 57% of teen girls. At the same time, social media is nearly universal in teens’ lives, with Pew reporting that about 95% of U.S. teens use YouTube and that large shares use other major apps each week.

The attorney general’s framework attempts to thread a needle. It targets design features that keep kids engaged late at night or draw them deeper into feeds based on opaque signals, while preserving access to information, communities, and school resources. By anchoring decisions in consent, the state is signaling that family preferences should direct the use of high-engagement features for minors.

If adopted, the rules will require platforms to implement age confirmation, consent, and notification systems within whatever compliance timeline accompanies the final version.

How New York fits into the state policy wave

New York joins a growing number of states testing limits on how social platforms interact with minors. Utah enacted a package in 2023 that requires age verification and parental consent for teen accounts, and it updated its framework in 2024 with rules slated to take effect in 2025. Florida passed a law in 2024 that restricts accounts for users under 14 and requires parental consent for ages 14 and 15. Arkansas and Ohio adopted similar parental consent measures that have been put on hold by federal courts during legal challenges. California’s Age-Appropriate Design Code was signed in 2022 but is currently enjoined pending litigation. Lawmakers in states such as Minnesota and New Jersey have introduced bills inspired by children’s design standards and teen privacy provisions.

New York’s approach is notable for preserving general access even when a minor lacks parental approval for enhanced features. It also emphasizes flexible, privacy-preserving verification. Those choices could influence how other states calibrate proposals that focus on recommendation feeds and always-on notifications.