Children’s Privacy in 2026: From Australia’s Under-16 Social Media Ban to a Shift Beyond Notice-and-Consent in the United States

Recent developments in children’s privacy and online safety regulation reflect a global shift away from notice-and-consent frameworks toward access restrictions, design mandates, categorical advertising prohibitions, and ecosystem-level age-assurance mechanisms. Using Australia’s under-16 social media ban as a case study, this article examines four converging regulatory trends emerging across the United States, Europe, and the United Kingdom. These developments increasingly affect product design, advertising, and data governance decisions for companies operating consumer-facing digital services.

Overview

Children’s privacy and online safety regulation in Australia has moved from policy debate to operational reality in a remarkably short time frame. Since December 10, 2025, millions of accounts believed to belong to under‑16 Australian users have reportedly been mass-deleted or restricted across major platforms to comply with the Online Safety Amendment (Social Media Minimum Age) Act 2024 (amending Australia’s Online Safety Act 2021 – collectively, the Act). The Act establishes a minimum age of 16 for the use of certain social media platforms and requires providers of “age‑restricted” social media platforms to take “reasonable steps” to prevent under‑16 users from creating or keeping accounts.

The broader significance of Australia’s legislative efforts, however, extends beyond any single platform’s response. It reflects a growing willingness by some regulators to place compliance pressure not only on individual services, but also on the broader ecosystem of platforms and intermediaries by imposing bright-line age rules and design choices that shape how children engage online. Importantly, this shift is not limited to Australia. Parallel developments in the United States, together with accelerating activity in Europe and the UK, signal a global adjustment toward more prescriptive and operationally demanding frameworks for children’s privacy and online safety.

Against that backdrop, the developments discussed below illustrate how children’s privacy regulation is converging around four themes: (i) efforts to limit minors’ access to certain digital services (including social media and AI-driven products); (ii) the spread of age-appropriate design mandates; (iii) increasing restrictions on digital advertising involving known minors; and (iv) the emergence of app-store-level age verification and signaling regimes. Taken together, these developments have immediate and practical implications for companies operating across jurisdictions.

Why This Matters Now

How Lawmakers Are Trying to Restrict Minors’ Access to Digital Services

Australia’s under-16 social media restrictions are emblematic of a broader global effort to limit or condition minors’ access to certain digital services – particularly those viewed as addictive or harmful – and signal a movement away from notice-and-consent models.

In the United States, state lawmakers have followed suit, enacting access-based restrictions that limit minors’ use of social media platforms, algorithmically driven feeds, and, increasingly, AI-powered chatbots and recommendation systems. These efforts, however, have largely been stalled by First Amendment challenges – nearly all are enjoined pending the outcome of litigation (e.g., Arkansas’ Social Media Safety Act and Utah’s Social Media Regulation Act). Nevertheless, the proliferation of laws attempting to shut down minors’ access to these platforms and technologies reflects a growing desire to move beyond the notice-and-consent paradigm, particularly for minors’ data and online experiences. Whether these efforts can withstand constitutional challenges remains to be seen.

How Age-Appropriate Design Laws Are Replacing Notice-and-Consent Models

Another trend steering away from the traditional notice-and-consent paradigm is the passage of age-appropriate design requirements in several states. Such legislation mandates changes to online experiences and design based on a user’s age. California’s Age-Appropriate Design Code Act (AADC) was the first such law enacted in the U.S., although, like the access-restriction laws discussed above, its enforcement has been tied up in litigation since its passage in 2022.

Despite the challenge to California’s law, other states, including Maryland, Nebraska, South Carolina, and Vermont, have enacted or proposed similar laws. These laws generally require covered online services to consider the “best interests” of child users, implement heightened default privacy settings, assess and mitigate risks arising from product design, and limit certain data uses regardless of consent. Nebraska’s law is the only one currently in effect and not subject to a constitutional challenge; however, it does not apply broadly to all online services accessible in Nebraska. Rather, the Nebraska Age-Appropriate Design Code Act applies more narrowly to certain covered online services and emphasizes child-focused risk assessments and default privacy protections, without extending the full range of design and data-use restrictions contemplated by other state laws. South Carolina’s recently enacted Age-Appropriate Design Code Act is now the subject of a constitutional challenge alleging that the law imposes content-based restrictions on speech and is preempted by federal law.

U.S. efforts to mandate “age appropriate” design standards for minors have parallels in the UK which, several years ago, published its Age Appropriate Design Code (Code) to guide business compliance with the General Data Protection Regulation (GDPR) as it relates to minors. Indeed, members of the UK Information Commissioner’s Office (ICO) responsible for consulting on the Code worked closely with California legislators to draft that state’s design code law. Similar design-based restrictions for minors are also found in the EU Digital Services Act.

U.S. State Laws Prohibiting Digital Advertising to Known Minors

A third flank of legislative efforts to address minors’ privacy concerns is the trend toward restricting or, in some cases, prohibiting the use of digital advertising technologies on devices used by known minors under 17 or 18 years of age. In the last 12 months, several states have amended their comprehensive privacy laws to impose categorical restrictions on data practices involving known minors, including teenagers as old as 16 or 17.

One of the most restrictive efforts in this regard is Oregon’s. As of January 1, 2026, businesses subject to the Oregon Consumer Privacy Act may no longer sell the personal data of known minors (under 16) or use it for targeted advertising – and, importantly, consent is not an option. This could have a variety of operational impacts, including suppression of tracking technologies in certain contexts or the removal of certain youth-oriented audience segments from targeted advertising.

Other states, including Connecticut and New York, have adopted or proposed similar restrictions on targeted advertising, profiling, and the use of tracking technologies where a business has knowledge that a user is a minor. Rather than imposing outright prohibitions, these laws generally require opt-in consent for certain data practices involving known minors, including targeted advertising and profiling. While opt-in consent requirements for minors’ data are not new – they have long existed under laws such as the California Consumer Privacy Act (CCPA) – their practical impact has historically been limited by the lack of reliable age information. As age-assurance and age-signal mechanisms become more prevalent and reliable, these consent-based restrictions are likely to take on far greater operational significance.

App-Store-Level Age Verification and Age Signals as Compliance Triggers

It may be stating the obvious, but the age-based restrictions in these laws will not achieve their policy aims if companies do not know the age of their users. At the same time, there are privacy and security risks in having many companies process large volumes of minors’ identification documents, “selfies,” or other potentially sensitive personal data for age-verification purposes. To address this tension, Texas, Utah, California, and Louisiana have passed laws that make app store operators responsible for age assurance. At a high level, these laws require app stores to verify users’ ages or obtain parental consent and then communicate age-related signals to downstream developers.

Age-assurance requirements are also beginning to take shape through state-level implementation efforts. In California, the Attorney General has initiated rulemaking under the Protecting Our Kids from Social Media Addiction Act, including with respect to age-assurance and parental-consent requirements applicable to covered platforms. The rulemaking process reflects an effort to operationalize statutory obligations related to minors’ access to online services and highlights how age-assurance concepts may be translated into concrete compliance expectations.

As with other efforts to protect minors’ privacy rights, this area is the subject of active litigation and remains highly unsettled. Texas’s app store law was enjoined on First Amendment grounds just days before it was scheduled to take effect on January 1, 2026. App stores had already prepared to comply with the Texas law, including by providing app developers with detailed instructions on how to receive and act on age signals. Those efforts are now on hold pending the outcome of the constitutional challenge in the U.S. District Court for the Western District of Texas. Even more recently, on February 5, 2026, a complaint was filed in the U.S. District Court for the District of Utah challenging Utah’s App Store Accountability Act on similar constitutional grounds. As laws in other states come into effect, they will likely face similar challenges.

Age assurance is also top of mind for the U.S. Federal Trade Commission (FTC), which recently hosted Protecting America’s Children: A Workshop to Explore Age Verification Technologies and appears poised to support age-assurance mandates. At the January 28, 2026 event, FTC Chair Andrew Ferguson previewed that the FTC may support an amendment to the Children’s Online Privacy Protection Act (COPPA) and issue a potential policy statement on age-verification technologies, stating that the “flourishing of our nation’s children depends on the privacy of their personal data and on the capacity of parents to control who has access to their child’s data and how those data are used.”

Against this backdrop, it seems likely that age assurance will become a standard feature of the online ecosystem in the not-too-distant future. Indeed, several digital properties and platforms have already begun to voluntarily implement age-verification and age-assurance measures. The implications could be substantial: once a company receives an age signal indicating that a user is a minor, a host of state law restrictions on data use, advertising, and design may be triggered.

Conclusion

While it is unlikely the U.S. will soon see government-mandated mass deletion of teens’ social media accounts, as recently occurred in Australia, U.S. companies can expect continued efforts to innovate for online safety, legislate limits on the collection and use of minors’ personal data, and restrict, to some extent, minors’ access to certain online platforms. For consumer-facing businesses, this could require substantial investment in privacy technology and platform redesign, and limit the ability to engage in targeted teen outreach.

In response to the evolving landscape, companies may want to consider taking the following practical steps:

  1. Map minors’ access and exposure across products and features. Assess where and how minors may access their services, including through social media features, AI-driven tools, recommendation systems, and algorithmic feeds. Consider looking beyond whether a service is “child-directed” and instead focus on where minors may reasonably interact with specific functionalities that are increasingly subject to access restrictions or design mandates.
  2. Evaluate product design and defaults through an age-appropriate design lens. Review default settings, engagement mechanisms, and feature configurations to determine whether they appropriately account for minors’ best interests. This includes evaluating whether certain features should be limited, disabled, or redesigned for younger users, even where consent might otherwise be available.
  3. Reassess digital advertising and tracking practices involving minors. Review targeted advertising strategies involving teens and consider marketing plans that do not rely on the collection or use of teens’ personal data. Other states may follow Oregon’s ban on targeted advertising to minors, and existing opt-in consent requirements for known teens will apply more broadly once companies have wider “knowledge” of their customers’ ages – developments that could have an outsized impact on the ability to advertise to teens and other minors.
  4. Prepare for the downstream effects of age-signal and app-store-level verification regimes. Even though several app store age-verification laws are currently enjoined or subject to anticipated litigation, companies should plan for the operational consequences if age signals are eventually provided by app stores or operating systems. Receipt of an age signal may trigger overlapping obligations under multiple state laws, affecting data use, advertising, feature availability, and contractual arrangements with third parties.
  5. Monitor litigation and implementation time lines, not just enacted statutes. As noted, many of the laws discussed above remain entangled in constitutional challenges or are subject to delayed effective dates. Nonetheless, enforcement risk may materialize quickly once litigation is resolved. Companies should track not only enacted laws, but also key litigation developments and regulatory guidance that may signal when compliance obligations are likely to take effect.

Managing Associate Philip Robbins and Associate Brad Carney also contributed to this article.

This post is as of the posting date stated above. Sidley Austin LLP assumes no duty to update this post or post about any subsequent developments having a bearing on this post.