On September 2, 2022, the California Age-Appropriate Design Code Act (the “Act”) was passed by the California legislature, and on September 15, 2022, it was signed into law by Governor Newsom; it takes effect July 1, 2024. The Act dramatically expands business obligations and will require entities that provide an online service, product, or feature “likely to be accessed by children” (a “Product”) to implement stringent privacy settings for users under 18. It aligns in many respects with the United Kingdom’s Age Appropriate Design Code, which took effect in 2020. Together, these laws represent a significant shift in the regulatory landscape for children’s digital services.
The overarching policy of the Act is to require such entities to prioritize the best interests of children when developing and implementing their services. The Act implements this policy through a number of stringent requirements, including using age-appropriate language in privacy notices, undertaking impact assessments covering children’s physical and mental well-being for existing and new products and services, and imposing strict default limits on such entities’ use of children’s data.
The Act differs from the existing federal framework under the Children’s Online Privacy Protection Act of 1998 (“COPPA”) in two main respects: first, it applies to all children under the age of 18, and second, it covers entities whose services are “likely to be accessed by children.” COPPA, on the other hand, applies only to children under the age of 13 and only to businesses with products and services directed at children. COPPA also reaches websites with actual knowledge that they are collecting a child’s personal information, and its framework centers on obtaining parental consent for that collection. The Act, by contrast, heightens overall privacy protections for children by default and adds affirmative obligations for businesses.
The “likely to be accessed by children” standard sweeps in online products and services that children regularly visit but that would otherwise not be covered under COPPA, such as social media platforms and video-calling or online-messaging services that are not technically “directed” at children.
Fortunately, the latest amendments to the bill clarified the factors for determining whether a Product is “likely to be accessed by children”:
- Whether the Product is directed to children, as defined in COPPA;
- Whether the Product is determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of children;
- Whether the Product advertises to children;
- Whether the Product, based on competent and reliable evidence regarding audience composition, is substantially similar to one that is routinely accessed by children; and
- Whether the Product has design elements that are known to be of interest to children, such as online games or cartoons.
Unlike COPPA, the Act includes no “actual knowledge” standard for determining whether other services or platforms fall within its scope.
The Act creates a set of affirmative obligations for businesses and prohibits certain practices.
Before offering any new online Product that is likely to be accessed by children, a business must undertake a Data Protection Impact Assessment (“DPIA”). A DPIA is a systematic survey that assesses and mitigates risks to children, including risks to their physical and mental health. It must be provided to the agency within twelve months of the Act’s enactment and reviewed every two years, or before any new features are offered.
Specifically, the DPIA must assess:
- Whether the design of the Product could harm children, including by exposing children to harmful, or potentially harmful, content on the online product, service, or feature.
- Whether the design of the Product could lead to children experiencing or being targeted by harmful, or potentially harmful, contacts on the online product, service, or feature.
- Whether the design of the Product could permit children to witness, participate in, or be subject to harmful, or potentially harmful, conduct on the online product, service, or feature.
- Whether the design of the Product could allow children to be party to or exploited by a harmful, or potentially harmful, contact on the online product, service, or feature.
- Whether algorithms used by the Product could harm children.
- Whether targeted advertising systems used by the Product could harm children.
- Whether and how the Product uses system design features to increase, sustain, or extend use of the online product, service, or feature by children, including the automatic playing of media, rewards for time spent, and notifications.
- Whether, how, and for what purpose the Product collects or processes sensitive personal information of children.
The Act also imposes other obligations and prohibitions regarding data privacy:
- Document any risk identified in the DPIA and create a timed plan to mitigate or eliminate the risk before the Product is accessed by children;
- Upon written request by the Attorney General, provide a list of all DPIAs the business has completed;
- Configure default settings to a high level of privacy protection (without specifying what “high level” means) unless the business can demonstrate a compelling reason that a different setting is in the best interests of children;
- Provide any privacy information, terms of service, policies, and community standards using clear, age-appropriate language;
- If the service allows someone, including a guardian, to monitor the child, provide an obvious signal to the child when the child is being monitored or tracked;
- Enforce business policies and standards and provide adequate tools to help users exercise their privacy rights and report concerns;
- Cannot use the personal information of a child for any reason other than the reason or reasons for which the personal information was collected;
- Cannot use the personal information of a child in a way that may harm, or that the business knows is materially detrimental to, the physical health, mental health, or well-being of a child;
- Cannot profile a child by default, unless the business can 1) demonstrate it has appropriate safeguards in place to protect children and 2) either (a) show that profiling is necessary to provide the Product requested, and then only with respect to the aspects of the Product with which the child is actively and knowingly engaged, or (b) demonstrate a compelling reason that profiling is in the best interests of children;
- Cannot collect, sell, share, or retain any personal information that is not necessary to provide the Product with which the child is actively and knowingly engaged, unless the business can demonstrate a compelling reason that the collecting, selling, sharing, or retaining of the personal information is in the best interests of children likely to access the Product;
- Cannot collect, sell, or share any precise geolocation information of a child by default, unless it is necessary for the business to provide the product or service;
- Cannot collect a child’s geolocation without providing clear notice to the child for the duration of the collection;
- Cannot use dark patterns or “nudge” techniques to encourage children to forego privacy protections;
- Cannot use or retain information used to establish a user’s age longer than is necessary.
Any business that violates the Act can be liable for a civil penalty of up to two thousand five hundred dollars ($2,500) per affected child for each negligent violation, or up to seven thousand five hundred dollars ($7,500) per affected child for each intentional violation. Any penalties and fees recovered will be deposited into the Consumer Privacy Fund.
However, if the business is in substantial compliance with its DPIA duties and other obligations, the Attorney General must first provide written notice of the violations. If, within 90 days, the business cures the violation and demonstrates measures that will prevent future violations, it shall not be liable for a civil penalty for any violation cured.
In line with the “California Effect,” this Act is expected to shift the legal landscape for children’s privacy protections and obligations for affected businesses across the US and perhaps beyond.