The Legal Battles Taking Shape in the Clash Over Internet Content

A federal law known as Section 230 has provided a powerful legal shield for internet companies for nearly three decades. Designed to “promote the internet,” it protects platforms from civil liability for content posted to their sites by third parties.

Heightened Focus in the EU on the Protection of Minors Online

The protection of minors online continues to be a focus for EU regulators. Following the European Parliament's publication last year of its guidelines on online age verification methods for children, the European Commission recently announced that it will hold a dedicated stakeholder workshop in September 2024 to discuss guidelines on age verification and the protection of minors. The issue has been flagged as a priority by the European Data Protection Board (“EDPB”), guidelines and (in some cases) laws addressing it are emerging at national Member State level, and it is also a focus of the new EU Digital Services Act (“DSA”).

New Hampshire’s Comprehensive Data Privacy Legislation

As the state that hosts the headquarters of the International Association of Privacy Professionals, New Hampshire has drawn considerable attention as its comprehensive consumer data privacy law developed, with many wondering whether it might serve as a practical model for the nation. On March 6, 2024, Governor Chris Sununu signed SB 255-FN (“the Act”) into law. In some respects, New Hampshire’s privacy law is more moderate than other state laws. For instance, the New Hampshire Secretary of State’s rulemaking authority under the Act is currently limited to establishing requirements for privacy notices, a narrow grant that diverges from the broad rulemaking authority conferred by California, Colorado, and other states. The New Hampshire law does not allow for a private right of action. There is a right to cure alleged violations during the first year the law is in force; afterwards, the opportunity to cure is left to the Attorney General’s discretion. The legislation will take effect on January 1, 2025.

FTC Proposes Significant and Sweeping Changes to COPPA and Requests Public Comment

On January 11, 2024, the Federal Trade Commission (“FTC”) published in the Federal Register its Notice of Proposed Rulemaking (“NPRM”) seeking to update the FTC’s Children’s Online Privacy Protection Act (“COPPA”) Rule. Among other things, the proposed changes would require more granular privacy notices; require fairly detailed identification of, and parental consent to, third-party data sharing (including targeted advertising); expand the scope of personal information subject to COPPA; make it easier for parents to provide consent via text message; clarify various requirements around EdTech, including school authorization for parental consent; and impose significant new programmatic information security and data retention requirements.

UK and Australian Governments Sign “world-first” Online Safety and Security Memorandum of Understanding

On 20 February 2024, the UK Government and the Australian Federal Government signed a historic Online Safety and Security Memorandum of Understanding (MoU), formalising bilateral cooperation between the two countries to help strengthen their respective online safety regimes. Notably, this is the first arrangement of its kind, and the MoU is intended to encompass a wide range of online safety and security issues, including illegal content, child safety, age assurance, technology-facilitated gender-based violence, and harms caused by rapidly evolving technologies such as generative artificial intelligence.

UK’s Ofcom to Publish Guidance on Illegal Content Risk Assessments in Light of Online Safety Bill

The UK’s Online Safety Bill (“Bill”), once enacted, will impose duties of care on providers of digital services, social media platforms and other online services, making them responsible for content generated and shared by their users and requiring them to mitigate the risk of harm arising from illegal content and, where services are deemed accessible by children, to protect children from harm. As currently drafted, the Bill applies to any service or site that has users in the UK or targets the UK as a market, even if it is not based in the country. The Bill is currently at the Committee Stage of the legislative process. Although it is expected to receive Royal Assent during 2023, the timeline for when its provisions will come into force remains unclear.

The California Age-Appropriate Design Code Act Dramatically Expands Business Obligations

On September 2, 2022, the California legislature passed the California Age-Appropriate Design Code Act (the “Act”), which Governor Newsom signed into law on September 15, 2022 and which takes effect July 1, 2024. The Act dramatically expands business obligations and will force entities that provide an online service, product, or feature that is “likely to be accessed by children” (“Product”) to implement stringent privacy settings for users under 18. It aligns in many respects with the United Kingdom’s Age Appropriate Design Code, which came into force in 2020. Together, these laws represent a significant shift in the regulatory landscape for children’s digital services.

The overarching policy of the Act is to require such entities to prioritize the best interests of children when developing and implementing their services. The Act implements this policy through a number of stringent requirements, including using age-appropriate language in privacy notices, undertaking impact assessments addressing physical and mental well-being for existing and new products and services, and applying stringent default restrictions on such entities’ use of children’s data.
