
New laws and regulations around child safety and privacy raise significant questions

Steve Wood, Special Adviser on Data Protection // A&O Shearman

· 7 minute read


There has been a significant shift towards regulating children's safety and privacy in the digital environment, and international bodies and governments are already moving to secure children's rights and safety online.

Over the last five years, we have seen a major shift towards the regulation of children's safety and privacy in the digital environment. The shift has been driven by increasing public concern about the risks children face online and the growing realization that the internet was not designed with them in mind.

Governments and international bodies have responded to this challenge. In 2021, the United Nations issued its General Comment No. 25 on children's rights in relation to the digital environment, the adoption of which made explicit that children's rights apply in the digital world as well as the real world. These rights include the right to privacy, freedom of expression, and protection from commercial exploitation.

Also in 2021, the Organisation for Economic Co-operation and Development (OECD) produced its Recommendation on Children in the Digital Environment, which set out principles for a safe and beneficial digital environment for children, with the child's best interests as a primary consideration. The Recommendation also highlighted the need for risk-based and proportionate regulation, supported by measures that provide age-appropriate child safety by design.

With a range of legislation announced in the European Union, the United Kingdom, and the United States, it is clear that policymakers are moving away from industry self-regulation as a solution.

Europe and UK pass groundbreaking laws

The most significant new legislation comes from the EU and the UK. In 2022, the EU passed the Digital Services Act (DSA), which requires risk assessments of the impacts on children's rights online. The DSA also prohibits targeted advertising to children, and it imposes stronger obligations on the largest platforms and search services, including transparency requirements to publish risk assessments.

In the UK, the Online Safety Act (OSA) was passed in 2023, imposing duties of care on platforms that provide user-to-user services and search services. These include risk assessment duties for services that are likely to be accessed by children. The OSA also goes further than the DSA in setting out the types of content that platforms must prevent children from encountering online.


Also in the UK, the data protection regulator, the Information Commissioner's Office (ICO), has introduced the Age-Appropriate Design Code (AADC), which sets out 15 standards that online services likely to be accessed by children must follow to mitigate risks of harm related to data and privacy. These include requirements to conduct data protection impact assessments and to implement protections by default. The ICO applies the AADC when enforcing the UK General Data Protection Regulation.

A recent report, Impact of Regulation on Children's Digital Lives, tracked changes made by online platforms between 2017 and 2024, highlighting 128 changes made by Meta, Google, TikTok, and Snap. The report also discusses how these platforms have increasingly focused on safety-by-design changes, with clear links to the influence of the AADC, the DSA, and the OSA, despite the relatively early stage of implementation.

US moves towards new legislation and regulation

Meanwhile, the US has had longstanding federal law in the form of the Children's Online Privacy Protection Act (COPPA), which has been in place since 1998. COPPA imposes requirements on operators of online services directed at children under 13 years of age, and on operators of other websites or online services that have actual knowledge that they are collecting personal information online from a child under 13.

There is now debate in the US about the limitations of COPPA's safeguards, and Congress is considering the Kids Online Safety Act (KOSA), which would create a duty of care for companies operating platforms. Social media platforms would also have to provide children with tools to protect their personal data, disable addictive product features, and opt out of personalized algorithmic recommendations.


The U.S. Senate passed the bill on July 30 by a vote of 91–3, but it is still unclear whether it will pass the U.S. House of Representatives and reach the statute book before the November election. The law is somewhat controversial, as some free speech advocates are concerned that KOSA's definition of harm is too vague and could lead to censorship of information.

A number of US states have also passed their own legislation. In fact, a 2023 report from the University of North Carolina at Chapel Hill found that 13 states had passed a total of 23 laws regarding children's online safety.

Not all of these laws, however, have passed judicial muster. Most famously, the California Age-Appropriate Design Code Act, which was modeled on the UK approach and passed in 2022, was blocked by a federal court a year later on the grounds that its requirements could violate companies' First Amendment free speech rights. California Attorney General Rob Bonta has appealed the preliminary injunction to the U.S. Court of Appeals and awaits a final judgment. (Maryland also passed its own version of the Age-Appropriate Design Code in May 2024.)

Conclusion

With other countries, such as Canada and Brazil, also considering similar laws and regulations, there is a clear direction of travel towards regulating children's safety and privacy online. This creates responsibilities for companies that provide online services to assess the harms and risks to children and to develop solutions by design that mitigate those risks. Further, these new duties will require companies to develop child rights impact assessments for their online services and for new features and design changes. Companies will also have to demonstrate how their design solutions effectively protect children in practice.

The new laws and regulations are also pushing companies to use age-assurance technologies, to test and understand their effectiveness, and to consider the consequences for privacy and the overall user experience.

In response to these new regulations, many companies are developing new governance methods, policies, and work processes to ensure that the requirements are embedded into their design and engineering practices. Providing evidence and documentation will also be crucial to demonstrating compliance and accountability to regulators.

Indeed, many of these regulations are still new, and companies are seeking further guidance about what the requirements mean in practice. We also await the first case decisions from regulators, which will illustrate where they set the bar on unacceptable risks and practices among online platforms.

Alongside these questions, the new focus on content risks, not just data protection and privacy risks, will create significant challenges for platform operators in balancing new regulatory requirements with their users' expectations of freedom of expression.