The Children’s Online Privacy Protection Act (COPPA) is the cornerstone of protecting the personal data of children under the age of 13 in the United States. Since its enactment in 1998, it has set standards for how children’s data is collected, used, and shared by websites and online services. Given the sweeping digital transformations of the past decade, the Federal Trade Commission (FTC) determined that the COPPA Rule, the regulation that implements the statute, needed substantial updates to keep pace with the challenges of the digital age and better protect new generations.

The law precisely defines the protected age range: all COPPA requirements concern children who have not reached the age of thirteen. Operators of websites and online services directed to children, and operators of general-audience services with actual knowledge that particular users are under 13, must obtain verifiable parental consent before collecting, using, or sharing any personal information about those children. General-audience websites are not required to screen the ages of all users, but once an operator obtains “actual knowledge” that a user is under 13, all requirements of the law apply immediately, including notice and consent.

Historical Background and the Evolution of the Need for Change

Enacted in October 1998, effective in April 2000, and extensively amended in 2013, COPPA has remained the primary reference for regulating the collection and use of children’s data. However, recent years have witnessed rapid developments in artificial intelligence, the spread of smartphone applications and social networks aimed at children, and an increase in targeted advertising and data collection for commercial purposes. These changes prompted the FTC to launch a comprehensive review of the rule in 2019, a process that took years of discussion and gathered thousands of comments from technology companies, child rights organizations, and legal experts.

Motivations for the Update

The motivations for the update were not merely formal or technical, but came in response to increasing societal and legislative pressures. With every incident of data breach or commercial exploitation of children’s data, it became clear that the old laws were no longer sufficient. The main motivations focused on protecting children from unnecessary data collection, strengthening the role of parents in controlling their children’s information, keeping up with new technologies such as artificial intelligence, and responding to data breaches and the use of data for purposes that do not align with the best interests of the child.

Substantial Amendments to the COPPA Rule for 2025

Separate Parental Consent

Previously, the law required companies to obtain general parental consent for collecting, using, or sharing children’s data, and this consent often covered all purposes, including sharing data with third parties.

After the amendment, it is now necessary to obtain separate, verifiable parental consent specifically before sharing children’s data with third parties for non-essential purposes such as advertising or analytics. The use of the service can no longer be conditioned on parental consent for unnecessary third-party data sharing; instead, consent must be specific and clear for each type of sharing.

Procedural Detail:

The new rule requires companies to conduct a multi-step analysis to determine whether disclosing a child’s data to a third party requires separate parental consent. The company must first determine whether the disclosure falls under internal operations exceptions or is “integral” to providing the service. If not, separate, “freely given and specific” parental consent must be obtained, and the consent must be actual and clear, without using any misleading or deceptive methods in the consent process.
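The multi-step analysis described above can be pictured as a simple decision function. This is only an illustrative sketch: the category names, the function, and the example purposes are assumptions for the sake of the example, not terms defined by the COPPA Rule itself.

```python
# Hypothetical sketch of the multi-step consent analysis described above.
# Purpose categories are illustrative assumptions, not regulatory terms.

INTERNAL_OPERATIONS = {"security", "fraud_prevention", "site_maintenance"}
INTEGRAL_TO_SERVICE = {"payment_processing", "content_delivery"}

def requires_separate_consent(disclosure_purpose: str) -> bool:
    """Return True if sharing a child's data with a third party for this
    purpose would need separate, verifiable parental consent in this sketch."""
    if disclosure_purpose in INTERNAL_OPERATIONS:
        return False  # falls under internal-operations exceptions
    if disclosure_purpose in INTEGRAL_TO_SERVICE:
        return False  # "integral" to providing the service
    # Everything else (e.g. advertising, analytics) requires separate,
    # freely given and specific parental consent
    return True
```

In practice this classification is a legal judgment made case by case; the point of the sketch is only that non-essential purposes such as advertising fall through to the consent-required branch.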

Expanded Notifications and Transparency

Previously, companies often sufficed with mentioning general “categories” of third parties in privacy policies or parental notices, without needing to name each party or clarify the purpose of sharing in detail.

After the amendment, the rule requires companies to explicitly name every third party with whom children’s data will be shared in notifications and privacy policies, along with the purpose of each sharing. Merely mentioning general “categories” is no longer acceptable; disclosure must be precise and transparent about each entity and the purpose of disclosure.

Procedural Detail:

Direct parental notifications must now include a list of every third party with whom the child’s data will be shared, along with an explanation of the purpose of each sharing. Any sharing with a third party not mentioned in the notice is considered a violation of the rule, and the privacy policy must be updated with any change in the list of third parties. Any collection of persistent identifiers must also be clarified, even if it falls under exceptions, and companies must explain how they prevent the use of these identifiers to contact the child.
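The rule that any sharing with an unnamed third party is a violation lends itself to an automated consistency check between the parental notice and actual data flows. The following is a minimal sketch under assumed data structures (a notice and a sharing log, each mapping recipient name to purpose); none of these names come from the rule itself.

```python
# Illustrative compliance check: every third party actually receiving a
# child's data must appear, with the matching purpose, in the direct
# parental notice. Data structures are assumptions for this sketch.

def undisclosed_recipients(notice: dict, actual_shares: dict) -> list:
    """Return recipients that are missing from the notice, or whose
    stated purpose does not match the purpose of the actual sharing."""
    problems = []
    for party, purpose in actual_shares.items():
        if notice.get(party) != purpose:
            problems.append(party)
    return problems
```

A non-empty result would flag either a third party absent from the notice or a purpose drift, both of which would require updating the notice and privacy policy before the sharing occurs.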

Strengthening Data Security

Previously, the law required companies to take “reasonable security measures” to protect children’s data, without specifying detailed technical or administrative requirements.

After the amendment, the law obligates companies to implement written security programs, including periodic risk assessments, appointing data security officers, and updating security policies annually or as needed. Security measures must be appropriate to the sensitivity of the data and the size of the company, with a mandatory requirement to report any breaches or security incidents.

Procedural Detail:

Companies are now required to establish a written security program specifically for children’s information, reflecting the sensitivity of the data and the size of the company. An employee must be appointed to oversee the program, conduct annual risk assessments, test and monitor the effectiveness of security controls, and update the program annually or as necessary. Companies must also ensure that any third party with whom data is shared is capable of protecting the confidentiality and security of the information, and obtain contractual assurances to that effect.

Strict Limits on Data Retention

Previously, the law required companies to delete children’s data after the purpose of collection had ended, but there were no clear requirements regarding the retention period or the existence of detailed written policies.

After the amendment, children’s data must be deleted as soon as the purpose of collection has ended, indefinite retention is banned for any reason, and companies must maintain written policies that clearly specify the purposes of collection and precise deletion periods.

Procedural Detail:

The new amendment prohibits retaining children’s data indefinitely. There must be public retention and deletion policies, and data must be deleted as soon as the purpose of collection ends. This prohibition sparked debate within the FTC about its strictness, but it was emphasized that the goal is to prevent the use of children’s data in the development of artificial intelligence or unnecessary retention, while ensuring that this does not inconvenience adult users who wish to keep their digital memories.
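A written retention policy of the kind the amendment requires can be enforced mechanically: each data category carries a stated purpose and a maximum retention period, and anything outside the policy, or past its deadline, is purged. The categories, periods, and function below are illustrative assumptions, not requirements taken from the rule.

```python
# Sketch of a written retention policy enforced in code. Categories and
# periods are invented for illustration; a real policy is set by the
# company and disclosed publicly.

from datetime import datetime, timedelta

RETENTION_POLICY = {
    # category: (stated purpose of collection, maximum retention period)
    "account_profile": ("provide the service", timedelta(days=365)),
    "support_tickets": ("resolve user issues", timedelta(days=90)),
}

def is_expired(category: str, collected_at: datetime, now: datetime) -> bool:
    """True if a record must be deleted. Indefinite retention is not
    allowed, so data with no written policy entry is treated as expired."""
    policy = RETENTION_POLICY.get(category)
    if policy is None:
        return True  # no documented purpose/period -> delete
    _, max_age = policy
    return now - collected_at > max_age
```

Treating undocumented categories as expired mirrors the rule’s logic: if no written policy states a purpose and a deletion period, the data has no basis for being kept.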

Enhancing Transparency and Oversight in Safe Harbor Programs

What are Safe Harbor Programs?

Safe Harbor programs, in the context of the Children’s Online Privacy Protection Act (COPPA), are regulatory mechanisms that allow organizations or entities specializing in children’s online privacy to develop self-regulatory guidelines for compliance with the law, provided these guidelines receive formal approval from the U.S. Federal Trade Commission (FTC). If companies adhere to these approved guidelines within a Safe Harbor program, they are considered compliant with COPPA and are subject to oversight by the program operator rather than direct supervision by the FTC. These programs aim to provide an effective self-regulatory framework while maintaining the FTC’s right to monitor performance and revoke approval if necessary.

Previously, FTC-approved Safe Harbor programs had considerable flexibility in disclosing their members and policies, with no strict requirements for public posting or detailed reporting.

After the amendment, the law now requires these programs to publicly post member lists within 90 days of official publication and to provide detailed annual reports on complaints and violations, along with comprehensive reviews of members’ security and privacy policies. Safe Harbor programs are also required to conduct periodic security audits and disclose any disciplinary actions against members in annual reports submitted to the FTC, raising the level of transparency and accountability in these programs.

Additional detail:

Now, Safe Harbor programs must conduct comprehensive reviews of members’ privacy and security policies, submit all received complaints to the FTC on a regular basis, and publish a list of all members and certified products on the program’s website. They are not required to report minor technical issues that were quickly resolved, but must disclose any formal disciplinary actions. If a company fails to promptly address a violation or refuses to comply, its Safe Harbor certification can be revoked and the FTC notified for formal action.

Beyond these amendments, the rule introduces new requirements for websites and services that are partially directed to children.

A new definition for websites that are partially directed to children: Previously, there were no clear requirements for websites or services that are partially directed to children. After the amendment, these websites are now required to verify age in a neutral manner before collecting any personal data, with limited exceptions. Companies may choose innovative methods for age verification and are not required to rely solely on users’ self-declaration.

New and advanced verification methods: In the previous version, age eligibility verification methods were limited and often traditional. After the amendment, the law allows for new verification methods such as knowledge-based authentication, government ID verification with facial recognition technology, or advanced textual methods, with an emphasis on the flexibility for companies to choose the most appropriate mechanism.
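A “neutral” age screen, the baseline method mentioned above, simply asks for a date of birth without signaling the cutoff, then routes under-13 users into the parental consent flow before any personal data is collected. The following sketch assumes that flow; the function names and returned flow labels are illustrative, not taken from the rule.

```python
# Minimal sketch of a neutral age gate: compute age from a submitted
# birth date and route under-13 users to parental consent before any
# data collection. Names and flow labels are assumptions.

from datetime import date

def age_on(birth: date, today: date) -> int:
    """Whole years of age on a given day."""
    years = today.year - birth.year
    if (today.month, today.day) < (birth.month, birth.day):
        years -= 1  # birthday not yet reached this year
    return years

def gate(birth: date, today: date) -> str:
    """Decide which flow the user enters before data collection begins."""
    if age_on(birth, today) < 13:
        return "parental_consent_required"
    return "standard_flow"
```

The key design point is neutrality: the form must not hint that entering an age of 13 or above unlocks the service, otherwise the screen invites falsified answers and loses its value as an eligibility check.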

Practical and regulatory challenges and expected impacts

Although these amendments are intended to protect children, they impose significant operational and regulatory challenges on companies. Companies must continuously update their privacy policies and obtain separate, verifiable parental consent whenever a new third party is added, which increases administrative and technical complexity. Small and emerging companies in particular may struggle to implement the age-screening and security requirements, which can raise operational costs.

These amendments are expected to prompt major companies such as Google, Meta, and TikTok to review their policies and services directed at children, develop more stringent systems for obtaining separate and verifiable parental consents, and update their data retention and third-party management policies.

As for parents, they will have greater control and clearer transparency regarding how their children’s data is used, and children will be less exposed to targeted advertising and unnecessary data collection. On a global level, these amendments may encourage some other countries to adopt similar legislation, especially with international companies targeting the US market.

This means that COPPA amendments require any global company providing services or applications to children in the United States to fully comply with the law’s requirements, regardless of the company’s place of incorporation or headquarters. Therefore, the impact of these amendments extends beyond the United States and affects the policies and regulatory procedures of international companies targeting the US market.

Implementation Timeline

  • April 22, 2025: Publication of the amendments in the Federal Register.
  • June 23, 2025: Amendments take effect (60 days after official publication).
  • April 22, 2026: End of the full compliance period for most companies.
  • Safe Harbor Programs: Member lists must be published within 90 days, and updated guidelines and reports must be submitted within 6 months of official publication.

With the amendments approaching their effective date, companies must immediately begin reviewing their policies and internal systems to ensure full compliance with the new requirements and avoid regulatory risks.

Points of Convergence Between COPPA and the Saudi Personal Data Protection Law (PDPL) in Protecting Children’s Data

There are fundamental points of convergence between the amendments to the American COPPA and the policy for protecting children’s data in the Kingdom of Saudi Arabia, most notably:

Parental Consent: Both systems require obtaining verifiable consent from a parent or legal guardian before collecting or processing a child’s data. Under the Personal Data Protection Law (PDPL), this obligation extends to anyone under 18 years of age, while COPPA applies only to those under 13.

Parental Notification and Control Options: Both systems require notifying the parent or guardian of the purpose of data collection and clarifying how the data will be used, while providing clear options for controlling the processing of the child’s data.

Purpose Limitation and Data Minimization: The collection of a child’s data in both systems is limited to the minimum necessary to achieve the legitimate purpose, and it is not permitted to collect additional data that does not serve this purpose.

Right of Access, Correction, and Deletion: Both the Saudi system and COPPA grant parents the right to access their child’s data and request its correction or deletion, with a clear mechanism provided for exercising these rights.

Withdrawal of Consent: Both systems allow parents to withdraw consent for the processing of the child’s data at any time, with entities required to delete the data once the purpose has ended or consent has been withdrawn.

Differences between the Saudi Personal Data Protection Law (PDPL) and the U.S. COPPA in the field of children’s data protection

Age scope of protection

In the Saudi Personal Data Protection Law, there is no explicit provision in the main law that defines the age of a child, but the law refers to related Saudi regulations such as the Child Protection Law, which defines a child as anyone who has not reached 18 years of age. This means that protection in the Kingdom extends to all minors up to the age of 18. In contrast, the scope of protection in the U.S. COPPA is limited to children under the age of 13 only, as all obligations and regulatory requirements apply exclusively to this age group.

Legal guardian consent

The Saudi Personal Data Protection Law requires obtaining the consent of the data subject or their legal representative, i.e., the legal guardian, when processing the data of a person lacking or of diminished legal capacity, which includes minors according to relevant Saudi regulations. This condition ensures that any processing of a child’s data does not take place except after the entity has verified the legal guardian’s consent. In COPPA, verifiable parental consent is required only for children under 13 years of age, before collecting, using, or disclosing their personal information for any purpose.

Transparency and notification

The Saudi Personal Data Protection Law obliges entities to notify the data subject or their guardian of the purpose of data collection, clarify how it will be used, and provide a means to access, correct, or destroy the data. This notification must be clear and easy to understand, enabling the guardian or data subject to effectively exercise their rights. In contrast, COPPA imposes expanded transparency and notification requirements for parents, requiring companies to explicitly name all third parties and the purpose of data sharing after the recent updates, which gives parents greater control over their children’s data.

Purpose limitation and legitimate interest

The Saudi system requires that data collection be only for a legitimate purpose, and that data not be retained after the need for it has ended, enabling the data subject or guardian to request destruction at any time. This ensures that a child’s data is used only for their legitimate interest and is not stored unnecessarily. In COPPA, data collection is restricted to the authorized purpose only, with companies required to delete data after the purpose has ended, and both systems require written retention and deletion policies to ensure that children’s data is not retained unjustifiably or for long periods.

Conclusion:

This convergence in regulations reflects a growing global awareness that a child is not just an ordinary internet user, but a holder of fundamental rights to digital safety and privacy. As the digital environment continues to evolve, the importance of these regulations emerges in building a safer and more conscious future for children, and in strengthening society’s trust in a digital world governed by responsibility and humanity. Whether in the Kingdom of Saudi Arabia or in the United States, protecting children’s privacy is no longer merely a regulatory option, but has become a fundamental commitment at the core of contemporary digital regulations.
