What is the legal age of adulthood/ majority in your jurisdiction? Are all persons below this age considered a child/ minor?
There is no harmonised legal age of adulthood/ majority at EU level – the EU has not established a uniform age of majority across Member States, leaving this to national legislation. Generally speaking, however, most Member States set this age at 18 in line with Article 1 of the UNCRC.
Has the UNCRC been directly incorporated into national law in your jurisdiction?
No.
The UNCRC is not directly incorporated into EU law as the EU has not formally acceded to the UNCRC as a party in its own right. This is different from individual EU Member States, all of which have ratified the UNCRC. However, the EU does recognise the principles of the UNCRC through Article 3(3) of the Treaty on European Union, which states that the EU must promote the “protection of the rights of the child.” The EU Charter of Fundamental Rights also explicitly recognises the rights of the child at Article 24, which draws directly on UNCRC principles. Many UNCRC principles have also been implemented through various EU regulations and directives including, amongst others, the GDPR, the Digital Services Act and the EU Artificial Intelligence Act (Regulation 2024/1689). Finally, the Court of Justice of the European Union has often referenced the UNCRC in its judgments, thus giving further effect to the UNCRC principles in EU case law. Thus, while the UNCRC has not been formally or fully incorporated into EU law as a single legal instrument, its principles and many of its provisions have been integrated into the EU legal framework through a variety of mechanisms.
Is there an ombudsperson/ commissioner for children in your jurisdiction?
No.
There is no dedicated EU ombudsperson specifically for children at an EU institutional level. Note, however, that many individual EU Member States do have national ombudspersons for children (or similar institutions) at national level. While not an EU body in and of itself, the European Network of Ombudspersons for Children is a not-for-profit association of independent children’s rights institutions (ICRIs), which links ICRI offices established in Council of Europe member states in order to facilitate the promotion and protection of children’s rights as set out in the UNCRC.
If there is an ombudsperson/ commissioner for children in your jurisdiction, do they have any responsibility for upholding children’s rights in the digital world or does the relevant regulator have sole competence?
No.
N/A.
Is there any standalone requirement to collect the consent of one or both parents when processing a child’s personal data (irrespective of any other obligations, e.g. the requirement to have a legal basis for processing)?
No.
The requirement to collect parental consent under Article 8(1) of the GDPR only applies where the service provider is relying on consent as the legal basis under the GDPR for processing a child’s personal data. If another legal basis is relied on for processing, there is no requirement to collect parental consent for processing the child’s data. See further information here.
Entirely separate from the requirement to collect parental consent under Article 8(1), a digital service provider (controller) may decide to seek parental permission for a child to access different settings/ features/ functionalities etc. as part of the measures it implements under Articles 24 and 25 of the GDPR to ensure a high level of protection for child users.
At what age can children legally consent to the processing of their own personal data, such that parental permission/ consent is not required?
In the online context, Article 8 of the GDPR requires that if consent is relied upon by an information society service (e.g. an online or digital service) as the relevant legal basis under Article 6 of the GDPR to process the personal data of a child, then the consent of the holder of parental responsibility (i.e. the parent/ guardian of that child) must be obtained where the child is under the so-called age of “digital consent” in the relevant EU Member State.
Article 8(1) of the GDPR sets the age of digital consent at 16 by default but allows each EU Member State to provide for a lower age, provided it is not below 13; the applicable age therefore varies between 13 and 16, depending on the age the EU Member State in question has adopted. Under Article 8(3), this does not affect the general contract law of the EU Member State in question, such as the rules on the validity, formation or effect of a contract in relation to a child.
Are there specific requirements in relation to collection and/or verification of parental consent/ permission concerning the processing of a child’s personal data?
Yes.
Where consent is relied upon as a controller’s legal basis under Article 6(1)(a) of the GDPR in respect of the processing of children’s data (see more information here), Article 8(2) requires that the controller must “make reasonable efforts” to verify that consent has been given or authorised by the parent / guardian, taking into consideration available technology. Per the European Data Protection Board’s Guidelines 05/2020 on consent under Regulation 2016/679 (Adopted on 4 May 2020), such measures must be proportionate to the nature and risks of the processing activities.
Are there any particular information or transparency requirements concerning the processing of children’s personal data?
Yes.
The Article 29 Working Party Guidelines on transparency under Regulation 2016/679 (endorsed by the European Data Protection Board) (the “Transparency Guidelines”) specifically require controllers who either target children or who are or should be aware that their goods or services are being utilised by children to ensure that “the vocabulary, tone and style of the language used is appropriate to and resonates with children so that the child addressee of the information recognises that the message/ information is being directed at them.” The Transparency Guidelines further confirm that controllers are obligated to ensure, where they “target children or are aware that their goods or services are particularly utilised by children of a literate age, that any information and communication should be conveyed in clear and plain language or in a medium that children can easily understand.” In practice, these obligations will require controllers to assess the types of measures which should be implemented in order to ensure that transparency information is particularly accessible to children which, per the Transparency Guidelines, may include the use of comics/ cartoons, pictograms, animations etc.
Can children directly exercise their rights in relation to their personal data without the involvement of their parents?
Yes.
At EU level, the GDPR does not specifically address the exercise of a data subject’s rights by another person on their behalf (here, a parent/ guardian acting for a child). Individual EU Member States may have specific rules concerning the issue. However, the European Data Protection Board has issued Guidelines on Data Subject Rights – The Right of Access, which contain guidance briefly addressing this issue. These Guidelines specifically note that children are data subjects in their own right and, as such, the right of access belongs to the child. However, depending on the maturity and capacity of the child, the Guidelines also recognise that the holder of parental responsibility may need to act on behalf of the child in this regard. When developing procedures in this regard, the Guidelines state that controllers should have particular regard to the best interests of the child principle contained in Article 3(1) of the UNCRC, in particular where the right of access is exercised on behalf of the child, for example by the parent/ guardian.
Can children make complaints on their own behalf directly to your national data protection/ privacy regulator(s)?
Yes.
Article 77 of the GDPR grants every data subject the right to lodge a complaint with a supervisory authority. This right is not explicitly restricted by age, meaning that children theoretically can make complaints on their own behalf to supervisory authorities. Individual EU Member States may have specific rules concerning the issue.
Generally speaking, however, whether complaints can be made on their own behalf directly by children will depend on the capacity and best interests of the child. In practice, this means that older children may be able to lodge complaints directly whereas younger children will often require adult assistance or representation. The particular practice of each supervisory authority may vary by reference to specific national guidance and/ or specific national rules/ procedures for handling complaints from children.
Are there any particular requirements/ prohibitions related to:
a. processing specific types of children’s personal data;
b. carrying out specific processing activities involving children’s personal data; and/ or
c. using children’s personal data for specific purposes.
Yes.
a. processing specific types of children’s personal data; There have been several decisions issued by the Irish Data Protection Commission (DPC) as the lead supervisory authority under the GDPR. These decisions reflect the outcome of the Article 60 GDPR co-decision making process involving all of the members of the European Data Protection Board (EDPB) and indicate that making children’s online accounts and/or contact details public-by-default should be avoided. The DPC has also emphasised the importance of providing appropriate transparency information to children concerning their use of specific features of a service where such use may have privacy consequences for them, as well as adhering to the data protection by design principle and the controller’s obligation to take account of the varying likelihood and severity of risks to data subjects (here, child users of a service).
b. carrying out specific processing activities involving children’s personal data; and/ or Decisions from the DPC referred to above indicate the view of the EDPB and DPC that, amongst other things, data protection assessments should be carried out in relation to: the use of age assurance/ age verification mechanisms; the use of the service by under-18 users; and the potential risks posed to underage users (i.e. those not permitted by the terms of service) by the service. The EDPB has indicated as part of the decision making process in one case that it does not consider neutral age gates to be effective forms of age verification.
c. using children’s personal data for specific purposes.
Recital 71 of the GDPR states that measures relating to solely automated decision-making, including profiling, which produce legal or similar effects should not be used for children. In its Guidelines on Automated Individual Decision making and Profiling, the EDPB has stated that exceptions to the rule against this type of processing should not be relied upon in relation to the processing of children’s personal data other than in limited circumstances necessary to protect their welfare. In practical terms, this means that there is a very high threshold to meet before children’s personal data can be used for automated decision-making or profiling. (In addition, Article 28 of the DSA which applies to online platforms prohibits advertising based on profiling to children. For further information, see the response to this question.)
Has there been any enforcement action by your national data protection/ privacy regulator(s) concerning the processing of children’s personal data? In your answer, please include information on the volume, nature and severity of sanctions.
Yes.
See the information here concerning decisions issued by the Irish Data Protection Commission (DPC) as the lead supervisory authority under the GDPR. These reflect the outcome of the Article 60 GDPR co-decision making process which involves the members of the European Data Protection Board.
Sanctions imposed by the DPC in these cases have involved the issuing of reprimands, orders to bring processing into compliance and fines in the hundreds of millions of Euro.
Are there specific rules concerning electronic direct marketing to children?
No.
Where unsolicited direct marketing to any person is carried out through the sending of electronic mail (i.e. any text, voice, sound or image message, including SMS text messages), the ePrivacy Directive (as transposed in the relevant EU Member State) will apply to such communications. The ePrivacy Directive is not specifically directed at communications made to children but, regardless of whether the communication is sent to an adult or a child, the general rule is that the consent of the individual recipient is required. This must be GDPR-standard consent, although there are exemptions which may be relied upon in limited circumstances.
National rules which transpose the ePrivacy Directive set out specific requirements as regards direct marketing; these vary across the respective EU Member States.
Are there specific rules concerning the use of adtech tracking technologies, profiling and/or online targeted advertising to children?
Yes.
The Digital Services Act (“DSA”) prohibits online platforms from targeting advertising at children based on profiling (as defined in Article 4(4) GDPR) which uses their personal data where the online platform in question is aware with reasonable certainty that the recipient is a child. This has been confirmed by the European Commission in its Article 28 Guidelines. (For further information on the Article 28 Guidelines, see the response to this question.)
In addition, Recital 38 of the GDPR notes that the specific protection which children merit with regard to their personal data particularly applies in the context of the use of personal data of children for the purposes of marketing. In its 2013 Opinion on Apps on Smart Devices, the European Data Protection Board’s (“EDPB”) predecessor, the Article 29 Working Party, stipulated that, in the best interests of the child, organisations should not either directly or indirectly process children’s personal data for behavioural advertising purposes, as to do so would be outside the scope of a child’s understanding and therefore constitute unlawful processing.
The EDPB has reiterated this principle in its Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679, in which it states that organisations should, in general, avoid profiling children for marketing purposes, due to their particular vulnerability and susceptibility to behavioural advertising.
Further, Article 6a of the Audiovisual Media Services Directive (Directive 2010/13/EU) (“AVMSD”), as amended, provides that personal data of minors collected or otherwise generated by media service providers for the purposes of complying with the separate obligation to protect children from audiovisual media services which may impair their physical, mental or moral development must not be processed for commercial purposes, such as direct marketing, profiling and behaviourally targeted advertising. These rules have been transposed in national laws concerning the regulation of audiovisual media services across Member States.
Are there specific rules concerning online contextual advertising to children?
Yes.
Contextual advertising is generally understood to be advertising that does not involve the processing of personal data; however, where it does involve such processing, the rules of the GDPR (and of the relevant EU Member State laws implementing the GDPR) apply.
Contextual advertising is subject to general EU rules, including the Unfair Commercial Practices Directive (UCPD) and the Audiovisual Media Services Directive (AVMSD), which prohibit unfair, misleading, or aggressive practices that exploit the inexperience or credulity of minors. (See more information here).
Advertisers must also ensure that such communications are clearly recognisable as commercial content and do not encourage unsafe or socially irresponsible behaviour.
In this context, contextual advertising must not rely on or include so-called “dark patterns” – manipulative design choices aimed at exploiting users’ vulnerabilities, including those of children. While there is currently no express ban on dark patterns at EU level, the European Commission is actively preparing the Digital Fairness Act, expected in 2026, which is likely to introduce explicit prohibitions on dark patterns targeting minors. In the meantime, such practices may already be prohibited under the UCPD where they amount to misleading or aggressive commercial practices, and some EU Member States have case law or sectoral guidance on this point.
Finally, contextual advertising to children must also comply with any applicable EU or national legislation or local standards that prohibit or restrict advertising of certain products to minors – including, for example, alcohol, gambling, sugary drinks, or age-restricted content. These sector-specific rules vary across jurisdictions and must be taken into account when designing advertising aimed at, or accessible by, children.
Has there been any regulatory enforcement action concerning advertising to children? In your answer, please include information on the volume, nature and severity of sanctions.
Yes.
At EU level, the European Commission has launched investigations under the EU Digital Services Act into a number of online platforms concerning, among other things, advertising practices likely to exploit minors’ vulnerabilities, including the design of algorithmic systems that may foster addictive behaviour. For further information, see the response to this question.
At national level, various audiovisual regulators have taken action under transposed Audiovisual Media Services Directive (AVMSD) provisions.
Additionally, the Consumer Protection Cooperation (CPC) Network coordinated enforcement against an online game, due to misleading advertising targeting children. This resulted in changes to its commercial practices. These cases illustrate increased scrutiny of advertising to children across the EU, with sanctions ranging from binding commitments to administrative fines.
Moreover, in 2025, the ICPEN Sweep revealed widespread manipulative design practices in mobile and online games targeting children, including sneaking, nagging, and obstruction techniques. These practices are already prohibited under the UCPD where they amount to unfair commercial practices, and their identification has reinforced the EU-level push for stricter future regulation, including under the forthcoming Digital Fairness Act.
At what age does a person acquire contractual capacity to enter into an agreement to use digital services?
EU law does not harmonise general contract law or the age of contractual capacity, meaning that the age at which a person acquires contractual capacity in this regard varies across Member States. Generally speaking, however, most Member States set this age at 18.
Do consumer protection rules apply to children?
Yes.
The Unfair Commercial Practices Directive (Directive 2005/29/EC) (UCPD)
The UCPD applies to business-to-consumer commercial practices, including online targeted advertising using personal data, regardless of whether the consumer is an adult or a child. Advertisements must not breach professional diligence or significantly distort the economic behaviour of the average consumer, including when targeting specific consumer groups.
Misleading advertisements – whether through false claims, deceptive presentation, or omission of material information – are prohibited if they cause or are likely to cause the consumer to take a transactional decision they would not have taken otherwise. This includes misrepresenting a product’s nature, characteristics, or availability. An advertisement can also be misleading through the omission of material information which an average consumer would take account of when making a transactional decision.
For further information on misleading practices, see the response to this question.
The Audiovisual Media Services Directive (Directive 2010/13/EU), as amended (AVMSD)
The AVMSD addresses “audiovisual commercial communications” (defined as “images with or without sound which are designed to promote, directly or indirectly, the goods, services or image of a natural or legal person pursuing an economic activity; such images accompany, or are included in, a programme or user-generated video in return for payment or for similar consideration or for self-promotional purposes. Forms of audiovisual commercial communications include television advertising, sponsorship, teleshopping and product placement”).
Consumer Rights Directive (2011/83/EU)
While the Consumer Rights Directive (2011/83/EU) does not contain child-specific provisions, its protections apply to all consumers. Children may benefit from these rules where they are capable of entering into contracts under national law.
All the above Directives have been transposed across the EU and provide a harmonised level of protection for consumers, including minors.
Are there any consumer protection rules which are specific to children only?
Yes.
The Unfair Commercial Practices Directive (Directive 2005/29/EC) (UCPD)
Under the UCPD, including in an advertisement a direct exhortation to children to buy advertised products, or to persuade their parents or other adults to buy advertised products for them, is an aggressive commercial practice and is thus prohibited.
Additionally, if a commercial practice is likely to materially distort the economic behaviour of a clearly identifiable group of consumers who are particularly vulnerable to the practice or the underlying product by reason of, among other things, their age, in a way which the trader could reasonably be expected to foresee, then it must be assessed from the perspective of the average member of that group (without prejudice to the common and legitimate advertising practice of making exaggerated statements or statements which are not meant to be taken literally).
The Audiovisual Media Services Directive (Directive 2010/13/EU), as amended (AVMSD)
The AVMSD prohibits audiovisual commercial communications that may harm minors physically, mentally, or morally. Audiovisual commercial communications must not directly exhort minors to buy or hire a product or service by exploiting their inexperience or credulity, directly encourage them to persuade their parents or others to purchase the goods or services being advertised, exploit the special trust minors place in parents, teachers, or other persons, or unreasonably show minors in dangerous situations. When assessing a trader’s compliance with these rules, account must be taken of the use the trader makes of the personal data of the persons concerned. This is a broad prohibition which requires a granular and rigorous consideration of the advertisement in question.
The AVMSD also contains stricter requirements in Article 6a, which are aimed at ensuring the safety of minors. Article 6a requires Member States to take appropriate measures to ensure that harmful content is made inaccessible to minors through measures such as age verification or time restrictions. The most harmful content – such as gratuitous violence or pornography – must be subject to the strictest measures.
Digital Services Act (DSA)
In addition, Article 28(1) of the DSA requires providers of online platforms that are accessible to minors to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors on their service. The European Commission’s Article 28 Guidelines (finalised in July 2025) specifically address commercial practices, stating that online platforms should adopt the following measures (among others) in order to ensure that their commercial practices comply with Article 28 DSA:
• prevent the exploitation of minors’ lack of commercial literacy by reference to their age, vulnerabilities and limited capacity for critical engagement;
• ensure that minors are not exposed to harmful, unethical and unlawful advertising;
• ensure that minors are not exposed to AI systems integrated in the platform that influence or nudge children for commercial purposes, particularly through conversational or advisory formats such as chatbots;
• ensure child-friendly and accessible transparency as regards commercial communications on the platform – these should be clearly visible and consistent;
• prevent minors’ exposure to marketing and communications of products or services that could adversely impact their privacy, safety and security;
• ensure minors are not exposed to hidden or disguised advertising, whether placed by the online platform provider or other users;
• ensure age-appropriate transparency of economic transactions, avoiding the use of certain virtual currencies and other tokens or coins that can be exchanged for real money and used to purchase virtual items (thus causing unwanted spending);
• prevent minors’ exposure to in-app or in-game purchases that are or appear to be necessary when accessing purportedly free services;
• prevent minors’ exposure to practices that can lead to excessive/ unwanted spending or addictive behaviours, e.g. loot boxes;
• prevent minors’ exposure to manipulative design techniques;
• prevent minors’ exposure to unwanted purchases; and
• review the platform’s policy on allowing children to make purchases – this should be based on the evolving capacities of the child and consider that younger children may not understand money or spending, so they should not be allowed to take part in financial transactions.
These rules establish a harmonised EU framework that recognises children as a particularly vulnerable group. In particular, Annex I of the UCPD prohibits direct exhortations to children, and the AVMSD lays down detailed safeguards against exploitative or harmful audiovisual commercial communications involving minors.
In addition, the 2025 ICPEN Sweep involving more than 20 national consumer authorities identified manipulative design practices (sneaking, nagging, obstruction) in hundreds of online and mobile games, many of which are accessible to children. Although these practices are already covered by the UCPD prohibition on unfair practices, the sweep has intensified enforcement attention at both EU and national levels and may support calls for the development of future harmonised legislation under the proposed Digital Fairness Act.
Has there been any regulatory enforcement action concerning consumer protection requirements and children’s use of digital services? In your answer, please include information on the volume, nature and severity of sanctions.
Yes.
The European Commission and the Consumer Protection Cooperation (CPC) Network have pursued enforcement actions in this area.
The CPC Network has taken a case against a games platform concerning the use of pressure tactics and unclear in-game purchasing mechanisms directed at children. The investigation resulted in the platform committing to improve transparency and to disable prompts which were alleged to be misleading.
These actions are part of a broader EU-level enforcement effort to address allegedly unfair or manipulative commercial practices in digital services accessed by minors, whether via national consumer authorities or under the Digital Services Act (DSA).
Under the DSA, the Commission has opened proceedings against a number of online platforms related to design features that may mislead or manipulate minors, including recommender systems and advertising interfaces. Investigations also concern the alleged lack of effective age verification tools.
In May 2025, the European Commission opened formal proceedings under the DSA against four major adult websites, identifying potential shortcomings in age verification mechanisms meant to prevent minors’ access, including reliance on one-click self-declarations.
For further information in relation to these DSA investigations, see the response to this question.
National regulators have also addressed in-app purchases and dark patterns in children’s services. While sanctions vary, they increasingly involve binding commitments, transparency obligations and potential financial penalties.
Are there any age-related restrictions on when children can legally access online/ digital services?
Yes.
There are EU-level laws which implement age-related restrictions on the ages at which children can legally access specific goods and services. The following is a non-exhaustive list of goods and services to which such restrictions apply:
- gambling;
- alcohol, cigarettes, vaping; and
- pornography.
To the extent such goods and services are accessed online by children, such restrictions will generally also apply.
See also the information here.
Are there any specific requirements relating to online/ digital safety for children?
Yes.
The EU approach to online/digital safety is informed by the European strategy for a better internet for kids (BIK+), which was published in May 2022. BIK+ is centred around three pillars:
1. Safe Digital Experiences: protecting children from harmful and illegal online content, conduct and risks.
2. Digital Empowerment: enabling all children (including those in situations of vulnerability) to acquire necessary skills and competences to make sound choices and express themselves safely and responsibly online.
3. Active Participation: respecting children by giving them a say in the digital environment.
In tandem with the BIK+, a number of EU laws and regulations specifically deal with online/ digital safety of children, albeit that these terms are not defined in EU law:
The Digital Services Act (DSA)
The DSA applies to intermediary services offered to recipients of the service that have their place of establishment or are located in the EU, irrespective of where the providers of those intermediary services have their place of establishment.
Article 28(1) of the DSA requires providers of online platforms that are accessible to minors to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors on their service. In July 2025, the European Commission published Guidelines on measures to ensure a high level of privacy, safety and security for minors (Article 28 Guidelines). The Article 28 Guidelines confirm that an online platform may be considered to be accessible to minors (1) even where the relevant terms of service provide for a minimum user age of 18; or (2) when the provider is otherwise aware that minors are on the platform. In addressing requirements around the online/ digital safety of children, the Article 28 Guidelines also set out the measures which online platforms should take in order to comply with Article 28(1) DSA, i.e. to ensure a high level of privacy, safety and security for minors online. These include the following requirements:
1. Risk assessment obligations
Providers must conduct comprehensive risk assessments examining:
• The likelihood of minors accessing the service;
• The risks to minors’ privacy, safety and security based on the 5C typology of risks (Content, Conduct, Contact, Consumer, Cross-cutting risks);
• Current protective measures in place;
• Additional measures needed;
• The impact on children’s rights.
2. Age Assurance Requirements
The Article 28 Guidelines address age assurance obligations for compliance with Article 28. For further information, see the response to the next question below.
3. Default Settings Requirements
In order to comply with data protection by design obligations, minors’ accounts should be set to the highest level of privacy, safety and security by default. The following should be specific defaults as a minimum:
• Interactions with minors should only be enabled for contacts;
• No account should be able to download or take screenshots of contact, location or account information or any content uploaded or shared by minors;
• Minors’ content should only be visible to accepted contacts;
• No one should be able to see the minor’s activities such as “liking” content or “following” another user;
• Geolocation, microphone, photo access, camera, content synchronisation and tracking features should be turned off;
• Push notifications should be turned off by default and during sleeping hours;
• Features promoting excess use of the platform (e.g. counters, streaks) should be disabled;
• Any functionalities that increase a user’s agency over their interactions should be turned on (e.g. giving users an opportunity to think before they post);
• Recommendations of other accounts should be turned off; and
• Body image filters should be turned off.
Platforms should consider, depending on minors’ ages and evolving capacities and the outcome of the risk review, whether it is necessary to go beyond the minimum standard and implement more restrictive default settings. Child-friendly controls, adapted to the child’s age, should be put in place, with age-appropriate warnings deployed when a child attempts to change their default settings. These should clearly explain the impact of making any such changes.
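By way of illustration only, the minimum default settings listed above can be represented as a simple configuration structure. The sketch below (in Python) is not prescribed by the Article 28 Guidelines or any EU instrument; the class and field names are hypothetical and simply restate, in machine-readable form, the defaults described above.

from dataclasses import dataclass

# Illustrative only: a hypothetical representation of the minimum default
# settings for minors' accounts described in the Article 28 Guidelines.
# Field names are invented for this sketch; they are not defined in EU law.
@dataclass
class MinorDefaultSettings:
    interactions_limited_to_contacts: bool = True        # interactions with minors enabled for contacts only
    block_downloads_and_screenshots: bool = True          # no downloads/ screenshots of minors' details or content
    content_visible_to_accepted_contacts_only: bool = True
    hide_activity_signals: bool = True                    # "liking"/ "following" not visible to others
    geolocation_enabled: bool = False
    microphone_enabled: bool = False
    photo_access_enabled: bool = False
    camera_enabled: bool = False
    content_sync_and_tracking_enabled: bool = False
    push_notifications_enabled: bool = False              # also off during sleeping hours
    engagement_features_enabled: bool = False             # e.g. counters, streaks
    reflection_prompts_enabled: bool = True               # features increasing user agency turned on
    account_recommendations_enabled: bool = False
    body_image_filters_enabled: bool = False

def default_minor_settings() -> MinorDefaultSettings:
    """Return the highest-privacy defaults; stricter settings may still be
    required depending on the risk review and the child's age and evolving
    capacities, and changes should trigger age-appropriate warnings."""
    return MinorDefaultSettings()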
The Audiovisual Media Services Directive (Directive 2010/13/EU), as amended (“AVMSD”)
The AVMSD applies to providers of audiovisual media services who are established in an EU Member State and provide services that fall under its scope. The AVMSD outlines online safety obligations for audiovisual media services, focusing on protecting minors, prohibiting hate speech, and promoting accessibility. The specific obligations set out in the AVMSD are given further effect through national laws of Member States. Please see the specific EU country responses for further information.
Regulation (EU) 2021/784 addressing the dissemination of terrorist content online (“Terrorist Regulation”)
Where a provider of digital services falls within the definition of a “hosting service provider” under the Terrorist Regulation, it is subject to specific obligations relating to the removal of, and prevention of dissemination of, online terrorist content.
The Code of Conduct on Countering Illegal Hate Speech Online
While not legally binding, this code was developed by the European Commission in 2016 in conjunction with a number of multinational online platforms. Participation is voluntary, and the code is designed, amongst other things, to assist users in notifying illegal hate speech on their social media.
The Code of Practice on Disinformation
Participation in this code is also voluntary; it sets out industry-agreed self-regulatory standards to combat disinformation. While it was originally established in 2018, it now forms part of the broader regulatory framework alongside the DSA, having been converted into a Code of Conduct under the DSA with its commitments becoming auditable as of July 2025.
Are there specific age verification/ age assurance requirements concerning access to online/ digital services?
Yes.
The European Data Protection Board (EDPB) adopted Statement 1/2025 on Age Assurance on 11 February 2025. (In line with a recent report from the European Commission, the term “age assurance” is used in the Statement as ‘the umbrella term for the methods that are used to determine the age or age range of an individual to varying levels of confidence or certainty’ noting the three primary categories of age assurance mentioned in that report: age estimation, age verification and self-declaration). The Statement establishes a series of obligations for service providers and third parties implementing age assurance, which are structured around ten key principles to design GDPR-compliant age assurance:
- Full and effective enjoyment of rights and freedoms: Age assurance must uphold all fundamental rights, not only data protection rights. The best interests of the child must be the primary consideration to ensure respect for children’s rights including safety, information access and participation.
- Risk-based assessment and proportionality: Service providers must comply with the following key obligations in this regard: (1) adopt a risk-based approach; (2) demonstrate the necessity and proportionality of the chosen age assurance mechanism; (3) carry out data protection impact assessments when selecting that mechanism; (4) use the least intrusive mechanisms available that remain effective; and (5) assess the scope, extent and degree of interference of the selected mechanism with rights and freedoms.
- Prevention of data protection risks: Specific safeguards in this regard are listed as follows: (1) implementation of safeguards to prevent age assurance from introducing unnecessary data risks, such as profiling or tracking; (2) prohibition of its use for unrelated purposes or commercial targeting; (3) ensuring alternatives are available to avoid coercion; and (4) regular assessment of whether the selected methods and technology function in line with their purposes.
- Purpose limitation and data minimisation: Specific obligations in this regard are listed as follows: (1) process only age-related data strictly necessary for clear, lawful purposes under Article 5(1)(b) GDPR; (2) prohibit incompatible reuse or data combination; (3) apply technical (e.g. privacy-enhancing technologies) and organisational safeguards (e.g. policies, contracts) to prevent repurposing of data; and (4) collect only data that is necessary, adequate, and relevant for the processing purposes.
- Effectiveness of age assurance: Age assurance must demonstrably achieve adequate effectiveness, evaluated on three key aspects: (1) accessibility; (2) reliability; and (3) robustness. (The statement notes that robustness has little meaning in the context of the self-declaration of an age-related attribute, since the reliability of this method depends mostly on the goodwill of the user.)
- Lawfulness, fairness and transparency: In this regard, controllers must (1) ensure an applicable legal basis for the age assurance processing; (2) comply with transparency obligations under Articles 12 – 14 GDPR; (3) convey transparency information to children in a clear and comprehensible manner; and (4) be transparent about the impacts of different age assurance mechanisms from a data protection perspective.
- Automated decision-making: Automated decision-making in age assurance is not prohibited; however, when it does occur in age assurance, controllers must (1) provide suitable safeguarding mechanisms; (2) provide remedies and redress mechanisms; (3) be particularly cognisant of children; and (4) implement suitable measures such as viable alternatives, redress mechanisms and human intervention where applicable.
- Data protection by design and by default: Controllers must (1) apply appropriate technical and organisational measures under Article 25 GDPR; (2) use up-to-date technologies, especially those that support user-controlled data and secure local processing; (3) consider privacy-preserving tools like single-use credentials or zero-knowledge proofs for high-risk cases; and (4) regularly update systems to reflect advances in privacy-enhancing technologies.
- Security of age assurance: Both controllers and processors must ensure robust security by (1) applying risk-based security measures under Article 32 GDPR; (2) using trust modes, pseudonymisation and encryption to prevent breaches; (3) limiting data storage with short retention or no-log policies; (4) ensuring that systems can detect and respond to incidents quickly; and (5) building a resilient age assurance set up with flexible and independent components.
- Accountability: Service providers and third parties must establish comprehensive governance frameworks to ensure accountability for their age assurance approach and underlying data protection obligations.
In addition to the EDPB Statement, age assurance is also being addressed at EU level by the European Commission in the context of the protection of minors under Article 28 of the Digital Services Act (DSA). Article 28(1) of the DSA requires providers of online platforms that are accessible to minors to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors on their service.
In July 2025, the Commission issued its Article 28 Guidelines, which aim to support providers of online platforms “accessible to minors” in meeting their obligations under Article 28(1) DSA. The Article 28 Guidelines contain a prescriptive set of age assurance obligations for compliance with Article 28(1) DSA:
Core Assessment Requirements
Providers must conduct a proportionality and appropriateness assessment before deciding whether to put in place access restrictions supported by age assurance measures, in order to determine whether, and which, age assurance method(s) (i.e. age verification, age estimation or self-declaration) are most suitable to address the risks that their service might pose to minors.
Age Verification Requirements
Age verification is mandatory where:
- required by law (e.g. alcohol, tobacco or nicotine-related products, drugs, any type of pornography, gambling);
- contractual terms restrict access to 18+; or
- high risks to minors’ rights cannot be mitigated otherwise.
Government-issued IDs and the EU Digital Identity Wallet are recommended for secure verification. (For further information on the EU Digital Identity Wallet, see the response to this question.)
Age estimation methods can complement age verification: they can be used in addition to age verification, or as a temporary alternative in cases where age verification measures that respect the criteria of “effectiveness” (see below) are not yet readily available.
Age Estimation Requirements
Age estimation is appropriate where:
- terms and conditions require a minimum age below 18; or
- providers identify medium risks that cannot be mitigated by less intrusive measures.
Effectiveness Criteria
Providers must assess age assurance methods for:
- Accuracy – Must reliably determine age and be regularly reviewed;
- Reliability – Must function consistently with trustworthy data sources;
- Robustness – Must resist circumvention;
- Non-Intrusiveness – Limit data use to what is strictly necessary;
- Non-Discrimination – Must be accessible to all minors.
Where age assurance measures do not meet these criteria, they cannot be deemed to be appropriate and proportionate.
Prohibited & Required Practices
- The Guidelines make it clear that self-declaration is insufficient under Article 28(1).
- Platforms must make more than one age assurance method available in order to avoid exclusion of eligible users.
- Providers must provide a free, electronic redress mechanism for users to complain about incorrect age assessments.
Transparency
Platforms must clearly explain age assurance methods, including third-party involvement, in child-friendly, accessible formats.
Data Protection Compliance
Where personal data is processed, providers must follow the EDPB Statement on Age Assurance and apply GDPR Article 5(1)(c) data minimisation principles.
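As a purely illustrative closing aid, the selection criteria described above (age verification where required by law, where the terms restrict access to 18+, or where high risks cannot otherwise be mitigated; age estimation where the terms set a minimum age below 18 or where medium risks cannot be mitigated by less intrusive measures; self-declaration insufficient on its own) can be sketched in code. The function, enum and parameter names below are hypothetical; this is a simplified reading of the Article 28 Guidelines, not a compliance tool.

from enum import Enum

# Illustrative sketch only: a simplified reading of the age assurance selection
# criteria in the Article 28 Guidelines. Names and inputs are hypothetical; a
# real assessment is a documented legal and risk exercise, not a code path.
class ResidualRisk(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3

class AgeAssuranceMethod(Enum):
    AGE_VERIFICATION = "age verification"
    AGE_ESTIMATION = "age estimation"
    NONE_MANDATED = "no specific method mandated"

def select_age_assurance_method(
    restricted_by_law: bool,        # e.g. pornography, gambling, alcohol, tobacco or nicotine products
    terms_require_18_plus: bool,    # contractual terms restrict access to 18+
    minimum_age_below_18: bool,     # terms set a minimum age below 18
    residual_risk: ResidualRisk,    # risk to minors remaining after less intrusive mitigations
) -> AgeAssuranceMethod:
    # Age verification is mandatory where required by law, where the terms
    # restrict access to 18+, or where high risks cannot otherwise be mitigated.
    if restricted_by_law or terms_require_18_plus or residual_risk is ResidualRisk.HIGH:
        return AgeAssuranceMethod.AGE_VERIFICATION
    # Age estimation is appropriate for a sub-18 minimum age or unmitigated medium risks.
    if minimum_age_below_18 or residual_risk is ResidualRisk.MEDIUM:
        return AgeAssuranceMethod.AGE_ESTIMATION
    # Otherwise no specific method is mandated by the criteria summarised above
    # (an assumption of this sketch); self-declaration alone is in any event
    # insufficient under Article 28(1).
    return AgeAssuranceMethod.NONE_MANDATED

In all cases, more than one age assurance method should be made available and a free electronic redress mechanism provided, as noted above.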
Are there requirements to implement parental controls and/or facilitate parental involvement in children’s use of digital services?
Yes.
Article 28b(1) of the Audiovisual Media Services Directive (AVMSD) places an obligation on Member States to ensure appropriate measures are adopted by video-sharing platforms to protect minors from harmful content. These include measures consisting of parental control systems (Article 28b(3)(h)).
In addition, parental controls may be an aspect of measures by which a provider complies with the following obligations in EU law:
- GDPR: As an additional safety feature to comply with the higher standard of protection for the processing of children’s personal data under Articles 24 and 25.
- Digital Services Act: Article 28(1) requires providers of online platforms to “put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service” – the implementation of parental controls can serve as a measure for compliance with this obligation.
- Digital Services Act: Article 35(1) requires Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) to put in place “reasonable, proportionate and effective mitigation measures”, tailored to the specific systemic risks identified. Such measures may include the implementation of parental controls, as contemplated by Article 35(1)(j), which refers to VLOPs and VLOSEs taking targeted measures to protect the rights of the child, including age verification, parental control tools, and tools aimed at helping minors signal abuse or obtain support, as appropriate.
Has there been any regulatory enforcement action concerning online/ digital safety? In your answer, please include information on the volume, nature and severity of sanctions.
Yes.
The European Commission has opened a number of investigations related to the protection of minors under the Digital Services Act (DSA), which remain ongoing. These focus on a range of compliance issues, including: algorithmic features that may foster addictive behaviour or “rabbit hole” effects; age verification tools; compliance with the requirement to ensure a high level of privacy, safety and security for minors, particularly with regard to default privacy settings; and enforcement of terms of service.
Are there any existing requirements relating to children and AI in your jurisdiction?
Yes.
The EU Artificial Intelligence Act (Regulation 2024/1689) (AIA) contains several provisions, applicable since February 2025, which indirectly address the protection of children (amongst other persons) in relation to prohibited AI practices:
- AI systems that exploit a person’s or group of persons’ vulnerability due to (amongst other things) age with the objective/effect of materially distorting their behaviour in a manner that causes (or is reasonably likely to cause) significant harm are strictly forbidden (Article 5(1)(b) AIA).
- Article 5(1)(f) of the AIA prohibits AI systems for emotion recognition in educational institutions (and workplaces), except when used for medical or safety purposes.
The AIA also contains specific provisions addressing children’s protection and rights in the context of high-risk AI systems:
- In relation to any AI systems classified as high-risk, providers of such systems must consider whether the AI system is likely to have an adverse impact on persons under the age of 18 as part of its implementation of a risk management system (Article 9(9) AIA). Similarly, real-world testing of high-risk AI systems should appropriately protect any testing subjects belonging to vulnerable groups, e.g. due to their age (Article 60(4)(g)).
- Annex III(3), in conjunction with Article 6(2) of the AIA, classifies certain AI systems used in education as high-risk. This includes AI systems used to determine school admissions, assess learning outcomes, guide students’ educational paths, or monitor and detect prohibited behaviour during exams.
More broadly, the AIA requires market surveillance authorities to pay particular attention to the risks an AI system poses to vulnerable groups (which would include children) when evaluating an AI system which is considered to present a risk (per Article 79(1) AIA).
Are there any upcoming requirements relating to children and AI in your jurisdiction?
Yes.
The EU Artificial Intelligence Act (Regulation 2024/1689) applies on a staggered basis – while the ban on prohibited AI practices has been in force since 2 February 2025, its remaining substantive provisions begin to apply from 2 August 2025.
Has there been any other regulatory enforcement activity to date relevant to children’s use of digital services? In your answer, please include information on the volume, nature and severity of sanctions.
No.
N/A.
Are there any other existing or upcoming requirements relevant to children’s use of digital services?
Yes.
1. EU Digital Identity Wallet – Harmonised Age Verification
The European Commission is currently working towards an EU-harmonised approach to age verification, collaborating with Member States to develop a unified age verification solution. This solution is built on the European Digital Identity Wallet framework and supports compliance with Article 28 of the Digital Services Act.
In July 2025, the Commission released the first version of an EU white-label age verification blueprint, to be used as a basis for a user-friendly and privacy-preserving age verification method across Member States. The release of this blueprint launches a trial phase to test and improve the age verification solution, carried out in collaboration with Member States, online platforms and users. Denmark, France, Greece, Italy and Spain will be the first to test the solution, either by adding it to their national digital wallets or by launching their own age verification apps. Market players can also voluntarily use and build on this software solution.
The Commission plans to scale the pilot and related support to other Member States, in coordination with national authorities and Digital Services Coordinators. At a later stage, all Member States will receive tailored strategies to integrate the solution into their digital wallets or to publish localised apps for end users.
2. European Data Protection Board (EDPB) Guidelines
- The EDPB’s guidance on children’s data protection issues is expected to be published by the end of 2025, per the EDPB Work Programme 2024 – 2025.
- Per its Strategy 2024 – 2027, the EDPB also intends to provide guidance on the interplay between the application of the GDPR and other legal acts, particularly the EU Artificial Intelligence Act and the Digital Services Act.
3. Proposal for a Regulation (EU) laying down rules to prevent and combat child sexual abuse
The European Commission has published this proposal for legislation which, if passed by the EU, would require hosting services and certain other service providers to search for, detect, report and remove/ disable access to child sexual abuse material and to assess and minimise the risk that their services are used for online child sexual abuse. This draft legislation is currently progressing through the EU law-making process.
4. Proposal for a Digital Fairness Act
The European Commission is working on a Digital Fairness Act proposal, also aimed at addressing children’s use of digital services. In this regard, the EU Commissioner for Democracy, Justice, the Rule of Law and Consumer Protection (the EU Commissioner) has specifically indicated that this would ensure that consumers are not exploited for commercial purposes, that social media influencers are not misleading consumers and that children are sufficiently protected online. The Digital Fairness Act is expected to address dark patterns, marketing by social media influencers, addictive design of digital products and unfair personalisation practices. A public consultation is expected to take place, with a legislative proposal for a Digital Fairness Act then expected in mid-2026. In broader terms, the EU Commissioner has stated that the protection of children from manipulative practices when playing online games will remain a priority in EU consumer policy going forward.
5. Digital Minority Age Discussions
Several Member States – including France, Spain and Greece – have formally called on the European Commission to introduce an EU-wide rule setting a minimum digital age of access to social media platforms, sometimes referred to as a “digital age of majority.”
In a joint statement issued in May 2025, these countries requested legislative measures to better protect minors online by raising the minimum age for social media use (e.g. to 15 years) and reinforcing age verification obligations for platforms. The proposal is motivated by increasing concern about children's mental health, exposure to harmful content, and manipulative online design practices.
This initiative reflects growing political momentum across EU Member States to harmonise digital access thresholds and ongoing work at the Commission level on the Digital Fairness Act and age assurance under the DSA. Although there is no binding legislative proposal yet, the issue is expected to be discussed further in 2025–2026 as part of broader digital child protection reforms.
6. Influencer Marketing
The EU is also considering harmonised rules on influencer marketing: the upcoming Digital Fairness Act (expected 2026) will likely regulate #ad disclosures, dark patterns, and influencer-specific practices. The European Council has called for such EU-wide influencer legislation, following national-level moves in countries like France, Italy and Belgium. If introduced, it is likely that this would also affect advertising towards minors.