Which topic would you like to know more about?
What is the legal age of adulthood/ majority in your jurisdiction? Are all persons below this age considered a child/ minor?
Article 240 of the Spanish Civil Code provides that the age of majority is 18 years. Persons under the age of 18 are therefore considered minors for the purposes of Spanish law.
Has the UNCRC been directly incorporated into national law in your jurisdiction?
Yes.
Organic Law 8/2021, of 4 June 2021, on the comprehensive protection of children and adolescents against violence partly incorporates the UNCRC: it states that the law is consistent with the national and international legal framework and complies with Article 19 of the United Nations Convention on the Rights of the Child, which obliges States Parties to protect children and adolescents against all forms of abuse.
Is there an ombudsperson/ commissioner for children in your jurisdiction?
Yes.
Spain has an Ombudsman (‘Defensor del Pueblo’). The Ombudsman and the corresponding agencies in the autonomous communities have competence in matters affecting children and youth, and the national agency can bring cases before the courts. In Andalusia, the regional ombudsman agency (the Defensor del Pueblo Andaluz) has a deputy ombudsman (Defensor del Menor) for children and young people, as does its equivalent in Catalonia (the Síndic de Greuges).
If there is an ombudsperson/ commissioner for children in your jurisdiction, do they have any responsibility for upholding children’s rights in the digital world or does the relevant regulator have sole competence?
No.
The main function of the Ombudsman ‘Defensor del Pueblo’ is to deal with complaints raised by children or adults about incorrect or irregular actions of administrations, institutions and public authorities in which the rights of children or adolescents are affected.
If, after investigating a complaint, the Ombudsman concludes that the action was improper, it issues a resolution recommending that the responsible administration modify its conduct or adopt measures to prevent such actions from recurring.
Digital rights fall under the Spanish Data Protection Agency (AEPD) and the National Commission on Markets and Competition (CNMC) for audiovisual content. In addition, the Expert Committee for the Creation of Safe Digital Environments for Children and Youth, created by the Ministry of Youth and Children in 2024, analyses the risks associated with the use of digital technologies by minors and proposes measures to ensure their safety in digital environments.
Is there any standalone requirement to collect the consent of one or both parents when processing a child’s personal data (irrespective of any other obligations, e.g. the requirement to have a legal basis for processing)?
No.
The requirement to collect parental consent under the GDPR only applies where the service provider is relying on consent as the legal basis under the GDPR for processing a child’s personal data (i.e. if another legal basis is relied on for processing, there is no requirement to collect parental consent for processing the child’s data). Please see more information on the GDPR-level requirements here.
As per article 7.2 of the Spanish Data Protection Act, Organic Law 3/2018, of 5 December, on the Protection of Personal Data and the guarantee of digital rights, the processing of data of minors under 14 years of age, when based on consent, shall only be lawful if the consent of the holder of parental authority or guardianship is given, to the extent determined by the holders of parental authority or guardianship.
Entirely separate from the requirement to collect parental consent under Article 8(1) GDPR, a digital service provider (controller) may decide to seek parental permission for a child to access different settings, features or functionalities as part of the measures it implements under Articles 24 and 25 of the GDPR to ensure a high level of protection for child users.
At what age can children legally consent to the processing of their own personal data, such that parental permission/ consent is not required?
As per article 7 of the Spanish Data Protection Act, Organic Law 3/2018, the processing of personal data of a minor may only be based on his or her consent if he or she is over 14 years of age.
However, it should be noted that there is currently a Preliminary Draft Organic Law for the Protection of Minors in Digital Environments (Draft Organic Law), which aims to ensure that children can access digital environments in a safe and positive way, protecting them from accessing inappropriate content that may affect their development. When the Draft Organic Law comes into force, the age of consent will be raised to the age of 16.
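For digital service providers, the current rule and the proposed change can be illustrated with a minimal sketch. This is not drawn from the legislation itself: the function name and parameterised threshold are hypothetical, and apply only where consent is the legal basis for processing.

```python
# Hypothetical sketch of the Spanish age-of-digital-consent rule.
# Under Organic Law 3/2018 the threshold is 14; the Draft Organic Law
# would raise it to 16 once in force. Names are illustrative only.

CONSENT_AGE = 14  # would become 16 under the Draft Organic Law


def requires_parental_consent(user_age: int, consent_age: int = CONSENT_AGE) -> bool:
    """True if consent-based processing needs the consent of the
    holder of parental authority or guardianship."""
    return user_age < consent_age


print(requires_parental_consent(13))                  # True: parental consent needed
print(requires_parental_consent(15))                  # False under the current threshold
print(requires_parental_consent(15, consent_age=16))  # True if the threshold rises to 16
```

The same 15-year-old who can consent today would require parental consent once the Draft Organic Law enters into force, which is why the threshold is kept configurable in this sketch.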
Are there specific requirements in relation to collection and/or verification of parental consent/ permission concerning the processing of a child’s personal data?
Yes.
Article 13.4 of Royal Decree 1720/2007, of 21 December, approving the Regulation implementing the former 1999 Spanish Data Protection Act (still applicable insofar as its provisions complement and do not contradict the current Spanish Data Protection Act, Organic Law 3/2018), states that the data controller is responsible for establishing procedures that guarantee effective verification of the minor's age and of the authenticity of any consent given by the parents, guardians or legal representatives.
These procedures are not specified by law. Existing recommendations from the Spanish Data Protection Authority, Agencia Española de Protección de Datos (AEPD), include:
- Implement granularity, establishing mechanisms to enable and disable functionalities based on the needs of the user. Personal data corresponding to a deactivated functionality should not be processed.
- Mechanisms must be established to avoid requesting access permissions to system resources that are unnecessary for the functionalities being used. For example, if a service does not need the child's GPS location functionality, the application should not access geolocation data continuously or in the background.
- Data subjects should be informed of the processing of personal data that may be introduced by third-party libraries (usage statistics, error reporting, user authentication or advertising) so that valid consent can be obtained before such processing is carried out.
- Backends hosted on cloud servers must be governed by a contract or legal link that meets the requirements set out in the GDPR. The data controller must show diligence in the selection of cloud service providers and must comply with the requirements set out in the GDPR in the processing contract, paying special attention to the privacy measures from the design, default, security, confidentiality commitment and security breach management procedure.
- Taking into account the volume, categories and profile of the individuals on whom the processing is carried out, measures must be maximised by applying the highest security standards.
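The granularity recommendation above could translate into application logic along the following lines. This is an illustrative sketch only: the class and feature names are hypothetical and do not come from the AEPD guidance.

```python
# Illustrative sketch of the AEPD's granularity recommendation:
# personal data tied to a deactivated functionality is not processed,
# and optional functionalities start disabled. Feature names are
# hypothetical.

class FeatureSettings:
    def __init__(self):
        # All optional functionalities start disabled by default.
        self.enabled = {"geolocation": False, "usage_stats": False}

    def enable(self, feature: str) -> None:
        self.enabled[feature] = True

    def may_process(self, feature: str) -> bool:
        # Data corresponding to a deactivated functionality
        # must not be processed.
        return self.enabled.get(feature, False)


settings = FeatureSettings()
assert not settings.may_process("geolocation")  # disabled: no GPS data processed
settings.enable("geolocation")                  # user opts in to the map feature
assert settings.may_process("geolocation")
assert not settings.may_process("usage_stats")  # still off: no analytics data
```

The point of the pattern is that every data flow is gated on an explicit, per-functionality switch, rather than on a single blanket consent.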
See more information on the GDPR-level requirements here.
Are there any particular information or transparency requirements concerning the processing of children’s personal data?
Yes.
The controller must inform the data subject in a concise, transparent, intelligible and easily accessible manner, using clear and simple language, especially if the information is specifically addressed to a child. Please see more information on the GDPR-level requirements here.
Can children directly exercise their rights in relation to their personal data without the involvement of their parents?
Yes.
As per guidance from the Spanish Data Protection Authority, Agencia Española de Protección de Datos, the rights of minors under 14 years of age must always be exercised by the holder of parental authority or by their guardians. Minors aged 14 and over may exercise these rights themselves.
Can children make complaints on their own behalf directly to your national data protection/ privacy regulator(s)?
Yes.
Persons over 14 years of age are entitled to exercise their GDPR rights.
Note that there is currently a Preliminary Draft Organic Law for the Protection of Minors in Digital Environments (Draft Organic Law) in train. When the Draft Organic Law comes into force, the age of consent will be raised to the age of 16.
Are there any particular requirements/ prohibitions related to:
a. processing specific types of children’s personal data;
b. carrying out specific processing activities involving children’s personal data; and/ or
c. using children’s personal data for specific purposes.
Yes.
The Spanish Data Protection Authority, Agencia Española de Protección de Datos (AEPD) recommends the following measures when processing children’s personal data:
- Implement granularity, establishing mechanisms to enable and disable functionalities based on the needs of the user. Personal data corresponding to a deactivated functionality should not be processed.
- Mechanisms must be established to avoid requesting access permissions to system resources that are unnecessary for the functionalities being used. For example, if a service does not need the child's GPS location functionality, the application should not access geolocation data continuously or in the background.
- Data subjects should be informed of the processing of personal data that may be introduced by third-party libraries (usage statistics, error reporting, user authentication or advertising) so that valid consent can be obtained before such processing is carried out.
- Backends hosted on cloud servers must be governed by a contract or legal link that meets the requirements set out in the GDPR. The data controller must show diligence in the selection of cloud service providers and must comply with the requirements set out in the GDPR in the processing contract, paying special attention to the privacy measures from the design, default, security, confidentiality commitment and security breach management procedure.
- Taking into account the volume, categories and profile of the individuals on whom the processing is carried out, measures must be maximised by applying the highest security standards.
Additionally, see information on the EU-level requirements here.
Has there been any enforcement action by your national data protection/ privacy regulator(s) concerning the processing of children’s personal data? In your answer, please include information on the volume, nature and severity of sanctions.
Yes.
There have been several instances of Spanish regulatory authorities taking action against companies concerning the processing of children's personal data. In particular, the Spanish Data Protection Authority, Agencia Española de Protección de Datos (AEPD), has taken the following actions:
- The AEPD fined a sports club €1,000 (reduced to €600 for voluntary payment) for unlawfully pressuring a parent to consent to the collection of their child's images as a condition for participating in sports activities. The AEPD ruled that this infringed Article 7 of the GDPR, as consent must be freely given and cannot be made a condition of service. The club was also ordered to implement measures ensuring that minors' images are not processed without valid parental consent.
- The AEPD fined a cultural association €3,000 for sharing a minor's image in three messaging groups with around 400 members without parental consent. Although the child's face was partially covered with an emoticon, the AEPD found an infringement of Articles 6(1)(a) and 8(1) of the GDPR, citing negligence rather than intent, and ruled that the emoticon did not ensure full anonymisation.
- The AEPD fined a company €10,000 for unlawfully processing minors' personal data. The case originated from a complaint by a mother whose four-year-old daughter attended a birthday party at the company's venue. Without parental consent, images of children at the event were taken and published in a social media post. Despite the mother's request for removal, the images remained online for 24 hours. The company argued that the removal request was not received through the designated channel, but acknowledged the incident and implemented stricter consent protocols. The AEPD determined that the company failed to comply with Article 6(1) of the GDPR, which requires a legal basis for processing personal data. Given the nature of the breach, the agency classified the infraction as severe. The company did not contest the findings within the given timeframe, so the AEPD imposed the fine and mandated compliance measures. The decision is final in administrative proceedings, though the company may appeal before the National Court.
- The AEPD fined a private individual €10,000 for unlawfully recording and sharing a video of a 13-year-old minor on two social media platforms without parental consent. The video, which went viral, included personal details about the minor. The AEPD found violations of the GDPR, including processing special category data (€3,000), lack of lawful basis (€3,000), failure to minimise data (€3,000) and lack of transparency (€1,000). Additionally, the infringer is prohibited from processing data of minors under 14 without parental consent and must adopt compliance measures within one month. The decision is final unless appealed.
- The AEPD issued a warning to a minor under 14 years old for taking and sharing a photo of four undressed minors in a sports club's locker room without consent. The photo was taken and sent via a social media channel, eventually appearing on another user's profile with a caption. The investigation confirmed the unauthorised capture and dissemination of the image, infringing Article 6 of the GDPR. Given the minor's age and the need for educational awareness rather than punishment, the AEPD opted for a formal warning instead of a fine.
- A Spanish social media company (one of the major social media players in Spain in the early 2000s), after meeting with the AEPD in 2009, presented to the AEPD the measures adopted to remove children under 14 from its social network. The company implemented a process of purging profiles of minors under 14: where a profile appeared to belong to a user under that age, the company sent a request to provide a photocopy of an ID card or passport within 92 hours. If no reply was received, the company deleted the user's profile. The AEPD valued this initiative positively and urged the company to extend the verification and checking system to all new registrations suspected of being under 14 years of age.
- A website owner was fined €30,601 for, amongst other issues, not adequately preventing minors from signing up to its social page. Allegedly, checks were performed almost daily to authorise all applications for the registration or modification of photographs, rejecting all requests including photographs of children under the age of 15. The AEPD found these measures insufficient, as a number of profiles under 14 were detected upon inspection.
- A telecommunications operator was fined €50,000 by the AEPD in 2011 for, amongst other issues, not implementing any protocol for the verification of the identity of potential customers, as it only required an ID number.
- A pastries manufacturer was fined €40,901 for, amongst other issues, obtaining minors' consent to participate in a promotion without their parents' authorisation. To obtain prizes in the promotion, it was necessary to register personal data on a website. The data requested were the age and e-mail address which, for minors under 14 years of age, had to be the legal guardian's e-mail address. An e-mail was sent to this address with the options of (a) data verification or (b) authorisation to register the data of a minor. If the data verification option was chosen, the system allowed the registration of the data without parental authorisation, despite the under-14 option having been marked.
Are there specific rules concerning electronic direct marketing to children?
No.
However, as per Law 34/2002 of 11 July 2002 on information society services and electronic commerce, the following requirements apply to all users:
Advertising must be presented as such, in such a way that it cannot be confused with other content, and the advertiser must be clearly identified. If a commercial communication is to be sent to a user (for example, a newsletter with commercial news), the user must have expressly requested or authorised it beforehand.
In any event, the provider must offer the recipient the possibility of objecting to the processing of his or her data for promotional purposes, both at the time of data collection and in each of the commercial communications sent to the user.
The Law also obliges service providers to provide simple and free-of-charge procedures for recipients to revoke the consent they have given, as well as to make information on these procedures available electronically.
These rules also apply to the sending of advertising messages by other equivalent means of individual electronic communication, such as mobile telephone messaging services.
Are there specific rules concerning the use of adtech tracking technologies, profiling and/or online targeted advertising to children?
Yes.
Article 90 of the General Law on Audiovisual Communication (LGCA) prohibits video-sharing platform providers from processing minors’ personal data for commercial purposes, including direct marketing, profiling, or personalised behavioural advertising. See further information here.
See also information on EU-level requirements here.
Are there specific rules concerning online contextual advertising to children?
Yes.
Law 34/1988 of 11 November 1988, General Law on Advertising
Advertising aimed at minors is prohibited where it incites them to purchase a good or service by exploiting their inexperience or credulity, or where children appear persuading parents or guardians to make a purchase. Children may not, without good reason, be presented in dangerous situations. Advertising must not mislead as to the characteristics of the products, their safety, or the capacity and skills necessary for a child to use them without causing harm to themselves or to third parties.
Law 13/2011, of 27 May, on the regulation of gaming
If the commercial advertisements involve gambling content, this law contains additional specific rules on said communications. Entities that disseminate commercial communications of gambling operators on social networks with a user profile may only do so on those social networks that have (i) instruments to prevent these communications from being addressed to minors, (ii) mechanisms for blocking or hiding pop-up advertisements by their users, and (iii) tools to segment the audience to which these commercial communications are addressed.
Law 13/2022 of 7 July on General Audiovisual Communication
Audiovisual commercial communications shall not cause physical, mental or moral harm to minors or engage in any of the following conduct: (i) directly inciting minors to purchase or hire products or services by taking advantage of their inexperience or credulity; (ii) directly encouraging minors to persuade their parents or third parties to purchase the advertised goods or services; (iii) exploiting the special relationship of trust that minors place in their parents, teachers or other persons, such as professionals on children's programmes or fictional characters; (iv) showing minors in dangerous situations without justified reason; (v) inciting behaviour that favours discrimination between men and women; (vi) inciting violent behaviour towards minors, or by minors towards themselves or others, or promoting stereotypes based on sex, race or ethnic origin, nationality, religion or belief, disability, age or sexual orientation; and (vii) promoting the cult of the body and the rejection of self-image through audiovisual commercial communications for slimming products, surgical interventions or aesthetic treatments that appeal to social rejection on the grounds of physical appearance, or to success based on weight or aesthetic factors.
Audiovisual commercial communications about products especially aimed at minors, such as toys, shall not mislead as to their characteristics, their safety, or the capacity and ability necessary for minors to use them without causing harm to themselves or to third parties, nor shall they reproduce sexist stereotypes.
Audiovisual commercial communication for cigarettes and other tobacco products, including electronic cigarettes and their refill containers, and herbal products for smoking, as well as the companies that produce them, are prohibited, as are audiovisual commercial communication of alcoholic beverages specifically aimed at minors, or depicting minors consuming such beverages.
Additionally, it must be taken into account that personal data of children collected or otherwise generated by video-sharing platform service providers may not be processed for commercial purposes, such as direct marketing, profiling or personalised behavioural advertising.
Law 13/2011, of 27 May, on the regulation of gaming
Commercial communications are considered contrary to the principle of the protection of minors and are prohibited if they:
- Directly or indirectly incite minors to gamble, by themselves or through third parties.
- Are, by virtue of their content or design, rationally and objectively likely to attract the attention or particular interest of minors, including brand mascots or ringtones specifically or principally aimed at minors.
- Exploit the special relationship of trust that minors have with their parents, teachers, or other persons.
- Use the image, voice or other characteristics inherent to minors or persons characterised as minors.
- Present gambling as a sign of maturity or as an indication of the passage to adulthood.
- Are disseminated or used in media, programmes or supports, whatever they may be, intended specifically or principally for minors.
- Are inserted in applications, web pages or digital content specifically or principally aimed at minors, or in conjunction with links to web pages aimed at minors.
- Are disseminated or displayed inside or outside cinemas or other spaces intended for the public, when these spaces are used for screenings of cinematographic works or theatrical or musical performances to which minors may have access.
- Are broadcast or shown inside or outside stadiums, halls or sports venues, when events or competitions whose participation is restricted exclusively to minors are held there.
- Concern betting on events in which participation is exclusively restricted to minors.
Entities disseminating audiovisual commercial communications of gambling operators in video-sharing platform services may only do so where the providers of such services have (i) mechanisms in place to prevent such communications from being addressed to minors; (ii) mechanisms in place to block or hide pop-up advertisements from their users; and (iii) time slots restriction capabilities (as gambling advertisements are restricted to 1-5am).
The accounts or channels from which programmes or videos available through a video-sharing platform are offered may only provide audiovisual commercial communications from gambling operators when their main activity consists of offering information or content on gambling activities. They also have to use all the mechanisms available on the video-sharing platform to prevent minors from accessing their account or channel and to disseminate safe gambling messages on that account or channel on a regular basis.
Additionally, see information on the EU-level requirements here.
Has there been any regulatory enforcement action concerning advertising to children? In your answer, please include information on the volume, nature and severity of sanctions.
Yes.
A leading Spanish media group operating in television, radio and digital media was fined €103,300 in early 2025 by the National Commission on Markets and Competition (CNMC) for broadcasting advertisements for condoms and sex toys during a children's programme. The CNMC detected six commercials for a sex products brand aired between 7:00 and 10:00 a.m., a time slot when minors are likely to be watching. The initial sanction was set at €172,248, but the media group received a 40% reduction after acknowledging responsibility and paying the fine in advance. Despite this, the company retains the right to challenge the decision through a contentious-administrative appeal before the National Court within two months of the date of notification.
At what age does a person acquire contractual capacity to enter into an agreement to use digital services?
Full capacity to act is obtained at eighteen years of age (articles 315 and 322 of the Spanish Civil Code).
However, Law 26/2015, of July 28, 2015, amending the system for the protection of children and adolescents, incorporates an exception to unemancipated minors' inability to give consent in respect of ‘goods and services of ordinary life appropriate to their age in accordance with social customs’.
Do consumer protection rules apply to children?
Yes.
Royal Legislative Decree 1/2007, of November 16, 2007, approving the revised text of the General Law for the Defense of Consumers and Users is fully applicable to children.
Additionally, see information on the EU-level requirements here.
Are there any consumer protection rules which are specific to children only?
No.
There are no consumer protection rules which are specific to children only, although Royal Decree-Law 1/2021, of January 19, on the protection of consumers and users in situations of social and economic vulnerability, includes children among the vulnerable groups due to their greater sensitivity to advertising and aggressive commercial practices, by reinforcing the clarity, correctness and comprehensiveness of the product information.
Additionally, the Draft Organic Law regulates access to loot boxes in video games and imposes stricter obligations on digital service providers.
Additionally, see information on the EU-level requirements here.
Has there been any regulatory enforcement action concerning consumer protection requirements and children’s use of digital services?
No.
Regulatory enforcement action concerning children's use of digital services has mainly concerned the following:
- A Spanish social media company (one of the major social media players in Spain in the early 2000s), after meeting with the Spanish Data Protection Authority, Agencia Española de Protección de Datos (AEPD), in 2009, presented to the AEPD the measures adopted to remove children under 14 from its social network. The company implemented a process of purging profiles of minors under 14: where a profile appeared to belong to a user under that age, the company sent a request to provide a photocopy of an ID card or passport within 92 hours. If no reply was received, the company deleted the user's profile. The AEPD valued this initiative positively and urged the company to extend the verification and checking system to all new registrations suspected of being under 14 years of age.
- However, a website owner was fined €30,601 for, amongst other issues, not adequately preventing minors from signing up to its social page. Allegedly, checks were performed almost daily to authorise all applications for the registration or modification of photographs, rejecting all requests including photographs of children under the age of 15. The AEPD found these measures insufficient, as a number of profiles under 14 were detected upon inspection.
- A telecommunications operator was fined €50,000 by the AEPD in 2011 for, amongst other issues, not implementing any protocol for the verification of the identity of potential customers, as it only required an ID number.
Are there any age-related restrictions on when children can legally access online/ digital services?
Yes.
Full capacity to act is obtained at eighteen years of age (articles 315 and 322 of the Spanish Civil Code).
However, Law 26/2015, of July 28, 2015, amending the system for the protection of children and adolescents, incorporates an exception to unemancipated minors' inability to give consent in respect of ‘goods and services of ordinary life appropriate to their age in accordance with social customs’.
Additionally, the Preliminary Draft Organic Law for the Protection of Minors in Digital Environments will introduce stricter age verification systems requirements for platforms with potentially harmful content, once enacted.
Are there any specific requirements relating to online/ digital safety for children?
Yes.
Organic Law 8/2021, of 4 June, on the comprehensive protection of children and adolescents against violence (Law 8/2021), includes provisions related to online content that may harm minors. This law establishes a specific duty to report the existence of online content that constitutes a form of violence or abuse against children and adolescents, whether or not it qualifies as a criminal offence.
Given the particular risks associated with the internet and social media, the law also imposes obligations on public authorities to promote safe internet use, raise awareness of digital risks, and encourage the exchange of information, knowledge, experiences, and best practices related to child protection online. Under this legal framework, any individual or legal entity that becomes aware of online content constituting violence against a minor is legally required to report it to the relevant authorities. If the content may constitute a criminal offence, it must also be reported to law enforcement agencies, the Public Prosecutor’s Office, or the judiciary.
Law 8/2021 specifically addresses cases of cyberbullying, grooming, gender-based cyber violence, and sexting, as well as the access and consumption of pornography by minors.
Furthermore, the law mandates that the relevant regulatory agency ensures the availability of a secure and accessible reporting channel for online content that seriously infringes the right to personal data protection.
Additionally, see information on the EU-level requirements here.
Are there specific age verification/ age assurance requirements concerning access to online/ digital services?
Yes.
There is no specific legislation, but the Spanish Data Protection Authority, Agencia Española de Protección de Datos (AEPD), has published guidance, its ‘Decalogue of principles: Age verification and protection of minors from inappropriate content’, which contains several principles:
- The system for protecting minors from inappropriate content must ensure that minors cannot be identified, tracked or traced via the Internet.
- Age verification should be oriented towards persons of appropriate age proving their status as "persons authorized to access", and should not allow the accreditation of the status of "minors".
- Accreditation for access to inappropriate content should be anonymous to Internet service providers and third parties.
- The obligation to provide proof of "authorized person" status shall be limited to inappropriate content only.
- Age verification must be performed in a reliable manner, and the age category accredited should be that of a "person authorised to access".
- The system must ensure that individuals cannot be profiled based on their navigation.
- The system must guarantee that a person's activity is not linked between different services.
- The system must guarantee the exercise of parental authority by parents.
- Any system for the protection of minors from inappropriate content must guarantee the fundamental rights of all persons in their access to the Internet.
- Any system for the protection of minors from inappropriate content must have a defined governance framework.
According to a 2020 report by the Spanish Data Protection Authority, Agencia Española de Protección de Datos (AEPD), there is no evidence that editors and content publishers in Spain use any effective method to verify that users are of legal age, beyond asking users to confirm their age (via web forms and similar mechanisms). The AEPD’s deliberate use of the term “effective” suggests that merely asking the user to enter a date of birth in a form may not amount to “reasonable efforts” to verify the user’s age.
The AEPD however considers that there are solutions on the market that “would be a major step forward in terms of proactive responsibility on the part of data controllers”, such as AgeID, AgeChecked, AgePass, and Yoti, among others. These are third-party services that verify the identity and/or age of the user by means of facial analysis technology or a document such as a passport or driving licence. Once the age of majority is verified, the personal information is encrypted or destroyed, so that the only information retained and shared is whether or not the user is of legal age.
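To make the “verify once, retain only a boolean” pattern concrete, the sketch below illustrates one possible implementation of such a flow. This is illustrative only and does not reflect the actual API of AgeID, Yoti, or any other vendor: a hypothetical third-party verifier checks the user’s age, discards all personal data, and hands back an opaque signed token; the content provider later asks the verifier only whether the token is valid, never learning the user’s identity or exact age.

```python
# Illustrative sketch only (not any vendor's actual API): a minimal
# "verify once, retain only a boolean" age-attestation flow.
import hashlib
import hmac
import secrets
from typing import Optional

VERIFIER_KEY = secrets.token_bytes(32)  # secret held only by the verifier


def issue_attestation(birth_year: int, current_year: int) -> Optional[bytes]:
    """Verifier side: check the age, then discard the inputs.

    Returns an opaque token proving "authorized to access" status, or
    None for minors -- no "is a minor" credential is ever issued.
    (Simplified: the age check ignores month and day of birth.)
    """
    if current_year - birth_year < 18:
        return None
    nonce = secrets.token_bytes(16)  # fresh per token, so tokens are unlinkable
    tag = hmac.new(VERIFIER_KEY, b"authorised" + nonce, hashlib.sha256).digest()
    return nonce + tag  # carries no personal data, only the signed flag


def check_attestation(token: bytes) -> bool:
    """Content-provider side: ask the verifier whether the token is valid."""
    nonce, tag = token[:16], token[16:]
    expected = hmac.new(VERIFIER_KEY, b"authorised" + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)
```

In this sketch the fresh nonce per token keeps tokens unlinkable across services, and the verifier never issues a “minor” credential, mirroring the AEPD principles that accreditation be anonymous to providers and that minors’ status never be accredited.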
In addition, once passed, the Draft Organic Law will introduce new age verification requirements, rules on AI-generated content, and protective measures for gaming to ensure the safety of minors online.
Additionally, see information on the EU-level requirements here.
Are there requirements to implement parental controls and/or facilitate parental involvement in children’s use of digital services?
Yes.
The guidance of the Spanish Data Protection Authority (Agencia Española de Protección de Datos), the ‘Decalogue of principles: Age verification and protection of minors from inappropriate content’ [see details here], includes the principle that systems for age verification and protection of minors from inappropriate content must guarantee the exercise of parental authority by parents (Principle 8).
Additionally, see information on the EU-level requirements here.
Has there been any regulatory enforcement action concerning online/ digital safety?
Yes.
- A Spanish social media company (one of the major social media players in Spain in the early 2000s) met with the Spanish Data Protection Authority, Agencia Española de Protección de Datos (AEPD), in 2009 and presented the measures it had adopted to remove children under 14 from its social network. The company implemented a process of purging the profiles of users under 14: where a profile appeared to belong to someone under that age, the company sent a request to provide a photocopy of an ID card or passport within 92 hours, and deleted the profile if no reply was received. The AEPD welcomed this initiative and urged the company to extend the verification and checking system to all new registrations suspected of being under 14 years of age.
- However, a website owner was fined €30,601 for, among other issues, failing to adequately prevent minors from signing up to its social page. Allegedly, checks were performed almost daily to authorise all applications for registration or modification of photographs, rejecting all requests that included photographs of children under the age of 15. The AEPD found these measures insufficient, as a number of profiles of users under 14 were detected upon inspection.
- A telecommunications operator was fined €50,000 by the AEPD in 2011 for, among other issues, not implementing any protocol for verifying the identity of potential customers, as it required only an ID number.
Are there any existing requirements relating to children and AI in your jurisdiction?
Yes.
There are no Spain-specific national laws in this regard; the EU AI Act (Regulation (EU) 2024/1689) applies in this context. See the EU-level response here.
Are there any upcoming requirements relating to children and AI in your jurisdiction?
Yes.
The Spanish government approved the Draft Organic Law for the Protection of Minors in Digital Environments (Draft Organic Law) in 2024. This legislation aims to safeguard children's rights in the digital sphere, particularly their right to privacy, honour, self-image, and protection of personal data. It also seeks to ensure access to age-appropriate content.
The law introduces measures to educate minors and their families about digital risks, impose sanctions for rights violations—such as the dissemination of AI-generated images—and establish obligations for major digital operators and influencers to uphold children's rights. It applies to AI-driven technologies, including chatbots, AI-enhanced video games, and social media platforms that use AI for content moderation, recommendations, and user interactions.
Regarding content regulation, manufacturers of digital devices must provide information, in accessible language, about the risks associated with harmful content. They are also required to incorporate free parental control features, which must be activated during the initial device setup. Additionally, the law restricts children’s access to digital mechanisms such as loot boxes in games, which may incorporate AI elements.
AI-generated content is specifically addressed, with deepfakes and AI-created child exploitation material classified as criminal offences. The legislation is expected to introduce stricter age verification systems for online services, requiring platforms using AI for content creation, filtering, and recommendation to prevent minors from accessing harmful material. Moreover, the law anticipates the integration of AI-based age verification tools that meet stringent child protection standards.
To implement these objectives, the legislation amends rules across data protection, audiovisual regulation, criminal law, education, healthcare, and the public sector, as follows:
- Data Protection: The law amends Organic Law 3/2018 on the Protection of Personal Data and Digital Rights, raising the minimum age for accessing certain digital services from 14 to 16. It also mandates that digital device manufacturers provide information on data protection and privacy risks.
- Audiovisual regulation: Amendments to Law 13/2022 on Audiovisual Communication require audiovisual service providers to include a direct link to the website of the regulatory authority, which will oversee age verification systems. Obligations regarding harmful content are also extended to prominent digital services, depending on whether their service is linear or on-demand.
- Criminal Code amendments:
  - Deepfakes and ultra-realistic AI-manipulated media are formally recognised as a criminal offence.
  - Online restraining orders are extended to prohibit offenders from accessing certain digital spaces, including social media and messaging platforms.
  - Aggravated penalties are introduced for cyber grooming, particularly where perpetrators use false identities.
  - Stricter rules apply to child sexual exploitation material, including cases where such content is made available to groups likely to include minors.
- Education and healthcare:
  - Digital literacy, privacy awareness, and media competency training will be promoted in schools.
  - The use of mobile and digital devices will be regulated in educational institutions at all levels.
  - Healthcare measures will be introduced for early detection of behavioural changes and health issues related to digital overuse.
- Public sector:
  - Public-private collaboration will be encouraged, with the Ministry for Digital Transformation promoting self-regulation and industry codes of conduct.
  - Safeguards will be introduced in public spaces such as schools and libraries to protect minors from inappropriate content.
  - A National Strategy for the Protection of Children and Adolescents in the Digital Environment will be developed and approved every three years, in coordination with regional and local authorities.
While still undergoing formal approval, this law represents a significant step towards stricter regulation of AI-driven digital environments and stronger protections for minors online.
Additionally, see EU-level response here.
Has there been any other regulatory enforcement activity to date relevant to children’s use of digital services?
Yes.
The Spanish Data Protection Authority, Agencia Española de Protección de Datos, investigated children’s use of a popular games app in 2018 but found no wrongdoing on the company’s part.
Are there any other existing or upcoming requirements relevant to children’s use of digital services?
Yes.
See information here on the Draft Organic Law for the Protection of Minors in Digital Environments (Draft Organic Law).
Further, Spain has previously indicated that it would legislate on loot boxes by means of a preliminary draft law regulating random reward mechanisms associated with interactive leisure software products. This regulation is expected to restrict access to loot boxes to persons aged 18 and over. Platforms offering loot boxes will be obliged to verify at the time of payment, by means of ID or biometric identification, whether the account belongs to a minor.
Additionally, see information on the EU-level requirements here.