What is the legal age of adulthood/ majority in your jurisdiction? Are all persons below this age considered a child/ minor?
The Family Law Reform Act 1969 (for England and Wales), the Age of Majority (Scotland) Act 1969 and the Age of Majority Act (Northern Ireland) 1969 provide that a person attains “full age” or “majority” (i.e. becomes an adult) at the age of 18.
Has the UNCRC been directly incorporated into national law in your jurisdiction?
No.
The UK ratified the UNCRC in December 1991. It was incorporated into Scottish law via the United Nations Convention on the Rights of the Child (Incorporation) (Scotland) Act 2024. However, it has not been incorporated into the national laws of England & Wales or Northern Ireland.
Is there an ombudsperson/ commissioner for children in your jurisdiction?
Yes.
The Children Act 2004 established the role of the Children’s Commissioner in England.
The Commissioner for Children and Young People (Northern Ireland) Order 2003 established the role of the Northern Ireland Commissioner for Children and Young People.
The Children's Commissioner for Wales Act 2001 established the role of Children’s Commissioner for Wales.
The Commissioner for Children and Young People (Scotland) Act 2003 established the role of Children and Young People's Commissioner Scotland.
If there is an ombudsperson/ commissioner for children in your jurisdiction, do they have any responsibility for upholding children’s rights in the digital world or does the relevant regulator have sole competence?
No.
The Children’s Commissioner in England, the Northern Ireland Commissioner for Children and Young People, the Children’s Commissioner for Wales and the Children and Young People’s Commissioner Scotland all have broad responsibilities to promote children’s rights and welfare.
Is there any standalone requirement to collect the consent of one or both parents when processing a child’s personal data (irrespective of any other obligations, e.g. the requirement to have a legal basis for processing)?
No.
The UK GDPR applies in a similar fashion to the EU GDPR. Here, the requirement to collect parental consent under Article 8(1) of the UK GDPR only applies where a provider of an information society service offers services directly to a child under the age of 13 and seeks to rely on consent as the legal basis under the UK GDPR for the processing of that child’s personal data. If another legal basis is relied on for processing, there is no requirement to collect parental consent for processing a child’s data.
Note that the ICO has, in enforcement action against a large social media company, previously indicated that in at least some circumstances the ability to rely on contractual necessity for processing children’s data may not be available, which might force reliance on consent and therefore Article 8. For further information, see the response to this question.
At what age can children legally consent to the processing of their own personal data, such that parental permission/ consent is not required?
In England, Wales and Northern Ireland there is no set age at which a child is generally considered competent to provide their own consent to processing. The ICO’s guidance outside of Article 8 UK GDPR refers to a test used to determine capacity to consent in a medical context, known as Gillick competence. This requires a context-specific assessment of “the child's maturity and understanding and the nature of the consent required. The child must be capable of making a reasonable assessment of the advantages and disadvantages of the [treatment] proposed, so the consent, if given, can be properly and fairly described as true consent”. While not directly applicable to all data processing, it is perhaps useful to note that under the Mental Capacity Act 2005 (which applies in England and Wales), children aged 16 and above are generally considered to have capacity for the purposes of medical treatment (unless they lack capacity as discussed under that Act). In Scotland, children aged 12 or over are presumed to be of sufficient age and maturity to provide their own consent for data protection purposes, unless the contrary is shown.
Regarding digital consent specifically, Article 8(1) of the UK GDPR provides that the age of digital consent in the UK is 13.
Are there specific requirements in relation to collection and/or verification of parental consent/ permission concerning the processing of a child’s personal data?
No.
Article 8(2) of the UK GDPR does not set out an approved method for the collection of parental consent; however, it does require organisations to “make reasonable efforts” to verify that parental consent is given or authorised by the holder of parental responsibility over the child, taking into account available technology.
No specific method has been endorsed by the Information Commissioner’s Office (ICO). In its guidance on processing children’s data (Children and the UK GDPR), the ICO notes that what constitutes “reasonable efforts” depends on the risk of the processing and the available technology. Organisations should in particular consider the data minimisation, storage limitation and data security principles when processing data to comply with the requirements of Article 8 UK GDPR.
Are there any particular information or transparency requirements concerning the processing of children’s personal data?
Yes.
Article 12(1) UK GDPR emphasises the particular importance of the requirement for clear and plain language when providing information to children.
The Information Commissioner’s Office (ICO) further details this requirement in Standard 4 “Transparency” of its Children’s Code (or Age Appropriate Design Code), a statutory code of practice under s.123 of the Data Protection Act 2018 relating to the processing of children’s data by online service providers. This includes making transparency information easy to find and accessible for both children and parents; providing ‘bite-sized’ explanations at the point of collecting data; providing information in a child-friendly manner (e.g. using video and audio content); tailoring the information to the age of the child, taking into account the developmental needs of children at different ages; and providing clear terms, policies and community standards. As explained in s.127 of the Data Protection Act 2018, failure to comply with the Code does not of itself cause liability before a court or tribunal, but it is admissible in evidence in such proceedings, and the ICO must take its Code into account when seeking to enforce the UK GDPR in the context of relevant services.
Can children directly exercise their rights in relation to their personal data without the involvement of their parents?
Yes.
Children can exercise their rights directly once they are considered competent to do so.
In Scotland, children aged 12 or over are presumed to be of sufficient age and maturity to be able to exercise their rights (unless the contrary is shown). This presumption does not apply in England, Wales and Northern Ireland, where competence is assessed according to the child’s level of understanding; however, the Information Commissioner’s Office’s guidance recognises that a threshold of 12 years old reflects an approach that will be reasonable in many cases. Children should not be considered competent if it is evident they are acting against their own best interests. See also the comments on capacity in response to this question.
Can children make complaints on their own behalf directly to your national data protection/ privacy regulator(s)?
Yes.
The Information Commissioner’s Office (ICO) states in its guidance that children may exercise their data protection rights, including the right to submit complaints to the ICO, on their own behalf if they are “competent” to do so.
See the response to this question for general comments on competence.
Are there any particular requirements/ prohibitions related to:
a. processing specific types of children’s personal data;
b. carrying out specific processing activities involving children’s personal data; and/ or
c. using children’s personal data for specific purposes.
Yes.
The Information Commissioner’s Office (ICO) Children’s Code states that the ICO considers the use of children’s personal data for marketing purposes, profiling, other automated decision-making, or the offering of online services directly to children as “likely to result in a high risk to data subjects” and, therefore, requiring a data protection impact assessment. It also outlines a list of 15 “standards” controllers should abide by when processing children’s data (e.g. collecting and retaining only the minimum amount of personal data needed to provide the elements of a service the controller knows children are actively engaged with, switching geolocation off by default etc.).
Has there been any enforcement action by your national data protection/ privacy regulator(s) concerning the processing of children’s personal data? In your answer, please include information on the volume, nature and severity of sanctions.
Yes.
The Information Commissioner’s Office (ICO) has to date issued only one major administrative fine specifically relating to children’s personal data. This was a multi-million pound fine issued against a social media organisation for breaching the UK GDPR by:
· providing services to UK children (under 13) and processing their personal data without their parents’/carers’ consent or authorisation;
· failing to provide clear and proper information to child users regarding how their personal data would be used, collected and shared in a way that was easy enough for them to understand; and
· failing to ensure UK users’ personal data was processed lawfully, fairly and in a transparent manner.
In 2023, the ICO issued a Preliminary Enforcement Notice in respect of another social media company’s use of an AI chatbot, on the basis of provisional conclusions that the company had not completed a data protection impact assessment (DPIA) in accordance with Article 35 UK GDPR and had not consulted the ICO in respect of unmitigated high risks posed by the chatbot. After dialogue between the ICO and the company, and changes by the company to its DPIA, the ICO determined that it was not necessary to serve an Enforcement Notice.
The ICO has also been carrying out an ongoing investigation into various social media and video sharing platforms under the Children’s Code.
The ICO has also issued some reprimands relating to use of children’s data, for example:
- 2024 – reprimand issued to a high school relating to roll-out of facial recognition technology for cashless catering without completing a DPIA; and
- 2023 – reprimand to a primary school about inappropriate disclosure of data (including special category data) in the classroom.
Are there specific rules concerning electronic direct marketing to children?
Yes.
The Privacy and Electronic Communications (EC Directive) Regulations 2003 set out rules regarding unsolicited direct marketing to individuals via electronic mail and in principle require the individual’s consent. These rules apply irrespective of the age of the recipient.
In addition, the Information Commissioner’s Office (ICO) covers direct marketing to children in its Guidance on Children and the UK GDPR. The guidance acknowledges that organisations are not necessarily prevented from using children’s data for marketing purposes; however, they need to ensure they meet all UK GDPR requirements when doing so. The ICO highlights that under the UK GDPR, children merit specific protection with regard to their personal data and that organisations have to consider the risks specific to children, and whether appropriate mitigations can be put in place, through a data protection impact assessment. The guidance also directs organisations to sector-specific guidance on marketing, such as from the Advertising Standards Authority (ASA), to ensure that children’s personal data is not used in a way that might lead to their exploitation. In this regard, the ASA’s CAP Code also contains rules under chapter 10 which reflect a marketer’s obligation to ensure a lawful basis for processing of children’s personal data in order to send marketing communications.
Are there specific rules concerning the use of adtech tracking technologies, profiling and/or online targeted advertising to children?
Yes.
The general rules around cookies and similar tracking technologies under the Privacy and Electronic Communications (EC Directive) Regulations 2003 – which require consent – also apply in respect of children.
Standard 12 “Profiling” of the Information Commissioner’s Office’s Children’s Code is also relevant to profiling children for targeted advertising purposes. This standard requires online service providers to switch options which use profiling “off” by default, unless there is a compelling reason for profiling to be on by default, taking account of the best interests of the child. Profiling should only be allowed when there are appropriate measures in place to protect children from any harmful effects. In addition, the Children’s Code highlights that the best interests of the child must be a primary consideration for online service providers whose services are likely to be accessed by children and requires them to conduct a data protection impact assessment.
Under the Advertising Standards Authority’s CAP Code Rule 10, marketers should avoid using a child’s personal data to create personality or user profiles, particularly in the context of automated decision-making that produces legal effects or similarly significant effects for a child.
Are there specific rules concerning online contextual advertising to children?
No.
The UK GDPR and the Data Protection Act 2018 do not set out rules regarding contextual advertising and, to the extent an activity does not involve the processing of personal data, these laws would not be applicable.
For further information, see the responses in this section.
Has there been any regulatory enforcement action concerning advertising to children? In your answer, please include information on the volume, nature and severity of sanctions.
No.
To date, there has been no enforcement action from either the ICO or the Advertising Standards Authority concerning advertising to children specifically.
For further information, see the responses in this section.
At what age does a person acquire contractual capacity to enter into an agreement to use digital services?
N/A
As summarised by the Information Commissioner’s Office, the legal age of capacity to enter into contracts is 16 in Scotland (with some exceptions which allow contracts with children younger than this). In the rest of the UK there is no definite age at which a child (i.e. under 18s) is considered to have the legal capacity to enter into a contract. The basic rule is that children over the age of 7 are generally able to enter into contracts, but (with some exceptions) the contracts they make may be ‘voidable’. This means that you can’t hold the child to what they have agreed to, or enforce the terms of the contract against them – they can effectively cancel the contract at any time. If the contract is voided, then you do not have a lawful basis for processing their personal data.
Do consumer protection rules apply to children?
Yes.
Consumer protection rules apply to children. For example, the Consumer Protection from Unfair Trading Regulations 2008 (CPRs) apply to consumers of all ages. Under the CPRs, children can be deemed as ‘vulnerable’ consumers on the basis of their age, affording them greater protections than adult consumers.
Are there any consumer protection rules which are specific to children only?
Yes.
There are some consumer protection rules that are specific to children only. For example, the CPRs prohibit adverts that include direct exhortations to children to buy advertised services/products (or to persuade their parents/other adults to buy advertised services/products for them). The Competition and Markets Authority, the main regulatory body in charge of consumer protection law in the UK, has published guiding principles for online and app-based games. This guidance includes rules specific to children only, such as protecting children against being exploited or confused into purchasing in-game items with real-world money. Advertising in the UK is primarily regulated by the Advertising Standards Authority (ASA). The ASA enforces both: (i) the CAP Code for non-broadcast advertising; and (ii) the BCAP Code for broadcast advertising. These Codes contain some rules that are specific to children (for example, see CAP Code Section 5 on Children). The ASA also considers that marketers need to be careful that ads which may not be suitable for children are appropriately targeted (for example, not displayed on public billboards or on websites without appropriate age restrictions).
Has there been any regulatory enforcement action concerning consumer protection requirements and children’s use of digital services? In your answer, please include information on the volume, nature and severity of sanctions.
No.
The Competition and Markets Authority (CMA) does not tend to focus its enforcement strategies specifically on children, but rather enforces against infringements suffered across a consumer user base. That said, the CMA has engaged in public awareness projects and investigations likely to impact children, such as the social media endorsement principles and enforcement against games services subscriptions.
Under the Digital Markets, Competition and Consumers Act 2024, the CMA will be empowered under a new administrative model to issue decisions as to whether or not consumer protection rules have been infringed. The CMA will also be able to issue fines of:
(i) up to £300,000, or 10% of a business’s annual turnover (whichever is higher), for consumer law infringements; and
(ii) up to 5% of a business’s annual global turnover for failing to comply with an undertaking or direction.
Are there any age-related restrictions on when children can legally access online/ digital services?
Yes.
There are laws in the UK which impose age restrictions on the sale of certain goods and services. The following is a non-exhaustive list of goods and services which it is illegal to sell to children:
• gambling
• knives, alcohol, cigarettes, vaping; and
• pornography.
To the extent such goods and services are accessed online by children, such restrictions will generally also apply.
In parallel, the Online Safety Act 2023 regulates certain types of online services and establishes rules around illegal content and content that is harmful to children. See more information here.
Are there any specific requirements relating to online/ digital safety for children?
Yes.
The Online Safety Act 2023 (OSA) introduces duties of care on providers of regulated user-to-user services, regulated search services and services providing pornographic content. The OSA sets out the framework for online safety codes of practice to be established and enforced by the media regulator, Ofcom.
Ofcom has published illegal content codes for user-to-user services and search services. Services were required to conduct their first illegal content risk assessments by 16 March 2025, and the illegal content safety duties became applicable from 17 March 2025. Services were also required to conduct a “children’s access assessment” to determine whether their service is likely to be accessed by children (and therefore needs to comply with the children’s safety duties) by 16 April 2025.
Ofcom has published age assurance guidance for publishers of pornographic content. Age assurance duties – obligations to impose “highly effective age assurance” - for these providers became applicable from 17 January 2025.
Ofcom has published children’s safety codes for user-to-user services and search services likely to be accessed by children (at the time of writing, in draft form subject to Parliamentary approval). The children’s risk assessment deadline is 24 July 2025, and the children’s safety duties are applicable from 25 July 2025.
A timeline for Ofcom’s publication and/or finalisation of other guidance and codes can be found here. Ofcom will also continue to review and iterate the guidance and codes it has already published, and a more recent update was published here. For example, Ofcom has recently launched a consultation on new measures that would, if adopted, extend the illegal content and children’s safety measures under the existing code/draft code, including more proactive measures and duties to impose “highly effective age assurance” (see more information here; details published by Ofcom can be found here).
Additional obligations will also apply to certain categorised service providers, including transparency reporting, enhanced risk assessment obligations, requirements relating to the disclosure of information about deceased child users, obligations to protect news content, and requirements to include additional details on their safety efforts in their terms of service. Categorised services are those that fall into the categories outlined in the relevant Regulations, as determined by Ofcom; however, at the time of publication these Regulations are undergoing judicial review.
Are there specific age verification/ age assurance requirements concerning access to online/ digital services?
Yes.
Under the Information Commissioner’s Office’s (ICO) Children’s Code (Code) providers of online services likely to be accessed by children have to comply with a set of standards in respect of their child users. The Code provides that online service providers should either establish the age of their users with a level of certainty that is appropriate to the risks from their data processing, in order to effectively apply the Code’s standards to their child users; or apply these standards to all users instead. The Code examines various methods to conduct age assurance. The ICO has also issued an Opinion on age assurance to explain how online services caught by the Children’s Code can use age assurance technology in compliance with data protection law in a risk-based and proportionate way.
In parallel, the Online Safety Act 2023 (OSA) sets out age assurance requirements specifically for the online services it regulates (user-to-user services, search engines, and pornographic content providers).
All in-scope pornographic content providers are required to implement a form of “highly effective age assurance” (HEAA) to prevent children from encountering provider pornographic content.
In addition, for user-to-user services likely to be accessed by children:
· Section 12(4) OSA requires providers of these services to use HEAA to prevent children from encountering “primary priority content that is harmful to children” on the service, where they do not prohibit this kind of content for all users in their terms and conditions. Primary priority content includes pornography and content that encourages, promotes or provides instructions for self-harm, eating disorders or suicide. Ofcom has produced guidance on this.
· The children’s safety code applicable to user-to-user services likely to be accessed by children also recommends HEAA is implemented in the following scenarios:
- As an access control measure (to prevent access to the whole service): Where the principal purpose of the service is hosting or disseminating primary priority content or “priority content that is harmful to children”, which includes bullying, abusive or hateful content, content which depicts or encourages serious violence or injury, content which encourages dangerous stunts and challenges, and content which encourages the ingestion of, inhalation of or exposure to harmful substances.
- As a content control measure (enabling appropriate moderation of identified harmful content): Where primary priority content, or priority content, is not prohibited in the terms and conditions (or it is prohibited but not currently technically feasible to take down) – but hosting/disseminating this content is not the principal purpose of the service. For priority content, this also only applies where the service is medium or high risk for the type of content in question.
- As a measure enabling the implementation of Ofcom’s recommender system measures: Where primary priority content, priority content (excluding bullying), or “non-designated content” (including body stigma and depression content) is not prohibited in the terms and conditions, the service is medium or high risk for that type of content, and the child-accessible part of the service has a content recommender system. These measures are to ensure services can apply child protection measures such as the removal or downranking of harmful content from recommender feeds.
The measures in Ofcom’s codes are not mandatory; however, compliance with Ofcom’s recommended measures creates a “safe harbour” for meeting relevant OSA safety duties. Services are able to take “alternative measures” provided they document their decision, including any implications on privacy and freedom of expression.
Ofcom is currently consulting on extending the obligations to impose HEAA. If adopted, the proposals would include a requirement to impose HEAA on all livestreaming services (to ensure children who are streamers have more limited functionality), and would use HEAA to underpin some measures currently based on any existing age checks under the illegal harms code. See the answer here for further details of this consultation.
HEAA is not defined in the OSA. Ofcom has published guidance on technologies which in its view are capable of being HEAA, provided certain criteria are met:
Part 3 services: here
Part 5 services: here
Notably, unlike guidance issued in some other territories, facial age estimation is considered capable, where appropriately deployed, of being HEAA.
Are there requirements to implement parental controls and/or facilitate parental involvement in children’s use of digital services?
Yes.
There are no legal provisions requiring the use of parental controls. However, the Information Commissioner’s Office’s Children’s Code (Code) indicates the importance of parental controls in supporting parents in protecting and promoting the best interests of the child. At the same time, the Code also stresses their impact on the child’s rights to privacy, association, play, access to information and freedom of expression under the UNCRC.
The Code recommends notifying children about parental controls when they sign up and each time they use a service. If a service lets parents monitor a child's activity or location, it should clearly indicate to the child when this is happening. Parents should also be informed about the child's privacy rights, explaining why children have the right to know if their online activity is being tracked.
The Online Safety Act 2023 does not mandate the use of parental controls, and parental controls are not currently recommended by Ofcom as a children’s safety measure in its children’s safety codes of practice. However, Ofcom has said that parental controls can be a “useful tool” to monitor and limit the time children spend online, and it will consider the matter for future iterations of its codes. It has not included such measures in its latest consultation.
Has there been any regulatory enforcement action concerning online/ digital safety? In your answer, please include information on the volume, nature and severity of sanctions.
No.
Ofcom has fined a social media company in excess of £1.5 million for not responding accurately to a request for information. However, this did not relate to the Online Safety Act 2023 (OSA), but rather to obligations under the Communications Act 2003. Under the OSA, since the first safety duties came into force, and at the time of writing:
• Ofcom has opened an investigation into an imageboard website and others regarding potential non-compliance with illegal content risk assessment and safety duties, and failure to respond to an information request.
• Ofcom has opened an enforcement programme regarding the sharing of child sexual abuse material on file-storage and file-sharing services, and has opened investigations into seven such providers.
• Ofcom has opened an enforcement programme regarding age assurance in the adult sector, and is investigating several providers regarding their age assurance measures to prevent children accessing pornography on their services. Ofcom previously investigated adult services in respect of similar requirements under the UK’s VSP regime (which is being transitioned to the OSA), and fined a content platform £7,000 for failure to comply with these requirements, and another content platform £1.05 million for failure to respond accurately to a related information request.
• Ofcom has written to “hundreds” of providers of user-to-user services which allow user-generated pornographic content, indicating it will start enforcing age assurance-related OSA children’s safety duties when they come into effect on 25 July 2025.
Are there any existing requirements relating to children and AI in your jurisdiction?
Yes.
Providers should carefully assess whether generative AI features or services are caught directly as search or user-to-user services under the Online Safety Act 2023, or whether they can generate pornographic material (see Ofcom’s open letter to generative AI providers). Services already regulated under OSA should consider how generative AI features should be factored into their risk assessments, or whether they will trigger a new risk assessment as a “significant change” to the service.
Are there any upcoming requirements relating to children and AI in your jurisdiction?
Yes.
Providers should carefully assess whether generative AI features or services are caught directly as search or user-to-user services under the Online Safety Act 2023, or whether they can generate pornographic material (see Ofcom’s open letter to generative AI providers). Services already regulated under OSA should consider how generative AI features should be factored into their risk assessments, or whether they will trigger a new risk assessment as a “significant change” to the service.
Has there been any other regulatory enforcement activity to date relevant to children’s use of digital services? In your answer, please include information on the volume, nature and severity of sanctions.
No.
N/A.
Are there any other existing or upcoming requirements relevant to children’s use of digital services?
Yes.
Ofcom’s timeline for implementing the remaining codes/guidance due to be issued under the Online Safety Act 2023 is discussed here. Ofcom can be expected to consult on an ongoing basis as it reviews harms and measures.
In addition to this, the Information Commissioner’s Office’s (ICO) Children’s code strategy sets out the priority areas social media and video-sharing platforms need to improve on in 2024-2025, as well as how the ICO will continue to enforce the law and drive conformance with the code by industry.
The ICO’s regulatory focus is on default privacy settings, geolocation, profiling children for targeted advertisements, using children’s data in recommender systems and using data of children under 13. The ICO has confirmed that it will continue to collaborate with Ofcom in the UK and with international counterparts.