What is the legal age of adulthood/ majority in your jurisdiction? Are all persons below this age considered a child/ minor?
The legal age of majority is 18.
The Online Safety Act 2021 (Cth) and the Privacy Act 1988 (Cth) both define a child as an individual who has not reached 18 years of age.
Guidance from the Office of the Australian Information Commissioner provides that if it is not practicable to determine competency on a case-by-case basis, an organisation can assume an individual over the age of 15 has capacity to consent to their own privacy decisions unless there is reason to suspect otherwise.
Has the UNCRC been directly incorporated into national law in your jurisdiction?
Yes.
Australia ratified the UNCRC in December 1990. The UNCRC has not been fully incorporated into Australian law; however, it has been incorporated by reference into the federal Family Law Act 1975 (Cth), which requires courts to apply the UNCRC as an interpretive aid when examining the rights of children. The UNCRC can also be taken into account when interpreting other laws in some circumstances.
Is there an ombudsperson/ commissioner for children in your jurisdiction?
Yes.
The National Children’s Commissioner oversees the rights of children in Australia. Several other commissioners also consider the rights of children under their purview, including the e-Safety Commissioner and the Office of the Australian Information Commissioner.
If there is an ombudsperson/ commissioner for children in your jurisdiction, do they have any responsibility for upholding children’s rights in the digital world or does the relevant regulator have sole competence?
Yes.
The National Children’s Commissioner is responsible for promoting the rights and welfare of children and raising awareness of the rights and principles in the UNCRC. The e-Safety Commissioner is Australia’s national independent regulator for online safety. The e-Safety Commissioner accepts complaints from young Australians who experience serious cyberbullying, as well as reports from Australians wishing to report illegal online content. The e-Safety Commissioner is also the regulator that enforces the age restrictions on social media use (for further information, see the response to question 21 below). The Information Commissioner has competency for the development and enforcement of the Children’s Online Privacy Code (for further information, see the response to the final question below).
Is there any standalone requirement to collect the consent of one or both parents when processing a child’s personal data (irrespective of any other obligations, e.g. the requirement to have a legal basis for processing)?
No.
There is no specific requirement; however, valid consent under the Privacy Act 1988 (Cth) requires the individual to have capacity. Depending on the age of the child and the complexity of the processing, a child may lack the capacity to understand, and therefore to consent to, that processing (e.g. the processing of sensitive information).
The Office of the Australian Information Commissioner guidance provides that if it is not practicable to determine competency on a case-by-case basis, an organisation can assume an individual over the age of 15 has capacity to consent to their own privacy decisions unless there is reason to suspect otherwise.
At what age can children legally consent to the processing of their own personal data, such that parental permission/ consent is not required?
A child is defined under both the Privacy Act 1988 (Cth) (Privacy Act) and Online Safety Act 2021 (Cth) (Online Safety Act) as being a person under 18 years of age.
Regulatory guidance provides that if it is not practicable to determine competency on a case-by-case basis, an organisation can assume an individual over the age of 15 has capacity to consent to their own privacy decisions unless there is reason to suspect otherwise. An individual aged under 15 is presumed not to have capacity to consent. The guidance also specifies that, where practicable, capacity to consent should be assessed on a case-by-case basis. The Children’s Online Privacy Code (Code) that is to be developed by the Information Commissioner is expected to draw on similar regulatory instruments in other jurisdictions, including the United Kingdom’s Age-Appropriate Design Code. While the Code will be developed through a consultation process and its content is yet to be outlined, it is possible that the Code may require the consent of a parent/guardian where data is to be used for the purpose of profiling persons under 13 years old.
Are there specific requirements in relation to collection and/or verification of parental consent/ permission concerning the processing of a child’s personal data?
No.
More generally, however, consent must be:
- voluntary;
- specific;
- current; and
- the individual must have capacity to consent.
Consent may be given expressly (e.g. using an electronic medium or signature) or may reasonably be inferred from the conduct of the individual and the organisation. For the collection, use and disclosure of any sensitive information, it is recommended that consent always be express.
Are there any particular information or transparency requirements concerning the processing of children’s personal data?
No.
While there are no specific requirements in relation to children, Australian Privacy Principle (APP) 1 under the Privacy Act requires all organisations to manage personal information in an open and transparent manner.
Further, organisations must maintain an up-to-date privacy policy and provide collection statements outlining, among other matters:
- the kinds of personal information collected and held by the entity;
- how personal information is collected and held;
- the purposes for which personal information is collected, held, used and disclosed;
- how an individual may access their personal information and seek its correction;
- how an individual may complain if the entity breaches the APPs; and
- whether the entity is likely to disclose personal information to overseas recipients, and if so, the countries in which such recipients are likely to be located if it is practicable to specify those countries in the collection statement and policy.
Can children directly exercise their rights in relation to their personal data without the involvement of their parents?
Yes.
Guidance from the Office of the Australian Information Commissioner provides that if it is not practicable to determine competency on a case-by-case basis, an organisation can assume an individual over the age of 15 has capacity to consent to their own privacy decisions unless there is reason to suspect otherwise.
Can children make complaints on their own behalf directly to your national data protection/ privacy regulator(s)?
Yes.
The Office of the Australian Information Commissioner (OAIC) does not provide any guidance on children making complaints directly, although this is not expressly forbidden. The OAIC also allows an authorised representative to act on behalf of a complainant.
Children can report serious online abuse and/or illegal and restricted online content to the e-Safety Commissioner. The e-Safety Commissioner will seek consent to contact their parent(s)/guardian(s) in relation to the report. If the child is in danger, the e-Safety Commissioner will contact the child’s parent(s)/guardian(s).
Are there any particular requirements/ prohibitions related to:
a. processing specific types of children’s personal data;
b. carrying out specific processing activities involving children’s personal data; and/ or
c. using children’s personal data for specific purposes.
No.
The Privacy Act 1988 (Cth) does not differentiate between the information of children and adults.
The Office of the Australian Information Commissioner (OAIC) has recognised that certain categories of information, including financial information and information about individuals experiencing domestic violence, may require higher levels of care when handled, due to their sensitive nature. It is likely the OAIC would also expect additional care to be taken when handling the data of children.
The Information Commissioner is developing a Children’s Online Privacy Code, due to be in place by 10 December 2026, which is likely to include additional requirements and prohibitions relating to children’s personal data. For further information, see the response to the final question below.
Has there been any enforcement action by your national data protection/ privacy regulator(s) concerning the processing of children’s personal data? In your answer, please include information on the volume, nature and severity of sanctions.
No.
Please see the response to this question.
Are there specific rules concerning electronic direct marketing to children?
No.
There are no specific rules; however, under the Spam Act 2003 (Cth), a business must not send, or cause to be sent, electronic marketing communications (e.g. SMS or email) without the consent of the recipient.
Similarly, under Australian Privacy Principle 7 of the Privacy Act, a business must not use personal information for direct marketing unless the individual has consented or would reasonably expect the business to use their information for that purpose.
Are there specific rules concerning the use of adtech tracking technologies, profiling and/or online targeted advertising to children?
No.
While there are no current laws on this point, under the upcoming reforms to the Privacy Act 1988 (Cth), the government has agreed in principle that direct marketing to individuals under the age of 18 should be prohibited unless the information used for direct marketing was collected directly from the child and the direct marketing is in the child’s best interests.
Are there specific rules concerning online contextual advertising to children?
No.
Contextual advertising that does not involve the collection, use, or disclosure of personal information is not regulated by the Privacy Act.
However, in terms of general requirements, the advertising industry in Australia is subject to a system of self-regulatory codes developed by the Australian Association of National Advertisers. These include the Children’s Advertising Code (effective from 1 December 2023), which aims to ensure that advertisers and marketers maintain a high sense of social responsibility when advertising and marketing to children in Australia, and includes requirements to ensure that advertising to children:
- does not contravene Prevailing Community Standards (as determined by the Ad Standards Community Panel), including by promoting products or services unsuitable or hazardous to children or encouraging unsafe practices or encouraging bullying or unhealthy body image (s 2.1);
- is not misleading or deceptive to children (s 2.2);
- does not employ sexual appeal, include sexual imagery, or state or imply that children are sexual beings or that ownership or enjoyment of a product will enhance their sexuality (s 2.3);
- does not portray unreasonably frightening or distressing images or events (s 2.4);
- does not undermine the authority, responsibility or judgment of parents or carers (s 2.5);
- does not contain an appeal to children to urge their parents, carers or another person to buy the advertised product or service for them (s 2.5);
- does not state or imply that the advertised product or service makes children who own or enjoy it superior to their peers (s 2.5);
- does not state or imply that persons who buy the advertised product or service are more generous than those who do not (s 2.5);
- where anything is offered either free, at a reduced price, or with an additional cost conditional upon the purchase of an advertised product, does not encourage the purchase of an excessive quantity or irresponsible consumption (s 2.6); and
- where it uses popular personalities or celebrities (live or animated) to endorse, recommend, promote, advertise or market products, services, or anything offered either free, at a reduced price, or with an additional cost conditional upon the purchase of an advertised product, is clearly distinguishable as advertising (s 2.7).
The Food and Beverages Advertising Code also regulates the targeting of children (defined as those under the age of 15).
Has there been any regulatory enforcement action concerning advertising to children? In your answer, please include information on the volume, nature and severity of sanctions.
No.
There has been no enforcement specifically concerning advertising to children.
The Australian advertising industry operates under a self-regulatory system. Since the Children’s Advertising Code came into effect in December 2023, it has been reported that there have been about 200 complaints about advertising targeting children.
At what age does a person acquire contractual capacity to enter into an agreement to use digital services?
Generally, under common law, a contract with a person under the age of 18 years is voidable.
There are limited exceptions to this set out under common law and under specific state and territory laws (e.g. the Minors (Property and Contracts) Act 1970 (NSW)).
Do consumer protection rules apply to children?
Yes.
The Australian Consumer Law, set out in Schedule 2 of the Competition and Consumer Act 2010 (Cth), contains a number of consumer guarantees and prohibitions on unfair contract terms and misleading and deceptive conduct, which apply broadly to all Australians.
Are there any consumer protection rules which are specific to children only?
No.
Consumer protections apply broadly to all Australians. There are some limited exceptions, for example in relation to safety standards for certain children’s products such as toys for children under the age of 3 and regulations around the use of button batteries.
Has there been any regulatory enforcement action concerning consumer protection requirements and children’s use of digital services? In your answer, please include information on the volume, nature and severity of sanctions.
No.
There has been no enforcement specifically concerning consumer protection in relation to children.
Are there any age-related restrictions on when children can legally access online/ digital services?
Yes.
Under Part 4A of the Online Safety Act 2021 (Cth) (OSA), ‘age restricted social media platforms’ (ARSMPs) must take reasonable steps to prevent users under 16 from having accounts which can access their platform or parts of their platform. An ARSMP is defined as:
1. an electronic service that satisfies the following conditions:
a. the sole purpose, or a significant purpose, of the service is to enable online social interaction between 2 or more end-users;
b. the service allows end-users to link to, or interact with, some or all of the other end-users;
c. the service allows end-users to post material on the service; and
d. such other conditions (if any) as are set out in the legislative rules; or
2. an electronic service specified in the legislative rules.
The OSA also excludes certain electronic services from the scope of ARSMPs, namely:
1. services where none of the material on the service is accessible to, or delivered to, one or more end-users in Australia – while not stated, this could in our view be achieved by way of geo-blocking access;
2. where the service is specified in the legislative rules.
The Online Safety (Age-Restricted Social Media Platforms) Rules 2025 clarify the scope of ARSMPs and exclude services whose sole or primary purpose is to:
- enable communication by means of messaging, email, voice or video calling;
- enable users to play online games with other users;
- allow users to share information about products or services, such as reviews, technical support or advice;
- allow users to engage in professional networking or development;
- support the education of users; and
- support the health of users.
It also excludes services that have a significant purpose of facilitating communication between:
- educational institutions and students or students’ families; and
- providers of health care and people using those providers’ services.
On 16 September 2025, the eSafety Commissioner issued the Social Media Minimum Age Regulatory Guidance which provides guidance on what constitutes “reasonable steps” to prevent users under the age of 16 from having accounts.
In addition, on 9 October 2025, the Office of the Australian Information Commissioner (OAIC) released the Privacy Guidance on Part 4A of the Online Safety Act 2021 (Cth), which provides guidance on how privacy obligations under the Privacy Act 1988 (Cth) and the APPs apply to entities implementing the Social Media Minimum Age (SMMA) scheme.
A breach by a provider will be subject to a maximum penalty of 30,000 penalty units (currently equivalent to AU $9.9 million). This increases to 150,000 penalty units (currently equivalent to AU $49.5 million) if the provider is a body corporate. The obligation to take reasonable steps came into force on 10 December 2025.
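As a quick arithmetic cross-check, these dollar amounts are simple conversions from penalty units at the Commonwealth penalty unit value of AU $330 that the figures above reflect (the penalty unit value is indexed periodically and may change):

\[
30{,}000 \times \mathrm{AU\$}330 = \mathrm{AU\$}9{,}900{,}000
\qquad
150{,}000 \times \mathrm{AU\$}330 = \mathrm{AU\$}49{,}500{,}000
\]

The same AU $330 value underlies the other penalty unit conversions quoted later in this guide.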
The Basic Online Safety Expectations (BOSE), created under the Online Safety Act 2021 (Cth), apply to online service providers and set out core expectations in relation to online safety. Note that the BOSE are not themselves enforceable, and a failure to meet specific expectations will not trigger penalties for non-compliance. However, the e-Safety Commissioner is empowered to require regulated service providers to report on how they are meeting any or all of the expectations, on either a non-periodic or a periodic basis, through a reporting notice or determination, and can publicise instances of non-compliance with the Expectations. Financial penalties apply for any failure by a service provider to comply with a reporting notice or determination. Core Expectation 11 requires that providers take reasonable steps to prevent access by children to class 2 material (class 2 material includes content that is likely to be classified as X18+ or R18+ and includes, for example, pornography and other high-impact material such as R18+ video games). The BOSE give the following examples of reasonable steps that could be taken to meet this Expectation:
- implementing appropriate age assurance mechanisms;
- conducting child safety risk assessments; and
- continually seeking to develop, support or source, and implement improved technologies and processes for preventing access by children to class 2 material.
Industry codes and standards, which apply to social media services, app distribution services, hosting services, internet carriage services, equipment providers, search engine services, relevant electronic services and designated internet services, contain measures to address different classes of online material. The codes were developed in two phases. The phase 1 codes cover ‘class 1A’ and ‘class 1B’ online material, being the most seriously harmful online content, such as child sexual exploitation material and pro-terror material. The phase 2 codes, registered in 2025, cover class 1C and class 2 material, such as online pornography, that is inappropriate for children.
Are there any specific requirements relating to online/ digital safety for children?
Yes.
The e-Safety Commissioner, under the Online Safety Act 2021 (Cth) (the Act), oversees and enforces: the cyberbullying scheme for children; the adult cyber abuse scheme; the schemes addressing seriously harmful online content, such as child sexual exploitation material and pro-terror material; the Basic Online Safety Expectations; and the industry codes and standards developed under the Act.
The phase 2 industry codes require relevant sections of the online industry (see question 21) to implement certain measures in order to ensure children only access age-appropriate content, including:
- conducting self-assessment of risk profiles to determine compliance obligations;
- implementing systems and technologies to detect, flag and remove age-inappropriate content;
- implementing age assurance mechanisms to prevent children from accessing age-inappropriate content (examples given include AI-based recognition, ID/Digital ID verification and credit card checks);
- implementing end-user reporting mechanisms and controls to allow Australian end-users to report breaches to the service provider, and educating Australian end-users on the role and functions of the e-Safety Commissioner and how to make a complaint; and
- considering the impact of AI chatbots on end-users and using a safety-by-design approach when developing these features. This may include ensuring appropriate age assurance measures and proactively reporting to the e-Safety Commissioner where there are significant changes to the feature.
Are there specific age verification/ age assurance requirements concerning access to online/ digital services?
Yes.
Please see the response to question 21 above.
The Australian government has also conducted a trial of age assurance technologies intended to restrict children’s exposure to inappropriate online content, including pornography and potentially social media. On 1 September 2025, the final report of the Age Assurance Technology Trial was published, which found that, while no single solution fits all contexts, a wide variety of technologies already meet meaningful thresholds for accuracy, security and privacy when carefully selected and implemented.
Are there requirements to implement parental controls and/or facilitate parental involvement in children’s use of digital services?
Yes.
Services falling within the scope of the phase 2 industry codes, which cover age-inappropriate content (see the discussion of the industry codes and standards above), are encouraged to implement parental control mechanisms that filter inappropriate materials so that they cannot be accessed by child users/accounts. This is particularly the case for equipment providers and account-based services.
Furthermore, under the Basic Online Safety Expectations (BOSE), services are expected to take reasonable steps to ensure that end-users are able to use the service in a safe manner. The BOSE also expect that, where a service is likely to be accessed by a child, the service will take reasonable steps to ensure that the best interests of the child are a primary consideration in the design and operation of the service.
The BOSE also indicate that it may be reasonable to ensure that the default privacy and safety settings of a service used by children are robust and set to the most restrictive level.
Has there been any regulatory enforcement action concerning online/ digital safety? In your answer, please include information on the volume, nature and severity of sanctions.
Yes.
In 2025, the e-Safety Commissioner commenced enforcement processes against two companies by issuing formal warnings:
- to ‘Bad Kitty’s Dad, LDA’, which operates OmeTV, for allegedly breaching the Relevant Electronic Services Industry Standard by failing to have required safety features and settings, and for allowing adults to have randomised video chats with children without sufficient protections; and
- to an unnamed UK-based technology company for failing to place appropriate safeguards to prevent the creation of child sexual exploitation material, in breach of industry standards.
The e-Safety Commissioner has also issued multiple infringement notices to companies for failing to respond to transparency notices seeking information about the steps taken to address terrorist and violent extremist material, as well as child sexual exploitation material.
The eSafety Commissioner has also proactively used its information gathering powers under section 63G of the Online Safety Act 2021 (Cth) to obtain information about compliance with Social Media Minimum Age (SMMA) obligations ahead of those obligations coming into force on 10 December 2025.
Are there any existing requirements relating to children and AI in your jurisdiction?
No.
N/A
Are there any upcoming requirements relating to children and AI in your jurisdiction?
No.
N/A
Has there been any other regulatory enforcement activity to date relevant to children’s use of digital services?
No.
N/A
Are there any other existing or upcoming requirements relevant to children’s use of digital services?
Yes.
The Privacy Act 1988 (Cth) (Privacy Act) is currently undergoing significant reforms.
Children’s Online Privacy Code
The first tranche of those reforms has been passed and requires the Information Commissioner to develop a Children’s Online Privacy Code (Code) prior to 10 December 2026. As of December 2025, the obligations the Code will impose on social media platforms and internet service providers have not been published. However, the Office of the Australian Information Commissioner (OAIC) has stated that the Code will set out how designated entities must comply with the Australian Privacy Principles (APPs) in a manner consistent with the safety of those under 18. The OAIC has specifically foreshadowed that the Code will provide for a fair and reasonable test with respect to the collection, use and disclosure of personal information, which will require entities to take into account that the data subjects are children.
The OAIC has indicated the Code may require that privacy policies and notices directed to children be clear and understandable, for example, by using graphics, video and audio content as supplementary material.
The OAIC has flagged that its intention is to draw on the United Kingdom’s Age-Appropriate Design Code in developing the Code. (For more information on the UK Age-Appropriate Design Code, please see the UK guide.)
The OAIC has completed phase 2 initial consultations with industry stakeholders, civil society and academia, and published the submissions it received in September 2025. It has indicated that a draft of the Code will be open for public consultation in early 2026. The Code will be a non-legislative instrument that applies to providers of certain online services that are likely to be accessed by children (but not including health services). Breach of the Code will expose regulated entities to liability for:
- Significant penalties for serious interferences with privacy, being the greater of AU $50 million, three times the value of the benefit obtained, or, if that value cannot be determined, 30% of the entity’s adjusted turnover during the relevant period;
- 10,000 penalty units (currently AU $3.3 million) for interferences with privacy not deemed serious;
- 1,000 penalty units (currently AU $330,000) for a breach not deemed to be invasive of privacy;
- 60 penalty units (currently AU $19,800) for bodies corporate and 200 penalty units (currently AU $66,000) for listed corporations issued an infringement notice in respect of non-compliance with the APPs; and
- 1,000 penalty units (currently AU $330,000) for failure to comply with a compliance notice, which differs from an infringement notice in that a compliance notice requires an entity to take steps to remedy a purported breach of the APPs.
Statutory Tort for Invasions of Privacy
The reforms also introduced a statutory tort for serious invasions of privacy into Australian law, which provides for unlimited liability for economic loss, with liability for non-economic loss and exemplary damages capped at the greater of AU $478,550 and the maximum amount of damages for non-economic loss available under Australian defamation law.
The tort will be made out where:
- the defendant invaded the plaintiff’s privacy by intruding upon their seclusion or misusing information relating to the plaintiff;
- the plaintiff had a reasonable expectation of privacy;
- the invasion was intentional or reckless;
- the invasion was serious; and
- there was no countervailing public interest justifying the invasion.
In determining whether an expectation of privacy was reasonable, a plaintiff’s age will be a relevant consideration, as will whether any digital tool was used to facilitate a purported invasion.
In relation to children’s data, the government has agreed, or agreed in principle, to the following reforms, which may be included in the next tranche:
- a suite of proposed additional protections to apply specifically to children including that targeting a child should be prohibited, with an exception for targeting that is in the best interests of the child;
- a prohibition on trading the personal information of children;
- a prohibition on direct marketing to persons under 18 unless the personal information used for direct marketing was collected directly from the child and the direct marketing is in the child’s best interests;
- a requirement that entities have regard to the best interests of the child as part of considering whether the collection, use or disclosure of children’s data is fair and reasonable in the circumstances; and
- codification in the Privacy Act of the principle that valid consent must be given with capacity.


