India

Global Comparative Review

Contents

  • Children's rights
  • Data protection and privacy
  • Electronic direct marketing and advertising
  • Consumer protection
  • Online Digital Safety
  • Artificial intelligence
  • Other issues relating to children's use of digital services

Children's rights


What is the legal age of adulthood/ majority in your jurisdiction? Are all persons below this age considered a child/ minor?

Section 3(1) of the Majority Act, 1875 provides that every person domiciled in India shall attain the age of majority upon completing the age of 18 years.

Has the UNCRC been directly incorporated into national law in your jurisdiction?

Not directly.

While the UNCRC has not been directly incorporated into one specific Indian law, there are multiple provisions in the Indian Constitution and various domestic laws that have incorporated the principles laid down in the UNCRC:

  • The Majority Act, 1875
  • The Guardians and Wards Act, 1890
  • The Indian Partnership Act, 1932
  • The Hindu Marriage Act, 1955
  • The Hindu Minority and Guardianship Act, 1956
  • The Hindu Adoptions and Maintenance Act, 1956
  • The Bharatiya Nagarik Suraksha Sanhita, 2023
  • The Bonded Labour System (Abolition) Act, 1976
  • The Child and Adolescent Labour (Prohibition and Regulation) Act, 1986
  • The Scheduled Castes and the Scheduled Tribes (Prevention of Atrocities) Act, 1989
  • The Prohibition of Child Marriage Act, 2006
  • The Right of Children to Free and Compulsory Education Act (RTE), 2009
  • The Protection of Children from Sexual Offences (POCSO) Act, 2012
  • The National Food Security Act, 2013
  • The Rights of Persons with Disabilities Act, 2016
  • The Juvenile Justice (Care and Protection of Children) Act, 2015
  • The Maternity Benefit (Amendment) Act, 2017
  • The Bharatiya Nyaya Sanhita, 2023

Is there an ombudsperson/ commissioner for children in your jurisdiction?

Yes.

The National Commission for Protection of Child Rights was established pursuant to Section 3(1) of the Commissions for Protection of Child Rights Act, 2005.

If there is an ombudsperson/ commissioner for children in your jurisdiction, do they have any responsibility for upholding children’s rights in the digital world or does the relevant regulator have sole competence?

Yes.

The National Commission for Protection of Child Rights’ (NCPCR) jurisdiction includes several obligations that also extend to the protection of children’s rights in the digital world, as set out in Sections 13, 14 and 15 of the Commissions for Protection of Child Rights Act, 2005.

The NCPCR is responsible, among other things, for reviewing the legal safeguards for protecting child rights, inquiring into violations of child rights and recommending the initiation of proceedings to relevant authorities, taking suo motu cognisance of any deprivation of child rights or non-implementation of children-related laws, and performing other functions it may consider necessary for the promotion of child rights. All of these duties and functions also extend to protecting child rights in the online/digital space.

The NCPCR does not oust the jurisdiction of the relevant regulators/authorities. After conducting an inquiry into a matter concerning children's rights, it can only make recommendations (e.g. for the initiation of legal proceedings) to such regulators/authorities, or approach the higher judiciary to seek the necessary orders/directions.

Data protection and privacy


Is there any standalone requirement to collect the consent of one or both parents when processing a child’s personal data (irrespective of any other obligations, e.g. the requirement to have a legal basis for processing)?

No, under SPDI Rules.

Yes, under the DPDP Act.

The Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011 (SPDI Rules), read with the Information Technology Act, 2000, constitute the current law governing data protection in India. These rules do not specifically require the collection of a parent's consent when processing a child's personal data. However, principles of contract law would govern the processing of a child's personal data, particularly in the context of entering into terms and conditions or terms of service for digital/online products and services. As per Section 11 of the Indian Contract Act, 1872, only a major can enter into a contract. Accordingly, consent for processing the personal data of a child, and for agreeing to terms and conditions or terms of service in the online/digital sphere on behalf of the child, will have to be obtained from the parent.

However, the SPDI Rules will soon be replaced by the Digital Personal Data Protection Act, 2023 (DPDP Act), which has been enacted but is yet to come into force. Pursuant to Section 9(1) of the DPDP Act, the requirement of obtaining 'verifiable consent of a parent' applies when a Data Fiduciary relies on consent as the legal basis for processing the personal data of a child. However, the manner of obtaining such verifiable consent from the parent will be prescribed by the Central Government through delegated legislation or 'rules', which will be introduced in due course.

On 3 January 2025, the Ministry of Electronics and Information Technology released the Draft Digital Personal Data Protection Rules, 2025 (Draft DPDP Rules) for public consultation. At this stage, these are a draft version of the rules, which will be finalised after the stakeholder consultation process and become effective only upon their notification in the official gazette. Accordingly, the proposed Draft DPDP Rules will likely undergo changes before they are finalised and notified by the government.

That said, Proposed Rule 10(1) of the Draft DPDP Rules proposes that a Data Fiduciary is required to adopt technical and organisational measures to ensure that verifiable consent of a parent is obtained before processing the personal data of a child. The term 'technical and organisational measures' has not been defined. The Draft DPDP Rules further propose that Data Fiduciaries are required to observe due diligence and ensure that the person identifying as the parent is an identifiable adult. This can be done by referring to (i) reliable age and identity details of the parent already available with the Data Fiduciary; (ii) age and identity details voluntarily provided by the parent; or (iii) a virtual token mapped to such details, issued by an entity entrusted by law or the Government with the issuance and maintenance of such details, including a Digital Locker service provider.

At what age can children legally consent to the processing of their own personal data, such that parental permission/ consent is not required?

No age is specified under the SPDI Rules.

Under the DPDP Act, only in specific circumstances notified by the Central Government.

The Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011 do not stipulate any age when children can legally consent to the processing of their personal data.

While the Digital Personal Data Protection Act, 2023 (DPDP Act) requires verifiable consent from a parent before processing any personal data of a child, the Central Government may, if satisfied that a Data Fiduciary has ensured that its processing of children's personal data is done in a manner that is verifiably safe, notify an age above which that Data Fiduciary is exempt from the requirement to obtain verifiable consent of the parent.

Further, the Central Government can notify purposes and classes of Data Fiduciaries that can be exempted from seeking verifiable consent, subject to conditions prescribed by it.

The Draft Digital Personal Data Protection Rules, 2025 do not specify any age above which a Data Fiduciary would be exempt from the applicability of the requirement of obtaining verifiable consent of the parent.

Are there specific requirements in relation to collection and/or verification of parental consent/ permission concerning the processing of a child’s personal data?

No, under SPDI Rules.

Yes, under the DPDP Act.

There is no such requirement under the Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011.

Section 9(1) of the Digital Personal Data Protection Act, 2023 (DPDP Act) requires a Data Fiduciary to obtain verifiable consent of the parent. The manner of obtaining verifiable consent will be prescribed by the Central Government through delegated legislation/rules.

Proposed Rule 10(1) of the Draft Digital Personal Data Protection Rules, 2025 (Draft DPDP Rules) proposes that a Data Fiduciary should adopt technical and organisational measures to ensure that verifiable consent of the parent is obtained prior to processing a child's personal data, and should observe due diligence to check that the person identifying as the parent is an identifiable adult. This can be done by referring to reliable age and identity details available with the Data Fiduciary, age and identity details voluntarily provided by the parent, or a virtual token mapped to such details issued by an authorised entity/Government.

Are there any particular information or transparency requirements concerning the processing of children’s personal data?

Yes, under both SPDI Rules and the DPDP Act.

Since there is no specific provision governing the processing of a child's personal data under the Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011 (SPDI Rules), the general information and transparency requirements, including taking reasonable steps to ensure awareness of the information being collected and the purpose of collection, would apply where a child's personal data is being processed. (Rule 5, SPDI Rules)

The principle of transparency is enshrined and reflected in various provisions of the DPDP Act (Sections 5, 6 and 11). These include: providing notice to Data Principals informing them, inter alia, of the personal data being collected, the purpose of collection and the manner of exercising their rights; providing the option to access the notice and consent request in English or any of the 22 languages specified in the Eighth Schedule to the Constitution of India; and the right to access information about the personal data being processed, including the identities of other Data Fiduciaries and Data Processors with whom the personal data has been shared.

Can children directly exercise their rights in relation to their personal data without the involvement of their parents?

No, under both SPDI Rules and the DPDP Act.

The Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011 do not have any provisions that allow or disallow children from directly exercising their rights about their personal data.

Under the Digital Personal Data Protection Act, 2023 (DPDP Act), the general requirement is that verifiable consent of a parent is obtained before processing the personal data of children, which practically entails that rights will be exercised by the parent on behalf of the child. However, where the Central Government notifies exemptions so that verifiable parental consent is not required for a child above a notified age, the rights will practically have to be exercised by the child themselves. Further clarity in this regard will be available in due course as jurisprudence on this subject develops.

Can children make complaints on their own behalf directly to your national data protection/ privacy regulator(s)?

No, under both SPDI Rules and the DPDP Act.

The Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011 contain no provision enabling a child to make complaints directly on their own behalf.

Under the Digital Personal Data Protection Act, 2023 (DPDP Act), a Data Principal (which in the case of a child includes the parent) can approach the Data Protection Board of India (Board) with its grievances. However, the Board will be a quasi-judicial body that, for certain functions, has powers akin to those of a civil court under the DPDP Act. Therefore, while the DPDP Act does not specifically clarify this, in civil matters before judicial/quasi-judicial bodies, such as matters before the Board, a proceeding can only be instituted on behalf of a minor by a parent/guardian. (Sections 13 & 28, DPDP Act read with Order XXXII, Code of Civil Procedure, 1908)

Are there any particular requirements/ prohibitions related to:

a. processing specific types of children’s personal data;

b. carrying out specific processing activities involving children’s personal data; and/ or

c. using children’s personal data for specific purposes.

No, under SPDI Rules.

Yes, under the DPDP Act.

There are no specific provisions under the Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011 concerning the processing of children’s personal data.

The Digital Personal Data Protection Act, 2023 (DPDP Act) provides the following requirements/ prohibitions in relation to children’s personal data:

a) processing specific types of children’s personal data

Section 9(1) of the DPDP Act states that a Data Fiduciary is required to obtain ‘verifiable consent’ of a parent for processing any type of children’s personal data. The manner of obtaining the verifiable consent will be prescribed by the Central Government through rules.

b) carrying out specific processing activities involving children’s personal data;

Section 9(2) of the DPDP Act states that the Data Fiduciary shall refrain from processing personal data that is likely to cause any detrimental effect on the well-being of a child.

Further, Section 9(3) of the DPDP Act states that the Data Fiduciary shall refrain from tracking or behavioural monitoring of children or targeted advertising directed at children.

c) using children’s personal data for specific purposes

Section 9(4) of the DPDP Act provides that the Central Government may notify the purposes and classes of Data Fiduciaries that may be exempted from seeking verifiable consent or refraining from tracking or behavioural monitoring of children or targeted advertising directed at children, subject to the conditions prescribed by it in the rules to the DPDP Act.

The Draft Digital Personal Data Protection Rules, 2025 (Draft DPDP Rules) propose the requirement of obtaining verifiable consent from the parent/guardian, and the prohibition from undertaking tracking or behavioural monitoring of children or targeted advertising directed at children. However, per Proposed Rule 11, these restrictions will not apply to entities (and for corresponding purposes) as identified in the Fourth Schedule of the Draft DPDP Rules.

Under the Fourth Schedule of the Draft DPDP Rules, certain entities and individuals are exempt from the above requirements if their data processing is limited to specific purposes. These entities and the corresponding purposes to which the processing is restricted have been identified below:

  • clinical establishments, mental health establishments, healthcare professionals, if processing is limited to provision of health services to the extent necessary for the protection of health
  • allied healthcare professionals, if processing is limited to supporting implementation of any healthcare treatment and referral plan recommended by such professional to the extent necessary for the protection of health
  • educational institutions, if processing is limited to tracking and behavioural monitoring for the educational activities or in the interests of safety of enrolled children
  • caregivers for infants and children in crèches or daycare centres, if processing is limited to tracking and behavioural monitoring in the interests of safety of children entrusted in the care of such institution, crèche or centre
  • individuals engaged by these institutions for transporting children if processing is limited to tracking the location of such children, in the interests of their safety, during the course of their travel.

Further, Section 9(5) provides that the Central Government, if satisfied that a Data Fiduciary has ensured that its processing of children's personal data is done in a "verifiably safe" manner, may notify an age lower than 18 for such Data Fiduciary. The Data Fiduciary can then process the personal data of children above the notified age and be exempt from both or either of the requirements relating to verifiable parental consent and/or the restriction on tracking or behavioural monitoring of children or targeted advertising directed at children.

Has there been any enforcement action by your national data protection/ privacy regulator(s) concerning the processing of children’s personal data? In your answer, please include information on the volume, nature and severity of sanctions.

No, under both SPDI Rules and the DPDP Act.

There have not been any enforcement actions concerning the processing of children’s personal data under the Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011.

Since the Digital Personal Data Protection Act, 2023 has not yet come into force, there has not been any enforcement action taken by the Data Protection Board of India (which is also yet to be established under the Act) concerning the processing of personal data of children.

Electronic direct marketing and advertising


Are there specific rules concerning electronic direct marketing to children?

Yes.

Any electronic direct marketing involving the processing of children’s personal data requires compliance with the Digital Personal Data Protection Act, 2023 (DPDP Act). Section 9 of the DPDP Act imposes requirements pertaining to obtaining parental consent, prohibits undertaking tracking or behavioural monitoring of children or targeted advertisements directed at children and prohibits processing personal data that is likely to have a detrimental effect on a child.

Additionally, the Consumer Protection Act, 2019 is the primary legislation governing matters of consumer protection in India and has established the Central Consumer Protection Authority (CCPA) to regulate matters relating to false or misleading advertisements that are prejudicial to the interests of the public and consumers, among other things.

The CCPA has issued the Guidelines for Prevention of Misleading Advertisements and Endorsements for Misleading Advertisements, 2022, which regulate all kinds of advertisements (regardless of form, format or medium) and specifically cover 'children targeted advertisements'.

For further information, see the responses in this section.

Are there specific rules concerning the use of adtech tracking technologies, profiling and/or online targeted advertising to children?

Yes.

Sections 9(2) and 9(3) of the Digital Personal Data Protection Act, 2023 (DPDP Act) prohibit the Data Fiduciary from undertaking the processing of personal data that is likely to cause any detrimental effect on the well-being of a child. The DPDP Act, subject to exemptions notified by the Central Government, also prohibits tracking/behavioural monitoring of children or targeted advertising directed at children.

The Central Government is empowered by Section 9(4) to notify purposes and classes of Data Fiduciaries that can be exempted from seeking verifiable parental consent, or from the restriction on tracking or behavioural monitoring of children or targeted advertising directed at children subject to conditions prescribed by it.

Further, under Section 9(5), the Central Government, if satisfied that a Data Fiduciary has ensured that its processing of children's personal data is done in a "verifiably safe" manner, may notify an age lower than 18 for such Data Fiduciary. The Data Fiduciary can then process the personal data of children above the notified age and be exempt from both or either of the requirements relating to verifiable parental consent and/or the restriction on tracking or behavioural monitoring of children or targeted advertising directed at children.

Under the Fourth Schedule of the draft Digital Personal Data Protection Rules, 2025, certain entities and individuals are exempt from the requirements of obtaining verifiable consent and the prohibition from tracking or behavioural monitoring of children or targeted advertising if their data processing is limited to specific purposes as identified in the Fourth Schedule. These entities and the corresponding purposes to which the processing is restricted have been identified below:

  • clinical establishments, mental health establishments, healthcare professionals, if processing is limited to provision of health services to the extent necessary for the protection of health
  • allied healthcare professionals, if processing is limited to supporting implementation of any healthcare treatment and referral plan recommended by such professional to the extent necessary for the protection of health
  • educational institutions, if processing is limited to tracking and behavioural monitoring for the educational activities or in the interests of safety of enrolled children
  • caregivers for infants and children in crèches or daycare centres, if processing is limited to tracking and behavioural monitoring in the interests of safety of children entrusted in the care of such institution, crèche or centre
  • individuals engaged by these institutions for transporting children if processing is limited to tracking the location of such children, in the interests of their safety, during the course of their travel.

Furthermore, an online intermediary is required to undertake due diligence obligations under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, including making reasonable efforts, by itself and by causing the users of its computer resource, not to host, display, upload, modify, publish, transmit, store, update or share any information that is, inter alia, harmful to a child, or any online game that causes user harm (including any harm to children).

Are there specific rules concerning online contextual advertising to children?

Yes.

Contextual advertising that does not involve the processing of children's personal data would not be regulated under the Digital Personal Data Protection Act, 2023. Section 9(3) does prohibit Data Fiduciaries from undertaking targeted advertising directed at children; while the language of this provision is broad, the prohibition will likely apply only where children's personal data is processed for the purposes of advertising targeted at children. There will be more clarity once jurisprudence on this aspect develops.

Also, the Guidelines for Prevention of Misleading Advertisements and Endorsements for Misleading Advertisements, 2022 provide specific restrictions and conditions on children-targeted advertisements, which would also include contextual advertisements directed at children.

Has there been any regulatory enforcement action concerning advertising to children? In your answer, please include information on the volume, nature and severity of sanctions.

No.

There have been no regulatory enforcement actions concerning children-related advertisements undertaken by authorities under the Digital Personal Data Protection Act, 2023 (as the Act is yet to be enforced), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, or Guidelines for Prevention of Misleading Advertisements and Endorsements for Misleading Advertisements, 2022.

Consumer protection


At what age does a person acquire contractual capacity to enter into an agreement to use digital services?

As per Section 11 of the Indian Contract Act, 1872, a person is competent to contract if, inter alia, such person has attained the age of majority under the Majority Act, 1875, which sets the age of majority at 18 years.

Accordingly, a person acquires the contractual capacity to enter into an agreement to use digital services at the age of 18 years.

Do consumer protection rules apply to children?

Yes.

The Consumer Protection Act, 2019 (CPA) is the central legislation governing consumer protection in India that also regulates advertisements and unfair trade practices among other things.

The provisions of the CPA and the rules and guidelines issued thereunder apply to children as well. As per the definition of a 'complainant' under Section 2(5) of the CPA, the parent or legal guardian would be considered the complainant where the consumer is a minor.

Accordingly, a complaint regarding the violation of any provision of the CPA in the case of a minor consumer can only be instituted by a parent or legal guardian.

Additionally, the CPA prohibits 'misleading advertisements', which constitute an unfair trade practice. Under Section 2(28) of the CPA, a misleading advertisement is one that falsely describes a product or service, gives a false guarantee or is likely to mislead consumers as to the nature, substance, quantity or quality of such product or service, or deliberately conceals important information.

To prevent false or misleading advertisements and endorsements, the Central Consumer Protection Authority (CCPA), established under the CPA, has issued the Guidelines for Prevention of Misleading Advertisements and Endorsements for Misleading Advertisements, 2022 (Misleading Ads Guidelines). Guideline 8 of the Misleading Ads Guidelines stipulates conditions for advertisements that address, target, or use children. For further information, see the response to the following question.

Are there any consumer protection rules which are specific to children only?

Yes.

The Guidelines for Prevention of Misleading Advertisements and Endorsements for Misleading Advertisements, 2022 (Misleading Ads Guidelines) specifically concern the issue of advertisements addressing, targeting or using children.

Guidelines 8(1) and 8(2) of the Misleading Ads Guidelines mandate that advertisements addressing, targeting or using children must not, among other things:

  • condone, encourage, inspire or unreasonably emulate behaviour that could be dangerous for children;
  • take advantage of children's inexperience, credulity or sense of loyalty;
  • exaggerate the features of goods, products or services in such a manner as to lead children to have unrealistic expectations of such goods, products or services;
  • condone or encourage practices that are detrimental to children's physical health or mental wellbeing;
  • imply that children are likely to be ridiculed or made to feel inferior to others or become less popular or disloyal if they do not purchase or make use of such goods, product or service;
  • include a direct exhortation to children to purchase any goods, product or service or to persuade their parents, guardians or other persons to purchase such goods, product or service for them;
  • use qualifiers such as ‘just’ or ‘only’ to make the price of goods, products or services seem less expensive where such advertisement includes an additional cost or charge;
  • feature children for advertisements prohibited by any law for the time being in force, including tobacco or alcohol-based products;
  • feature personalities from the field of sports, music or cinema for products which under any law requires a health warning for such advertisement or cannot be purchased by children;
  • make it difficult for children to judge the size, characteristics and performance of advertised products and to distinguish between real-life situations and fantasy;
  • exaggerate what is attainable by an ordinary child using the product being marketed;
  • exploit children’s susceptibility to charitable appeals, and shall explain the extent to which participation will help in any charity-linked promotions;
  • resort to promotions that require a purchase to participate and include a direct exhortation to make a purchase addressed to or targeted at children;
  • claim that consumption of a product advertised shall have an effect on enhancing intelligence or physical ability or bring exceptional recognition without any valid substantiation or adequate scientific evidence;
  • make any health or nutritional claims without these being adequately and scientifically substantiated by a recognised body;
  • be published in any mass media, including advertisements on network games in respect of medical services, drugs, dietary supplements, medical instruments, cosmetic products, liquor or cosmetic surgery which are adverse to the physical and mental health of children;
  • be such as to develop negative body image in children; or
  • give any impression that such goods, products or services are better than the natural or traditional food that children may be consuming.

Further, Guidelines 8(3) and 8(4) set out that junk food must not be advertised during a programme meant for children or on a channel meant exclusively for children, and that advertisements offering promotional gifts to persuade children to buy goods, products or services without necessity, or promoting illogical consumerism, shall be discouraged.

Has there been any regulatory enforcement action concerning consumer protection requirements and children’s use of digital services? In your answer, please include information on the volume, nature and severity of sanctions.

No.

While there has not been any regulatory enforcement action concerning consumer protection requirements and children's use of digital services, in 2021, the Ministry of Electronics and Information Technology (MeitY) issued an advisory to parents and teachers on children's safe online gaming.

Earlier, while dealing with the risk posed by the 'Blue Whale Challenge' game, MeitY had issued an advisory noting that the game abetted suicide and outlining risk mitigation measures for parents. In 2017, the Supreme Court of India, in W.P. (C) No. 943/2017, asked the Central Government to ban the game due to the risk it posed; however, the Central Government informed the court of its inability to block it, as there were no downloadable applications of the game.

Online Digital Safety


Are there any age-related restrictions on when children can legally access online/ digital services?

Yes.

There are age-related restrictions, both directly and by extension, under various statutes, rules and regulations.

For access to certain commodities and services, there are national as well as state laws in India that implement age-related prohibitions. Such prohibitions apply to inter alia the following:

  • Consumption of alcohol: The legal age of drinking in India varies from State to State and there is an absolute prohibition placed on advertisements of alcohol (irrespective of any age-related requirements). Advertisers are accordingly not permitted to promote directly or indirectly the production, sale or consumption of cigarettes, tobacco products, wine, alcohol, liquor or other intoxicants. (Rule 7, Cable Television Networks Rules, 1994 read with Guideline 9, the Guidelines for Prevention of Misleading Advertisements and Endorsements for Misleading Advertisements, 2022)
  • Cigarettes and other tobacco products: The Cigarettes and other Tobacco Products (Prohibition of Advertisements and Regulations of Trade and Commerce, Production, Supply and Distribution) Act, 2003 prohibits the sale of cigarettes or other tobacco products to a person below the age of 18 years and in an area within a radius of one hundred yards of any educational institution to safeguard the interest of children. (Section 6, The Cigarettes and other Tobacco Products (Prohibition of Advertisements and Regulations of Trade and Commerce, Production, Supply and Distribution) Act, 2003)

In addition to the above, the Prohibition of Electronic Cigarettes (Production, Manufacture, Import, Export, Transport, Sale, Distribution, Storage and Advertisement) Act, 2019 prohibits the production, manufacture, import, export, transport, sale, distribution, advertisement and storage of electronic cigarettes. (Sections 4 & 5, Prohibition of Electronic Cigarettes (Production, Manufacture, Import, Export, Transport, Sale, Distribution, Storage and Advertisement) Act, 2019)

  • Sexually explicit content and child pornography: The Information Technology Act, 2000 penalises the online transmission of obscene and sexually explicit content, and imposes more severe penalties for offences involving material depicting children in sexually explicit acts. (Sections 67, 67A and 67B, Information Technology Act, 2000)

Similarly, the Protection of Children from Sexual Offences Act, 2012 (POCSO Act) protects children from exploitation through pornography. It mandates stringent punishment for anyone using a child for pornographic purposes, with imprisonment ranging from 5 to 7 years. (Section 14, POCSO Act)

Further, failing to report or destroy child pornographic material while storing or possessing it with an intent to transmit it is punishable with imprisonment of up to 5 years (and up to 7 years in certain cases). (Section 15, POCSO Act)

  • Online gambling: Some states have introduced laws to prohibit online gambling and regulate online games. For example, in Tamil Nadu, the Tamil Nadu Prohibition of Online Gambling and Regulation of Online Games Act, 2022 establishes the Tamil Nadu Online Gaming Authority, which is empowered to make regulations on age restrictions. (Section 3 read with Section 5, Tamil Nadu Prohibition of Online Gambling and Regulation of Online Games Act, 2022)

There are also specific requirements for permissible online real money games under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (Intermediary Guidelines), such as requirements that the online real money game not involve wagering on any outcome and that it comply with the provisions of any law relating to the age at which an individual is competent to enter into a contract. (Rule 4A(3), Intermediary Guidelines)

  • Further, while not directly applicable to a child, the Intermediary Guidelines require an online gaming self-regulating body to have a framework for verifying an online real money game, which should include measures to safeguard children against user harm, parental or access controls, and the classification of online games through an age-rating mechanism based on the nature and type of content. (Rules 4A(8)(b) & 4A(8)(c), Intermediary Guidelines)
  • Content Rating: The Intermediary Guidelines require an over-the-top (OTT) platform to classify content which it transmits, publishes or exhibits on the basis of nature and type of content, as U, U/A 7+, U/A 13+, U/A 16+ & A. For content classified as U/A 13+ or higher, the OTT platform has to ensure access control mechanisms such as parental locks are made available. Similarly, in the case of content classified with an ‘A’ rating, the OTT platform has to implement a reliable age verification mechanism for viewership of such content & take all measures to restrict access to such content by a child through the implementation of appropriate access control measures. (Rule 9 & Para II(B) & II(D), Appendix, Intermediary Guidelines)

Are there any specific requirements relating to online/ digital safety for children?

Yes.

There is no formal or statutory meaning ascribed to the terms “digital safety” and “online safety for children” in India. However, specific requirements to ensure the online/digital safety of children are put in place through various laws.

For instance, Section 67B of the Information Technology Act, 2000 prescribes punishment for publishing or transmitting, in electronic form, material depicting children in sexually explicit acts.

Rule 3(1)(b) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (Intermediary Guidelines) requires an intermediary to make reasonable efforts by itself and to cause its users to not host, display, upload, modify, publish, transmit, store, update or share any information that inter alia is harmful to a child, or is obscene, pornographic, paedophilic, invasive of another’s privacy, or an online game that causes user harm etc.

The Intermediary Guidelines further state in Rule 4A(8) that an online gaming self-regulating body shall have a framework for verifying an online real money game which should include inter alia:

  • the safeguards against user harm, including self-harm and psychological harm;
  • the measures to safeguard children, including measures for parental or access control and classifying online games through age-rating mechanism, based on the nature and type of content; and
  • the measures to safeguard users against the risk of gaming addiction, financial loss and financial fraud, including repeated warning messages at higher frequency beyond a reasonable duration for a gaming session and provision to enable a user to exclude himself upon user-defined limits being reached for time or money spent.

The Protection of Children from Sexual Offences Act, 2012 (POCSO Act) is specific legislation for protecting children from offences of sexual assault, sexual harassment and pornography. The definition of ‘child pornography’ in Section 2(1)(da) includes a visual depiction of sexually explicit conduct involving a child in a digital or computer-generated image.

Further, pursuant to Section 11 of the POCSO Act, a person is considered to be engaging in sexual harassment of a child when such person, with sexual intent, inter alia:

  • repeatedly or constantly follows or watches or contacts a child either directly or through electronic, digital or any other means; or
  • threatens to use, in any form of media, a real or fabricated depiction through electronic, film or digital or any other mode, of any part of the body of the child or the involvement of the child in a sexual act.

Moreover, Rule 11 of the Protection of Children from Sexual Offences Rules, 2020 requires any person who has received any pornographic material involving a child, or any information regarding such material being stored, possessed, distributed, circulated, transmitted, facilitated, propagated or displayed, or likely to be distributed, facilitated or transmitted in any manner, to report the contents to the local police, the Special Juvenile Police Unit or the cybercrime portal.

For further information see the responses in this section.

Are there specific age verification/ age assurance requirements concerning access to online/ digital services?

No, under current law.

Yes, under the DPDP Act

Currently, there are no explicit statutory provisions under Indian law establishing a specific requirement to implement age assurance/age verification measures, except under the Digital Personal Data Protection Act, 2023 (DPDP Act), which has been enacted but is not yet in force.

Section 9(1) of the DPDP Act requires a Data Fiduciary to obtain verifiable consent of a parent for processing the personal data of children.

The Draft Digital Personal Data Protection Rules, 2025 (Draft DPDP Rules) do not propose any age specific verification/age assurance requirements concerning access to online/digital services for the child.

However, per proposed Rule 10(1), the Draft DPDP Rules require a Data Fiduciary to observe due diligence to check that the individual identifying herself as the parent is an identifiable adult, by reference to reliable age and identity details already available with the Data Fiduciary, age and identity details voluntarily provided by the parent, or a virtual token mapped to such details issued by an authorised entity or the Government.

Moreover, per proposed Rule 11 (read in conjunction with the Fourth Schedule), the Draft DPDP Rules exempt certain entities from obtaining verifiable consent for the corresponding purposes identified in the Fourth Schedule. The exempted entities are clinical establishments, mental health establishments, healthcare professionals, allied healthcare professionals, educational institutions, caregivers for infants and children in crèches or daycare centres, and individuals engaged by these institutions for transporting children; the corresponding purposes include providing health services, supporting healthcare treatments and recommended referral plans, tracking and monitoring behaviour for educational purposes, ensuring the safety of enrolled children, and tracking location.

Further, under Rule 3(1)(b) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (Intermediary Guidelines) an intermediary is required to make reasonable efforts by itself, and to cause the users of its computer resource to not host, display, upload, modify, publish, transmit, store, update or share any information that inter alia is harmful to a child or is obscene, pornographic, paedophilic, invasive of another’s privacy or an online game that causes user harm etc.

The Intermediary Guidelines require an over-the-top (OTT) platform to classify content which it transmits, publishes or exhibits on the basis of nature and type of content, as U, U/A 7+, U/A 13+, U/A 16+ & A. For content classified as U/A 13+ or higher, the OTT platform has to ensure access control mechanisms such as parental locks are made available. Similarly, in the case of content classified with an ‘A’ rating, the OTT platform has to implement a reliable age verification mechanism for viewership of such content & take all measures to restrict access to such content by a child through the implementation of appropriate access control measures. (Rule 9 & Para II(B) & II(D), Appendix, Intermediary Guidelines)

Similarly, for selling substances such as cigarettes online, the service provider has to verify that the buyer is 18 or above in order to comply with Section 6 of the Cigarettes and Other Tobacco Products (Prohibition of Advertisement and Regulation of Trade and Commerce, Production, Supply and Distribution) Act, 2003.

Are there requirements to implement parental controls and/or facilitate parental involvement in children’s use of digital services?

Yes.

While there is no specific legislation on implementing parental controls or facilitating parental involvement in children’s use of digital services, provisions under various laws require parental involvement for a child’s use of digital services. This also stems from the fact that, under Indian law, a child cannot enter into a contract. Accordingly, any access to or use of digital services by a child requires parental involvement.

Further, Rule 4A(8)(c) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (Intermediary Guidelines) requires an online gaming self-regulating body to provide a framework for verifying an online real money game, which must include measures for parental or access control and an age-rating mechanism, based on the nature and type of content, for classifying games.

The Intermediary Guidelines require an over the top (OTT) platform to classify content which it transmits, publishes or exhibits on the basis of nature and type of content, as U, U/A 7+, U/A 13+, U/A 16+ & A. For content classified as U/A 13+ or higher, the OTT platform has to ensure access control mechanisms such as parental locks are made available. Similarly, in the case of content classified with an ‘A’ rating, the OTT platform has to implement a reliable age verification mechanism for viewership of such content & take all measures to restrict access to such content by a child through the implementation of appropriate access control measures. (Rule 9 & Para II(B) & II(D), Appendix, Intermediary Guidelines)

Section 9(1) of the Digital Personal Data Protection Act, 2023 requires Data Fiduciaries to obtain verifiable consent from parents or guardians before processing children’s data.

Proposed Rule 10(1) of the Draft Digital Personal Data Protection Rules, 2025 (Draft DPDP Rules) proposes that a Data Fiduciary adopt technical and organisational measures to ensure that verifiable consent of the parent is obtained prior to processing a child’s personal data, and observe due diligence to check that the person identifying as the parent is an identifiable adult, by reference to reliable age and identity details already available with the Data Fiduciary, age and identity details voluntarily provided by the parent, or a virtual token mapped to such details issued by an authorised entity or the Government.

Further, proposed Rule 11 of the Draft DPDP Rules exempts certain entities (for the corresponding purposes identified in the Fourth Schedule of the Draft DPDP Rules) from the requirement to obtain verifiable consent and from the prohibitions on tracking or behavioural monitoring of children and targeted advertising directed at children.

The entities that are proposed to have been exempted from the above-mentioned requirements are clinical establishments, mental health establishments, healthcare professionals, allied healthcare professionals, educational institutions, caregivers for infants and children in crèches or daycare centres, and individuals engaged by these institutions for transporting children.

The exemption has been granted in respect of corresponding processing activity that is restricted to providing health services, supporting healthcare treatments and recommended referral plans, tracking and monitoring behaviour for educational purposes, ensuring the safety of enrolled children, and tracking the location, as identified in the Fourth Schedule of the Draft DPDP Rules.

Has there been any regulatory enforcement action concerning online/ digital safety? In your answer, please include information on the volume, nature and severity of sanctions.

Yes.

In 2017, the Ministry of Electronics and Information Technology issued an order directing all Internet Service Providers (ISPs) holding Cable Landing Station Gateway/International Long-Distance licences in India to adopt and implement the Internet Watch Foundation resources, which contain an updated list of websites/URLs hosting online child sexual abuse material (CSAM), and to disable or remove access to the same. The order further directed ISPs to observe due diligence requirements, including the obligation to expeditiously remove or disable access to any unlawful content brought to their notice by the relevant authorities. (Section 79(2)(c) and Section 2(1)(w), Information Technology Act, 2000)

In Case No. 2024 INSC 716, the Supreme Court of India held that watching and storing child pornographic material without deleting or reporting it would indicate an intention to transmit it and would accordingly constitute an offence under the Protection of Children from Sexual Offences Act, 2012.

In 2020, teenage boys created an Instagram group called ‘Bois Locker Room’ in which they shared objectionable images of girls. The Delhi Police Cyber Cell took suo motu cognisance of the matter, filed First Information Reports, questioned 12 minors and 15 adults, seized their phones for scrutiny, and arrested a few members, who were later released on bail. The case is currently pending before the Chief Metropolitan Magistrate.

In Case No. 2023 SCC OnLine Del 2268, the Delhi High Court directed a search engine company to restrain the dissemination of videos on a video-sharing platform that used morphed images to show the plaintiff, a minor at the time, as critically ill or deceased. The court emphasised the protection of a child’s right to privacy and the need to prevent the spread of misleading information about a minor’s health and well-being. Taking a strict stance and highlighting that it would show zero tolerance in cases concerning harm to children, the court granted an ad interim injunction in favour of the plaintiff restraining the defendants from disseminating or further transmitting the videos and from creating, publishing, uploading, sharing or disseminating any identical videos with similar content, and directing them, among other things, to take down URLs relating to the physical health and well-being of the plaintiff whenever brought to their notice by the plaintiff.

Further, in January 2024, the National Commission for Protection of Child Rights summoned a video sharing platform over the platform’s alleged failure to curb child sexual abuse material and other content harmful to children. Further, criminal complaints (or First Information Reports) have also been filed by the cyber cell of Maharashtra police against platform officials and some of the people running channels that hosted alleged child sexual abuse material and other content harmful to children.

Artificial intelligence

Back to top

Are there any existing requirements relating to children and AI in your jurisdiction?

No.

N/A

Are there any upcoming requirements relating to children and AI in your jurisdiction?

No.

However, a Report on AI Governance and Guidelines Development by a Sub-Committee constituted by the Ministry of Electronics and Information Technology (MeitY) was issued on 6 January 2025, seeking public comments on developing an AI governance framework in India. One of the Sub-Committee’s recommendations is the formation of a sub-group that will work with MeitY to suggest specific measures that may be considered under the proposed Digital India Act to strengthen and harmonise the legal framework, regulatory and technical capacity, and the adjudicatory set-up for the digital industries.

Has there been any other regulatory enforcement activity to date relevant to children’s use of digital services? In your answer, please include information on the volume, nature and severity of sanctions.

Yes.

Various ministries and regulatory authorities have, from time to time, issued advisories to digital platforms with a particular focus on the impact on young people. For instance, the Central Consumer Protection Authority and the Ministry of Information & Broadcasting have issued advisories regarding the endorsement of betting and gambling activities by celebrities and influencers, and have directed social media platforms to disable access to posts, links, etc. publishing advertisements and branded content of offshore online betting and gambling platforms.

The Ministry of Electronics and Information Technology has issued notices to social media platforms requiring them to ensure there is no child sexual abuse material on their platforms.

Are there any other existing or upcoming requirements relevant to children’s use of digital services?

Yes.

The Ministry of Electronics and Information Technology has proposed the Digital India Act (DIA), which would replace the Information Technology Act, 2000. While the draft of the DIA has not been circulated, a presentation made available by the government indicates that the DIA will contain provisions specifically addressing child safety: it will introduce age-gating by regulating addictive technologies and protecting minors’ data, ensure the safety and privacy of children on social media platforms and gaming and betting apps, and introduce a mandatory ‘do not track’ requirement to prevent children from being targeted as data subjects for advertising, etc.

Contributors

Suvarna Mandal

Saikrishna & Associates
