Turkey

Global Comparative Review

Contents

Children's rights
Data protection and privacy
Electronic direct marketing and advertising
Consumer protection
Online Digital Safety
Artificial intelligence
Other issues relating to children's use of digital services

Children's rights


What is the legal age of adulthood/ majority in your jurisdiction? Are all persons below this age considered a child/ minor?

The age of majority is 18 under Article 11 of the Turkish Civil Code.

In principle, all persons under the age of 18 are considered minors. However, there are certain exceptions; for example, a minor who has completed the age of 15 may apply to the court to be declared of age, with parental consent.

Has the UNCRC been directly incorporated into national law in your jurisdiction?

Yes.

Türkiye is a signatory to the UNCRC and has directly incorporated it into national law. The UNCRC has been in effect in Türkiye since 1995.

Is there an ombudsperson/ commissioner for children in your jurisdiction?

Yes.

In line with the obligations set forth under the UNCRC, a Child Rights Monitoring and Evaluation Board (“CRME Board”) has been established; it was most recently reorganized in 2023.

Moreover, in response to the growing need for a sub-commission to work with non-governmental organizations, international organizations and professionals to promote children’s rights and child protection, a Child Rights Sub-Commission to the Human Rights Inquiry Commission was established within the Turkish Parliament. This Sub-Commission is expected to come to the forefront in regulatory matters during 2025 and the years ahead.

If there is an ombudsperson/ commissioner for children in your jurisdiction, do they have any responsibility for upholding children’s rights in the digital world or does the relevant regulator have sole competence?

Yes.

The Child Rights Monitoring and Evaluation Board (CRME Board) has been established to: carry out studies on administrative and legal regulations concerning the protection, exercise and development of children’s rights; evaluate studies carried out to inform the public of developments; make recommendations on measures that can be taken regarding children’s rights; prepare and approve strategy documents and action plans on this subject; and ensure inter-agency cooperation and coordination on children’s rights. The CRME Board’s mandate is broad and does not relate specifically to the digital world. Please also refer to the explanations regarding the Child Rights Sub-Commission in the answer above.

Data protection and privacy


Is there any standalone requirement to collect the consent of one or both parents when processing a child’s personal data (irrespective of any other obligations, e.g. the requirement to have a legal basis for processing)?

No.

Whether to obtain the consent of the parent or guardian should be evaluated on a case-by-case basis, taking into consideration the nature of the legal transaction or activity. There is no general rule requiring parental consent in all cases. However, there may be instances where it is recommended to obtain the consent of both the child and the parent or guardian.

At what age can children legally consent to the processing of their own personal data, such that parental permission/ consent is not required?

It depends on the circumstances.

The Law on the Protection of Personal Data numbered 6698 (“DP Law”) does not set a specific age at which a child may consent without parental permission. Instead, the appropriate age depends on the context of the consent. In practice, ages 13, 15, and 18 are considered, depending on the legal transaction or activity and the potential consequences involved. Consequently, a minor’s capacity to make an informed decision is assessed by reference to various laws, including the Civil Code, the Code of Obligations, and the Criminal Code.

Are there specific requirements in relation to collection and/or verification of parental consent/ permission concerning the processing of a child’s personal data?

No.

There is no specific rule governing parental consent or permission. In general, however, and similar to the EU framework, for explicit consent to be valid it must be (a) freely given, (b) specific, and (c) informed. Obtaining explicit consent through a soft opt-in—where consent is presumed unless the data subject opts out—renders such consent invalid.

For explicit consent to be valid, it must be tailored to the specific purpose of the processing activity, and its language must be clear and unambiguous. Consequently, blanket or general consents do not meet the requirements.

Please see this question for further explanations regarding the criterion of being “informed”.

Additionally, it is advisable to review any sector-specific requirements before commencing processing—for example, obtaining parental permission in the health sector when accessing minors’ health data.

Are there any particular information or transparency requirements concerning the processing of children’s personal data?

Yes.

According to the Law on the Protection of Personal Data numbered 6698 (DP Law) and the Communiqué on the Obligation to Inform, a privacy notice should, at a minimum, include (i) the identity of the data controller and its representative (if any), (ii) the purposes of processing, (iii) the transfer recipients and transfer purposes, (iv) the data collection methods and legal basis, and (v) the rights of data subjects as stipulated in the DP Law.

These details should be communicated in language that is easily understandable by children. Additionally, while not strictly mandated, the Turkish Personal Data Protection Board (“DP Board”) recommends incorporating visual elements into privacy notices aimed at children.

Can children directly exercise their rights in relation to their personal data without the involvement of their parents?

Yes.

In accordance with Article 16 of the Turkish Civil Code, minors who have the power of discernment may exercise rights closely linked to the person (e.g. the right to privacy) without parental consent. This approach has also been adopted by the Turkish Personal Data Protection Board in its decision numbered 2020/622.

Can children make complaints on their own behalf directly to your national data protection/ privacy regulator(s)?

Yes.

In accordance with Article 16 of the Turkish Civil Code, minors who have the power of discernment may exercise rights closely linked to the person (e.g. the right to privacy) without parental consent; this extends to lodging complaints directly with the regulator.

Are there any particular requirements/ prohibitions related to:

a. processing specific types of children’s personal data;

b. carrying out specific processing activities involving children’s personal data; and/ or

c. using children’s personal data for specific purposes.

No.

There is no specific prohibition regarding children’s data. However, the general principles outlined in Article 4 of the Law on the Protection of Personal Data numbered 6698 (DP Law) must be applied more strictly, with particular attention to the purpose limitation rule.

Has there been any enforcement action by your national data protection/ privacy regulator(s) concerning the processing of children’s personal data? In your answer, please include information on the volume, nature and severity of sanctions.

Yes.

There have been several decisions on children’s personal data made by the Turkish Personal Data Protection Board (DP Board). Please note that the DP Board has exclusive authority to publicize its decisions; consequently, some publications may not include complete statistics or detailed information on administrative fine amounts, and certain decisions may not be published at all.

In one case involving the processing of special categories of children’s personal data without parental consent—and without providing adequate information—the DP Board imposed an administrative fine and directed the data controller to rectify several instances of non-compliance. These included deficiencies in data security (to prevent unlawful processing), failures to meet information obligations, and lapses in the deletion, destruction, or anonymization of personal data.

A notable trend is that if foreign regulators have fined a data controller for similar issues, there is an increased risk that the Turkish regulator will also impose a fine. Additionally, when it comes to child-centric approaches, the DP Board stresses that privacy content must be designed to be appropriate for children and that the role of parents or guardians is critical when handling data subject requests.

For context, one DP Board decision regarding the processing of sensitive personal data of children without parental consent resulted in an administrative fine of approximately 30% of the then-applicable upper limit, with the primary focus on mandating corrective measures.

Electronic direct marketing and advertising


Are there specific rules concerning electronic direct marketing to children?

No.

There is no specific rule exclusively addressing children.

In principle, commercial electronic messages require the recipient's consent, with certain exceptions—such as for B2B communications. Accordingly, as explained in the questions here and here, it should be assessed whether parental consent is necessary.

Are there specific rules concerning the use of adtech tracking technologies, profiling and/or online targeted advertising to children?

Yes.

Although there is no overarching rule specifically addressing this issue, advertising law imposes particular restrictions.

The Regulation on Commercial Advertisement and Unjust Commercial Practices (“Ad Regulation”) sets out specific requirements for advertisements that (i) are directed at children, (ii) have the potential to affect children, or (iii) feature children (“Children-Specific Advertisements”). Within this framework, there are 17 sub-rules that govern either the content of the advertisement or how it is presented and used. Additionally, Article 24/2 of the Ad Regulation explicitly prohibits advertisements directly targeting children with respect to distance contracts containing an invitation to purchase.

In any case, it is essential to review any sector-specific restrictions before implementing adtech tracking technologies, profiling, or online targeted advertising aimed at children.

Are there specific rules concerning online contextual advertising to children?

Yes.

The restrictions outlined here for Children-Specific Advertisements are also applicable to online advertisements.

Accordingly, Children-Specific Advertisements cannot, among other things:

  • contain any expression or image that may adversely affect the physical, mental, moral, psychological or social development of children;
  • convey the message that having or using a particular product will give the child a physical, social or psychological advantage over other children of his/her age, or that not having that product will have the opposite effect;
  • contain elements of violence that children can imitate;
  • contain elements intended to disrupt, change or denigrate cultural, moral and positive social behaviours; or
  • directly encourage children to persuade their parents or others to acquire a good or service.

Has there been any regulatory enforcement action concerning advertising to children? In your answer, please include information on the volume, nature and severity of sanctions.

Yes.

The Advertisement Board has resolved several cases concerning unlawful advertisements targeting children. Enforcement actions have primarily included suspending the unlawful advertisements and imposing administrative fines. The Board emphasizes the importance of protecting children’s emotional well-being and innocence, as well as preventing any harm to their mental or psychological health, societal development, or cultural and moral values.

Consumer protection


At what age does a person acquire contractual capacity to enter into an agreement to use digital services?

It depends on the circumstances.

In accordance with Article 10 of the Turkish Civil Code, every person who (i) is not a minor, (ii) is not restricted and (iii) has the power of discernment has the capacity to act. For further details, please refer to the information here and here. Additionally, please note that parental consent is not a universal requirement for the execution of all contracts; instead, the applicable requirements should be evaluated separately for each digital service.

Do consumer protection rules apply to children?

Yes.

The Law on Protection of Consumers No. 6502 applies to all natural and legal persons acting for non-commercial or professional purposes. Consequently, the consumer protections established under this law are also applicable to minors.

Are there any consumer protection rules which are specific to children only?

Yes.

Please see the information here and here; those rules form part of the consumer protection regime under Turkish legislation.

Has there been any regulatory enforcement action concerning consumer protection requirements and children’s use of digital services? In your answer, please include information on the volume, nature and severity of sanctions.

Yes.

Please refer to the information here, as the Advertisement Board was established under the consumer protection regime.

Although this is not directly related to children’s use of digital services, the Directorate General for Consumer Protection and Market Surveillance does conduct market surveillance of products marketed for children. Additionally, please note that unpublished regulatory enforcement actions may exist on a case-by-case basis.

Online Digital Safety


Are there any age-related restrictions on when children can legally access online/ digital services?

No.

Currently, there are no legislative age-restriction requirements. However, it is worth noting that the Ministry of Family and Social Services (“Ministry”) is evaluating the adoption of age restrictions (potentially 13 or 16) for social network providers, drawing on global examples. These proposals are expected to be considered during 2025.

Are there any specific requirements relating to online/ digital safety for children?

Yes.

Although not specific to children, there are two main sets of regulations pertaining to online/digital safety: (i) Law No. 5651 on the Regulation of Broadcasts via Internet and Prevention of Crimes Committed Through Such Broadcasts (“Internet Law”) and (ii) the Law on the Establishment and Broadcasting Services of Radio and Television numbered 6112 (“RTUK Law”), together with their secondary legislation. “RTUK” refers to the Radio and Television Supreme Council, the regulatory body responsible for the regulation and supervision of media services.

For matters relating to internet services, the Internet Law regulates the obligations and responsibilities of content, hosting, access and collective-use providers, as well as social network providers, in combating crimes committed through such media. In general terms, the Internet Law stipulates certain instances where enforcement actions, such as content removal and/or access blocking, can be taken by either the relevant regulatory body (the Information Technologies and Communications Authority (“ITCA”)) or judicial authorities. Although these are not child-specific rules, they are widely used to protect children online and can be directed against content, hosting and access providers as well as social network providers.

In this respect, Article 8 of the Internet Law stipulates content removal/access blocking where the content constitutes one of the catalogue crimes listed in that article, which include, among others, sexual abuse of children. Article 8/A of the Internet Law regulates certain instances where a content removal and/or access blocking decision can be made in urgent, non-delayable cases concerning issues such as public order and the prevention of criminal activity; it allows judicial authorities, the President and the relevant ministers (including those responsible for family and children) to request content removal/access blocking. Article 9/A of the Internet Law allows persons to request access blocking on the grounds of violation of private life, which can also be exercised effectively on behalf of children. Lastly, please note that the now-repealed Article 9 of the Internet Law previously provided the opportunity to request content removal/access blocking in cases of violation of personal rights.

In addition to these general takedown-related obligations, the Internet Law imposes more specific obligations on social network providers, who must undertake the necessary precautions to offer differentiated services specific to children. For completeness, as of February 2025 the Information Technologies and Communications Authority recognizes a number of social network providers as having more than one million daily accesses from Türkiye. When providing commercials, content and other services to users who can be identified as children, social network providers shall take into consideration:

  • The age of the child,
  • The best interests of the child,
  • Protection of the child's physical, psychological and emotional development,
  • Preventing the risks of child sexual abuse and commercial exploitation,
  • Ensuring a high level of privacy settings and minimal data processing to protect the child's personal data,
  • Presenting matters such as the contract, user settings and data policies in a way that the child can understand.

On the other hand, the RTUK Law and its secondary legislation, the Regulation on the Provision of Radio, Television and On-Demand Media Services via the Internet Environment (“On-Demand Media Regulation”), regulate and supervise online media services, specifically on-demand media services. The On-Demand Media Regulation extends RTUK’s licensing, content and advertisement-related regulation and supervision powers to on-demand media service providers. Accordingly, the obligations designed for conventional broadcasters within RTUK’s regulation and supervision scope, including the principles for media services, are deemed applicable to online service providers, although in practice they are not enforced as strictly as for traditional media.

A number of provisions within the RTUK legislation aim to protect children. For instance, under the general principles for media services: (i) broadcasts cannot contain abuse of, or incite violence against, children; (ii) discrimination and physical, emotional, verbal and sexual violence against children shall not be encouraged; (iii) scenes of programs in which children have roles or participate shall not comprise elements that could impair their physical, mental or emotional development; and (iv) any award intended for children should be appropriate to their age. The principle that media services shall not be contrary to the national and moral values of society, general morality and the “principle of protection of the family” is widely used to prevent any broadcast deemed likely to impair the physical, mental or moral development of minors and young people.

There are also certain rules regarding commercial communications and product placement in on-demand media services. The general principles for commercial communications in broadcasting services stipulate that such communications cannot: harm the physical, mental or moral development of children; exploit their inexperience or naivety; direct children to purchase or rent a product or service; directly encourage children to persuade their parents or others to purchase the advertised product or service; exploit children’s trust in their parents, teachers or other persons; or place children in dangerous situations without cause. Additionally, product placement is not allowed in programs aimed at children.

As further detailed here and here, RTUK legislation sets forth rules on how media services that may be harmful to children can be offered and requires the implementation of parental controls.

Are there specific age verification/ age assurance requirements concerning access to online/ digital services?

No.

Under the Internet Law, there is no legislative age-restriction requirement currently in force. However, the Ministry has announced that work is under way to evaluate the adoption of age restrictions of 13 or 16 for social network providers within the course of 2025. Any such legislative amendment is likely to introduce age verification/assurance requirements for social network providers.

In terms of the RTUK Law and its secondary legislation (see more information here), age verification/age assurance requirements are not applicable; however, a requirement on parental controls (see more information here) applies. Moreover, in accordance with the Regulation on the Procedures and Principles of Broadcasting Services (“Broadcasting Regulation”), on-demand media service providers shall ensure that media services likely to impair the physical, mental or moral development of minors and young people are only made available in such a way that minors will not normally hear or see them. Such programs shall not be broadcast without encryption or a similar protection system, without precautions ensuring that the subscriber is an adult, or in a way accessible to children.

Where an on-demand media service provider ensures that its service is offered through a private membership system to which no child can become a member, the content restrictions set forth above will not apply.

Are there requirements to implement parental controls and/or facilitate parental involvement in children’s use of digital services?

Yes.

With respect to the Internet Law, parental control requirements are not yet in force; however, such developments may arise with the expected introduction of age assurance.

In terms of RTUK legislation (see more information here), the On-Demand Media Regulation obliges on-demand media service providers to take measures ensuring parental control of broadcasts that may harm the physical, mental or moral development of children.

Has there been any regulatory enforcement action concerning online/ digital safety? In your answer, please include information on the volume, nature and severity of sanctions.

No.

Under Articles 8-9/A of the Internet Law (see more information here), content, hosting, and access providers, as well as social network providers, may face administrative or judicial fines if they fail to comply with content removal or access blocking orders. In addition, social network providers can be subject to further sanctions, such as bans on receiving advertisements from taxpayers in Türkiye (ad ban) or bandwidth throttling. However, in practice, fines or ad restrictions are rarely imposed solely for non-compliance with these orders. Instead, if blocking a specific URL is not feasible to prevent the unlawfulness, or if the content involves sensitive societal issues, or if violations become widespread and systematic on the platform, authorities may resort to full-site access bans or bandwidth restrictions. Such actions are particularly taken when protecting children online is a concern, as seen in recent full-site access blocking cases.

Moreover, although non-compliance with the obligation to offer differentiated services specific to children may result in an administrative fine of up to three percent of the social network provider’s global turnover, no administrative fine in this respect has yet been publicized.

As for enforcement actions pertaining to the RTUK Law (see more information here), in principle all restrictions applicable to traditional media also apply to on-demand media services. In practice, however, RTUK is more lenient towards on-demand services, since parental controls and the other provisions described here are in force for such services. Even so, there have been instances where content was removed from on-demand services on the grounds of child abuse and immoral content.

There have also been instances where RTUK sought to have scenario content amended on the grounds of sensitivity towards children. Another enforcement trend is that Turkish productions face a higher risk of such measures, and these restrictions tend to be stricter for traditional media than for on-demand media.

It is also worth mentioning that RTUK is considering expanding the scope of its supervisory authority to cover channels that stream regularly (and in a categorized manner) on video-sharing platforms.

Artificial intelligence


Are there any existing requirements relating to children and AI in your jurisdiction?

No.

The current regulatory landscape of Türkiye does not yet have general AI requirements, or AI requirements specific to children.

However, the Turkish Personal Data Protection Authority (“DP Authority”) has published certain non-binding recommendations on the protection of personal data in the field of AI. Please note that these principles contain no child-centric provisions, are very general and are aligned with the OECD principles.

Are there any upcoming requirements relating to children and AI in your jurisdiction?

No.

Although, please note that:

• A draft Artificial Intelligence Law modelled on the EU AI Act exists, albeit (i) very generic, brief and lacking child-centric provisions and (ii) with a low chance of enactment.

• The updated AI Action Plan of Türkiye (which likewise contains no child-centric objectives) stipulates that national legislation will be introduced in compliance with international norms regulating the development and use of AI systems and their placing on the market. The plan’s target of concluding the legislative efforts by 2025 appears rather ambitious; this is considered a low probability, given that efforts have not yet been initiated.

• Lastly, in October 2024, an AI Research Commission was established by the General Assembly to determine the steps to be taken to realize the benefits of AI, to establish the legal infrastructure in this field and to determine measures to prevent the risks of AI usage. The members of the Research Commission were appointed in January 2025, and the Commission was established for a period of three months.

Has there been any other regulatory enforcement activity to date relevant to children’s use of digital services? In your answer, please include information on the volume, nature and severity of sanctions.

No.

N/A

Are there any other existing or upcoming requirements relevant to children’s use of digital services?

Yes.

Although an official publication has not yet been made, the Ministry has noted that a roadmap for combating digital addiction will be established, with the results and recommendations to be shared at a future workshop with public institutions.

Moreover, as indicated above, the Information Technologies and Communications Authority and the Ministry are working on regulations regarding children’s use of social media. In this respect, they are preparing legal rules to protect children under the age of 13 or 16 on digital media platforms and advocate making age-verification systems mandatory for these platforms.

Presidential Circular No. 2022/1 on press and publishing activities is also noteworthy, as it provides critical guidance to public institutions supervising press and publishing activities likely to harm children. The circular stipulates that all necessary measures shall be taken to protect children and young adults from dangerous material on social media and in all written, verbal and visual content, that enforcement actions shall be taken where such measures fail, and that family- and child-friendly productions shall be encouraged.

The Law on the Protection of Minors from Harmful Publications imposes certain restrictions and requirements (such as labelling products as “harmful to children”) on the sale of books deemed harmful to children. Although this matter does not directly relate to digital services, such restrictions also apply to e-commerce and are therefore expected to be enforced against online platforms selling books that are not appropriate for children, with a corresponding impact on the sales and marketing processes of certain products.

Regarding the organization of tournaments and lotteries, the General Directorate of National Lottery Administration’s Regulation on Non-Cash Lotteries and Giveaways strictly prohibits the participation of minors in such activities. Consequently, this restriction may also extend to lotteries, giveaways and tournaments organized by online gaming platforms.

Contributors

Yasin Beceni

BTS & Partners

Ece Özelgin

BTS & Partners

Melis Mert

BTS & Partners

Miray Muratoğlu

BTS & Partners

Dila Ay Kocabiyik

BTS & Partners
