Media, Data Protection & Digital Privacy (IT Act, 2000; Digital Personal Data Protection Act, 2023)
- Lawcurb
- Jan 27
Abstract
The digital transformation of the media landscape has irrevocably altered the dynamics of information creation, dissemination, and consumption. In India, this transformation has unfolded within a legal framework historically anchored by the Information Technology Act, 2000 (IT Act). While the IT Act provided initial foundational rules for the digital ecosystem, including provisions for data protection and intermediary liability, it proved increasingly inadequate in addressing the complex privacy challenges posed by the data-driven business models of modern digital media platforms. The long-awaited enactment of the Digital Personal Data Protection Act, 2023 (DPDP Act) marks a paradigm shift, establishing India’s first comprehensive, cross-sectoral data protection regime based on the principle of consent. This article provides a detailed analysis of the evolution of digital privacy in India, critically examining the provisions of the IT Act, 2000 and the transformative implications of the DPDP Act, 2023 for the media sector. It explores the tension between media freedoms—including journalism, advertising, and user-generated content—and the newly codified rights of data principals. The article delves into key areas such as the redefinition of intermediary liabilities, the challenges of obtaining valid consent for journalism and targeted advertising, the nuances of lawful exemptions, and the compliance burdens on media entities. Ultimately, it argues that while the DPDP Act provides a robust framework for protecting individual privacy, its successful implementation in the media context requires a delicate balance, ensuring that the fundamental right to freedom of speech and expression is not unduly curtailed, and that innovation in the digital media space can continue responsibly.
Introduction
The media industry, traditionally encompassing print, radio, and television, has undergone a seismic shift with the advent of the internet and digital technologies. Today, media is predominantly digital, interactive, and personalized. News websites, streaming platforms, social media networks, and content aggregators form the backbone of modern information exchange. This digital media ecosystem is inherently data-centric. User engagement, preferences, location, device information, and behavioral patterns are continuously collected, processed, and analyzed to curate content, personalize news feeds, target advertisements with pinpoint accuracy, and build detailed psychographic profiles.
This pervasive data collection raises profound questions about digital privacy—the right of an individual to control their digital footprint and personal information. For decades in India, the right to privacy was a contested fundamental right, finally unequivocally affirmed by the Supreme Court in the landmark Justice K.S. Puttaswamy (Retd.) vs Union of India (2017) judgment. The Court declared privacy to be intrinsic to life and liberty under Article 21 of the Constitution and recognized informational privacy as a crucial aspect of this right. This judicial pronouncement created an imperative for a dedicated data protection law.
Prior to this, the primary legal instrument governing the digital domain was the Information Technology Act, 2000, amended significantly in 2008. It served as a framework for e-commerce, cybercrime, and digital signatures, with specific sections addressing data protection and the liability of intermediaries (like social media platforms and news websites). However, its approach was reactive, penal, and limited in scope, failing to provide a proactive set of rights and principles for data protection.
The Digital Personal Data Protection Act, 2023, enacted after extensive consultations and multiple draft bills, is designed to fill this legislative void. It establishes a comprehensive regime governing the processing of digital personal data, granting individuals (data principals) specific rights and imposing clear obligations on entities processing data (data fiduciaries). For the media sector, this new law is transformative and disruptive. It compels a fundamental re-evaluation of long-standing practices in digital journalism, advertising, audience analytics, and platform governance. The core challenge lies in reconciling two fundamental rights: the right to privacy of the individual and the right to freedom of speech and expression of the media under Article 19(1)(a) of the Constitution.
This article will provide a detailed exposition of this complex interplay. It will begin by dissecting the relevant provisions of the IT Act, 2000, highlighting its contributions and limitations. It will then proceed to a thorough analysis of the DPDP Act, 2023, unpacking its key definitions, principles, rights, and obligations. The heart of the article will focus on the specific implications of both Acts for various media stakeholders—publishers, broadcasters, digital-only news platforms, advertising networks, and social media intermediaries. Finally, it will conclude by assessing the path forward, identifying critical areas of compliance, potential conflicts, and the evolving landscape of digital privacy in Indian media.
Part I: The Foundation – Information Technology Act, 2000 and its Provisions
The Information Technology Act, 2000, was India’s pioneering step towards providing legal recognition to electronic transactions and digital commerce. Its ambit was broad, covering aspects from digital signatures to cybercrimes. In the context of data protection and media, a few specific sections are of paramount importance, particularly after the 2008 amendment which introduced Section 66A (struck down in 2015), Section 69 (powers of interception), and, most significantly, Section 43A and Section 72A.
1. Data Protection under the IT Act: A Limited Framework
» Section 43A: Compensation for Failure to Protect Data: This was the closest the IT Act came to a data protection provision before 2023. It states that if a body corporate (any company, firm, etc.) possessing, dealing, or handling any "sensitive personal data or information" (SPDI) in a computer resource is negligent in implementing and maintaining "reasonable security practices and procedures," and thereby causes wrongful loss or gain to any person, the body corporate is liable to pay damages by way of compensation to the affected person. The key limitations were:
• Scope: It applied only to "sensitive personal data," a term defined under the IT Rules, 2011, and only to "body corporates," leaving government entities potentially outside its purview.
• Reactive, Not Proactive: It triggered liability only upon a breach resulting in harm, rather than mandating a set of preventive principles for all data processing.
• No Enumerated Rights: It did not grant individuals any positive rights like the right to access, correct, or erase their data.
» Section 72A: Punishment for Disclosure in Breach of Lawful Contract: This section prescribes punishment (imprisonment up to three years and/or a fine) for a person who, while providing services under a lawful contract, secures access to any material containing personal information and, with the intent to cause wrongful loss or gain, discloses that information without the consent of the person concerned or in breach of that lawful contract. This provision was narrow, focusing on contractual breaches rather than establishing a general privacy standard.
The Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011: These rules, framed under Section 43A, provided some operational detail. They defined SPDI to include passwords, financial information, health data, sexual orientation, and biometric information. They also required body corporates to publish a privacy policy, obtain written consent for collection and use of SPDI, and allow providers of information to review and correct their data. However, their applicability was limited and enforcement was weak.
2. Intermediary Liability and Safe Harbour: The Bedrock of Digital Media (Section 79)
For the digital media ecosystem, Section 79 of the IT Act is arguably its most critical provision. It provides a "safe harbour" or conditional immunity to "intermediaries." An intermediary is defined broadly as any entity that receives, stores, or transmits a record on behalf of another or provides any service with respect to that record. This includes telecom service providers, internet service providers (ISPs), web hosting services, online marketplaces, search engines, and, most pivotally, social media platforms and digital media websites.
» The Conditional Immunity: Section 79(1) states that an intermediary shall not be liable for any third-party information, data, or communication link made available or hosted by it. This immunity is the legal foundation that allows platforms like Facebook, Twitter (X), YouTube, Instagram, and news websites with comment sections to operate without being held legally liable for every piece of content uploaded by their users.
» Conditions for Immunity (Section 79(2) & (3)): The immunity is not absolute. It is contingent upon the intermediary fulfilling specific conditions:
• The intermediary's function is limited to providing access to a communication system.
• It does not initiate the transmission, select the receiver, or modify the information contained in the transmission.
• It must observe "due diligence" as prescribed by the Central Government.
• It must expeditiously disable or remove access to unlawful content upon receiving "actual knowledge" of such content through a court order or notification by an appropriate government agency. This is the core of the "notice-and-takedown" regime.
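The notice-and-takedown condition above can be illustrated with a minimal sketch. This is a hypothetical model, not a prescribed compliance system; the function and variable names are assumptions. It encodes the post-Shreya Singhal (2015) reading of "actual knowledge" as limited to a court order or government notification:

```python
# Hypothetical sketch of the Section 79 notice-and-takedown flow:
# an intermediary retains safe harbour only if it disables access to
# content expeditiously after "actual knowledge", which the Supreme
# Court in Shreya Singhal (2015) read down to mean a court order or
# notification by an appropriate government agency.

VALID_NOTICE_SOURCES = {"court_order", "government_notification"}

def handle_notice(source: str, content_id: str, store: dict) -> str:
    """Disable access only on 'actual knowledge' from a valid source."""
    if source not in VALID_NOTICE_SOURCES:
        # A bare user complaint is not 'actual knowledge' post-Shreya Singhal.
        return "no_action"
    store[content_id] = "disabled"
    return "disabled"

store = {"post-42": "live"}
print(handle_notice("user_complaint", "post-42", store))  # no_action
print(handle_notice("court_order", "post-42", store))     # disabled
```

The sketch deliberately leaves out the "expeditiously" timelines, which the 2021 Rules specify separately for different content categories.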
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021: The "due diligence" condition was elaborated in these Rules, which significantly increased the compliance burden on "significant social media intermediaries" (SSMIs, those with a large user base). Key mandates for media platforms include:
• Appointment of resident Chief Compliance Officer, Nodal Contact Person, and Grievance Officer.
• Proactive identification and removal of content using automated tools.
• Implementation of a grievance redressal mechanism with strict timelines.
• Enabling traceability of the "first originator" of messages on end-to-end encrypted platforms (a highly contentious provision).
• For digital news publishers and curated content platforms (OTT), the Rules mandated a separate three-tier grievance redressal structure and adherence to a "Code of Ethics."
The IT Act framework, therefore, created a dual regime for media: a weak, breach-based data protection system under Sections 43A and 72A, and a conditional liability shield for platforms under Section 79, increasingly conditioned on active content moderation as per government-prescribed norms.
Part II: The Paradigm Shift – Digital Personal Data Protection Act, 2023
The DPDP Act, 2023, represents a fundamental overhaul of India's data protection landscape. It is a principle-based legislation that places the individual (data principal) at the center and establishes clear obligations for data fiduciaries.
1. Key Definitions and Scope
» Personal Data: Any data about an individual who is identifiable by or in relation to such data.
» Processing: A wholly or partly automated operation or set of operations performed on digital personal data, including collection, storage, use, sharing, etc.
» Data Principal: The individual to whom the personal data relates. In the case of a child (under 18), this includes the parents or lawful guardian; in the case of a person with a disability, it includes their lawful guardian.
» Data Fiduciary: The entity (individual, company, or state) that alone or in conjunction with others determines the purpose and means of processing personal data. A Significant Data Fiduciary (SDF) is a class of entities notified by the government based on volume, sensitivity, and risk, who bear additional obligations.
» Data Processor: An entity that processes data on behalf of a data fiduciary.
» Scope: The Act applies to the processing of digital personal data within India, whether collected online or offline and subsequently digitized. It also applies to processing outside India if it is for offering goods or services to data principals in India.
2. The Seven Principles of Data Processing
The Act mandates that processing must be done in a manner that is:
• Lawful, Fair, and Transparent.
• Purpose Limited to what is specified to the data principal.
• Data Minimized to what is necessary for the stated purpose.
• Accurate and complete.
• Stored only as long as necessary.
• Secure through reasonable safeguards.
• Accountable – the data fiduciary is responsible for compliance.
3. Lawful Bases for Processing: Beyond Consent
While consent is the primary lawful ground, the Act recognises a second ground, "certain legitimate uses" under Section 7, which permits processing without consent in enumerated situations:
» Voluntary Provision: When a data principal voluntarily provides their data for a specified purpose and has not indicated that they do not consent (e.g., writing a letter to the editor with contact details).
» State Functions and Legal Compliance: Processing for subsidies, benefits, services, and licences, for the performance of state functions, or for compliance with law or with a judgment or court order.
» Employment-Related Purposes: Including safeguarding the employer from loss or liability.
» Emergencies and Public Health: Medical emergencies, epidemics and outbreaks of disease, and disasters.
Notably, the broader "deemed consent" category of the draft DPDP Bill, 2022, which covered public-interest grounds such as fraud prevention, mergers and acquisitions, and network security, was dropped from the enacted law in favour of this narrower, enumerated list. Journalism is not explicitly mentioned, leaving media entities to argue that their processing fits within these legitimate uses or within the exemptions under Section 17.
4. Rights of the Data Principal
The Act grants several enforceable rights:
• Right to access information about personal data being processed.
• Right to correction and erasure of personal data.
• Right to grievance redressal.
• Right to nominate another individual to exercise rights in case of death or incapacity.
5. Obligations of the Data Fiduciary
Data fiduciaries must:
• Implement technical and organizational measures to ensure compliance.
• Notify the Data Protection Board (DPB) and affected data principals in case of a personal data breach.
• Erase data when the purpose is fulfilled or consent is withdrawn.
• Appoint a Data Protection Officer (DPO) and an Independent Data Auditor if designated as a Significant Data Fiduciary (SDF).
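The erasure obligation above can be sketched as a simple retention check. This is a hypothetical illustration under assumed names (`ConsentRecord`, `should_erase`, the retention ceiling), not a mechanism the Act prescribes:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical illustration of the DPDP erasure obligation: erase
# personal data once consent is withdrawn or the specified purpose
# is fulfilled (the storage-limitation principle).

@dataclass
class ConsentRecord:
    purpose: str             # the purpose notified to the data principal
    consent_withdrawn: bool  # has the data principal withdrawn consent?
    purpose_fulfilled: bool  # has the stated purpose been served?
    retained_since: datetime

def should_erase(record: ConsentRecord, max_retention_days: int = 365) -> bool:
    """Flag a record for erasure per the storage-limitation principle."""
    if record.consent_withdrawn or record.purpose_fulfilled:
        return True
    # Assumed internal retention ceiling, not a statutory figure.
    return datetime.now() - record.retained_since > timedelta(days=max_retention_days)

rec = ConsentRecord("newsletter", consent_withdrawn=True,
                    purpose_fulfilled=False, retained_since=datetime.now())
print(should_erase(rec))  # True
```

In practice a data fiduciary would run such a check across all records and log the erasures for audit, but the core decision reduces to these two statutory triggers plus an internal retention policy.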
6. The Data Protection Board of India (DPB)
An adjudicatory body established to enforce the Act, inquire into personal data breaches, impose penalties, and adjudicate complaints from data principals. Appeals against its orders lie with the Telecom Disputes Settlement and Appellate Tribunal (TDSAT).
7. Penalties
The Act prescribes substantial financial penalties, which can go up to ₹250 crore per instance for failure to take reasonable security safeguards to prevent a personal data breach.
Part III: The Media Sector at the Crossroads – Implications and Challenges
The convergence of the IT Act's intermediary rules and the DPDP Act's consent-based regime creates a complex compliance matrix for media entities.
1. Digital Journalism and News Media
» Consent for Sourcing and Reporting: A journalist building a story often collects personal data (names, statements, photographs, background information) from sources, interviewees, and public figures. Under the DPDP Act, obtaining explicit, informed consent for every piece of personal data processed is the default requirement. This could potentially chill investigative journalism where sources wish to remain anonymous or where information is gathered from public records or whistle-blowers.
» The 'Legitimate Use' and 'Public Interest' Conundrum: Will journalism qualify under the "certain legitimate uses" of Section 7 or the exemptions of Section 17? The Act does not explicitly list journalism, leaving the question open to interpretation and potential government notification. News organizations will need to develop robust internal protocols to justify data processing under these grounds, balancing their right to free expression against the individual's right to privacy. A "public interest" defence will be crucial for stories involving corruption, public safety, or matters of governance.
» Archives and Right to Erasure ('Right to be Forgotten'): A critical tension arises between the media's role as a permanent record of events (the archival function) and a data principal's right to erasure. Can a public figure demand the deletion of an old, embarrassing but factual news report? The DPDP Act allows for erasure when the purpose of processing is fulfilled or consent is withdrawn. However, if the processing is based on legitimate use/public interest, this right may be limited. This will likely lead to significant legal disputes, requiring the Data Protection Board to weigh historical significance, public interest, and individual privacy on a case-by-case basis.
» Comments Sections and User-Generated Content (UGC): News websites hosting comments sections are intermediaries under the IT Act and data fiduciaries for the personal data (usernames, email IDs, IP addresses) of commenters under the DPDP Act. They must obtain consent for collecting this data, moderate content to retain safe harbour under IT Rules, and also respond to erasure requests from users wanting their comments and associated data deleted. This dual compliance is operationally intensive.
2. Advertising Technology (Ad-Tech) and Targeted Advertising
This is arguably the area of greatest disruption. The business model of much of the digital media (social platforms, free news sites) relies on behavioral advertising, which is built on extensive tracking and profiling of users.
» The End of Implied Consent and Profiling: Current practices often rely on opaque privacy policies and implied consent for tracking across websites and apps. The DPDP Act mandates clear, specific, and itemized consent before processing personal data for advertising. This means platforms and publishers cannot bundle consent for advertising with consent for essential services. Users must be given a clear "yes/no" option for being profiled for targeted ads.
» Impact on Revenue: If a significant portion of users opts out of tracking, the efficiency and revenue of targeted advertising will plummet. Media companies, especially those offering free content, will need to explore new revenue models, such as contextual advertising (ads based on page content, not user behavior), subscription walls (consent-for-access model), or first-party data strategies where consent is obtained directly in a transparent manner.
» Role of Data Processors: The complex ad-tech ecosystem involves data fiduciaries (the publisher), data processors (ad exchanges, demand-side platforms, analytics firms), and third parties (advertisers). The DPDP Act makes the data fiduciary (publisher/platform) ultimately responsible for ensuring that all its processors comply with the law. This necessitates stringent contractual agreements and audit mechanisms throughout the ad-tech supply chain.
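The shift from bundled to itemized consent described above can be sketched as follows. This is a hypothetical illustration, not an implementation of any actual consent-management platform; the class and purpose names are assumptions:

```python
# Hypothetical sketch of itemized, purpose-specific consent under the
# DPDP Act: each processing purpose gets its own opt-in, the default
# is "not consented" (no pre-ticked boxes), and consent for targeted
# advertising cannot be bundled with consent for essential services.

class ConsentManager:
    ESSENTIAL = "service_delivery"  # needed to provide the service itself

    def __init__(self):
        # No purpose is pre-ticked: absence of a choice means "no".
        self.choices: dict[str, bool] = {}

    def record_choice(self, purpose: str, granted: bool) -> None:
        """Record the data principal's explicit yes/no for one purpose."""
        self.choices[purpose] = granted

    def may_process(self, purpose: str) -> bool:
        """Processing is permitted only for an explicit, unbundled 'yes'."""
        return self.choices.get(purpose, False)

cm = ConsentManager()
cm.record_choice(ConsentManager.ESSENTIAL, True)
# The user never opted in to ad profiling, so it must not happen:
print(cm.may_process("targeted_advertising"))   # False
print(cm.may_process(ConsentManager.ESSENTIAL)) # True
```

The design choice to make the default `False` mirrors the Act's rejection of implied consent: silence or inaction can never authorize profiling for targeted advertising.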
3. Social Media Platforms and Intermediaries
Social media companies face the most stringent regulatory overlap.
» Dual Compliance as Intermediary and Data Fiduciary: They must comply with the IT Act's due diligence, grievance redressal, and content takedown rules (including traceability) while simultaneously fulfilling all obligations of a data fiduciary (and likely an SDF) under the DPDP Act. This includes obtaining granular consent for each data processing purpose (e.g., for friend suggestions, content personalization, ad targeting, data sharing with third parties).
» Algorithmic Transparency and User Rights: The "right to information" under the DPDP Act could be interpreted to require platforms to provide meaningful explanations about how user data influences their news feed algorithms or content recommendations—a move towards algorithmic accountability that platforms have traditionally resisted.
» Children's Data (Section 9): The Act imposes strict obligations for processing children's data (under 18). Verifiable parental consent is required. Platforms must not process data in a manner that is detrimental to a child's well-being or engage in tracking, targeted advertising, or any other monitoring that could harm a child. For social media platforms popular with minors, this requires age-gating, age-verification mechanisms, and a complete overhaul of advertising models for younger users.
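The two-part structure of Section 9, verifiable parental consent plus an outright bar on certain activities, can be sketched in a few lines. This is an illustrative assumption-laden model (the function and set names are invented), not a compliance recipe, and it sidesteps the genuinely hard problem of age verification itself:

```python
# Hypothetical sketch of a Section 9 gate: processing a child's data
# requires verifiable parental consent, and tracking, behavioural
# monitoring, and targeted advertising are barred for under-18 users
# regardless of any consent obtained.

BARRED_FOR_CHILDREN = {"tracking", "behavioural_monitoring", "targeted_advertising"}

def may_process_child_data(age: int, purpose: str,
                           parental_consent_verified: bool) -> bool:
    if age >= 18:
        return True   # adult: the normal consent rules apply instead
    if purpose in BARRED_FOR_CHILDREN:
        return False  # prohibited outright for children, even with consent
    return parental_consent_verified

print(may_process_child_data(15, "targeted_advertising", True))  # False
print(may_process_child_data(15, "content_delivery", True))      # True
print(may_process_child_data(15, "content_delivery", False))     # False
```

The key point the sketch captures is that the barred purposes are not curable by parental consent, which is why platforms popular with minors must restructure, rather than merely re-consent, their advertising models.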
4. Streaming Platforms (OTT) and Content Curators
Like digital news publishers, OTT platforms are subject to the IT Rules, 2021's Code of Ethics and grievance structure. As data fiduciaries, they process vast amounts of personal data for content recommendations, subscription management, and viewing analytics. They must:
• Obtain valid consent for creating detailed viewing profiles.
• Justify data retention periods for viewing history.
• Respond to user requests to correct or erase their viewing preferences and history, which directly impacts the functioning of their recommendation engines.
Part IV: Critical Analysis and the Path Forward
The DPDP Act, 2023, is a watershed moment, but its implementation in the media sector is fraught with unresolved tensions.
1. Balancing Privacy and Free Speech: The Act does not provide a broad, journalist-centric exemption akin to some jurisdictions. The absence of journalism from the enumerated "certain legitimate uses" in Section 7 places a significant interpretive burden on the Data Protection Board and the courts. Clear guidelines or notifications specifying the conditions under which journalistic processing is exempt are urgently needed to prevent self-censorship.
2. Operational Feasibility and Compliance Costs: For small and medium-sized digital media outlets, the compliance costs of implementing consent management platforms, appointing DPOs, servicing access, correction, and erasure requests, and managing breach responses could be prohibitive. This may inadvertently consolidate the digital media space in favor of large, well-resourced corporations.
3. The Role of the Data Protection Board (DPB): The effectiveness of the entire regime hinges on the independence, expertise, and proactive stance of the DPB. It will need to develop a nuanced jurisprudence that understands the unique role of the media. Its decisions on early cases involving media entities will set crucial precedents.
4. Synergy and Conflict with IT Act, 2000: While the DPDP Act states it will override other laws to the extent of inconsistency, the relationship with the IT Act's intermediary liability regime needs clarity. For instance, if a platform removes content to comply with an IT Act takedown notice, but that content contains the personal data of an individual who then exercises their right to erasure, how is the conflict resolved? Harmonization of rules and coordinated action between the DPB and the Ministry of Electronics and Information Technology (MeitY) will be essential.
Conclusion
The journey from the IT Act, 2000 to the DPDP Act, 2023, reflects India's evolving constitutional and technological maturity in recognizing digital privacy as a fundamental right. The media sector, as both a vital pillar of democracy and a major driver of the data economy, finds itself at the epicenter of this legal transformation. The old model of unfettered data collection and behavioral profiling is legally untenable. The new regime demands transparency, purpose limitation, and genuine user autonomy.
Successfully navigating this new landscape will require media organizations to embrace "privacy by design" not as a compliance burden, but as a core ethical and operational principle. It will necessitate innovation in business models, a commitment to contextual and consensual advertising, and robust internal data governance. Legislators and regulators, in turn, must provide the clarity and balance needed to ensure that the fourth estate can continue to hold power to account without violating the citizen's private sphere.
The ultimate test of the DPDP Act will be its ability to foster a digital media ecosystem that is both vibrant and free, and respectful and protective of the individual's personal data. The interplay between Section 79 of the IT Act and the consent framework of the DPDP Act will define the future of digital expression, commercial sustainability, and informational privacy in India for decades to come. The journey has just begun, and its course will be charted in boardrooms, newsrooms, courtrooms, and within the digital interfaces where every citizen engages with the media.
Here are some questions and answers on the topic:
Question 1: How did the Information Technology Act, 2000 address data protection and privacy, and what were its key limitations before the introduction of the DPDP Act, 2023?
Answer: The Information Technology Act, 2000, primarily served as a legal framework for electronic commerce, digital signatures, and cybercrime in India. Its provisions related to data protection were limited and reactive in nature. Key sections included Section 43A, which mandated that a body corporate handling sensitive personal data must implement reasonable security practices, and if negligent, would be liable to pay compensation for any wrongful loss. Section 72A provided punishment for the disclosure of personal information in breach of a lawful contract. These sections, along with the SPDI Rules of 2011, formed the cornerstone of data protection under the IT Act. However, the framework suffered from significant limitations. It applied only to body corporates and sensitive personal data, leaving government entities and non-sensitive data largely unregulated. It did not grant individuals any proactive rights such as the right to access, correct, or erase their data. The approach was penalty-driven and triggered only after a breach occurred, rather than establishing a preventive, principle-based regime. Moreover, the IT Act lacked an independent regulatory authority dedicated to data protection. These gaps became increasingly apparent with the rise of data-driven business models and the Supreme Court’s recognition of the fundamental right to privacy in 2017, ultimately creating the need for a comprehensive law like the DPDP Act, 2023.
Question 2: What is the ‘safe harbour’ provision under Section 79 of the IT Act, 2000, and how has its conditional immunity impacted digital media platforms and intermediaries?
Answer: Section 79 of the Information Technology Act, 2000 provides a ‘safe harbour’ or conditional immunity to intermediaries, which are entities that store, transmit, or host third-party content, such as social media platforms, websites with comment sections, and content hosting services. This provision states that an intermediary shall not be held legally liable for any third-party information, data, or communication link hosted or made available through its services, provided it adheres to certain conditions. The conditions require that the intermediary’s role is limited to providing access to a communication system, it does not initiate or modify the content, it observes due diligence as prescribed by the government, and it expeditiously removes or disables access to unlawful content upon receiving actual knowledge through a court order or government notification. This conditional immunity has been fundamental to the growth of digital media platforms, allowing them to host user-generated content without facing liability for each piece of content. However, the 2021 Intermediary Guidelines significantly tightened these conditions by mandating proactive content monitoring, strict grievance redressal mechanisms, and traceability requirements for significant social media intermediaries. As a result, platforms now bear a greater burden of compliance, balancing their role as neutral conduits with increased responsibility for content moderation, which has raised concerns about impacts on freedom of expression and privacy.
Question 3: How does the Digital Personal Data Protection Act, 2023 change the paradigm of consent and lawful processing of personal data for media companies, especially in advertising and journalism?
Answer: The Digital Personal Data Protection Act, 2023 fundamentally reshapes the paradigm of consent and lawful processing for media companies by moving from implied or bundled consent to explicit, informed, and purpose-specific consent. For advertising, especially behavioral and targeted advertising, media platforms can no longer rely on lengthy privacy policies or pre-ticked boxes. The Act requires clear, itemized consent before collecting or processing personal data for advertising purposes, meaning users must be given a genuine choice to opt in. This challenges the core revenue model of many digital media companies that depend on micro-targeted ads based on user profiling. Media entities must now explore alternative models like contextual advertising or subscription-based access. In journalism, the Act introduces complexity around sourcing and reporting. While consent remains the primary ground for processing the personal data of sources, interviewees, or subjects, the Act permits non-consensual processing under the "certain legitimate uses" of Section 7, some of which may cover activities connected to the public interest. However, since journalism is not explicitly listed, publishers must carefully justify their data processing under these grounds, ensuring a balance between the right to privacy and the right to freedom of expression. This requires media houses to develop robust internal protocols for data handling, especially when dealing with sensitive stories or whistle-blowers, where seeking explicit consent may not be feasible.
Question 4: What are the key challenges faced by digital news platforms and social media intermediaries in complying with both the IT Act, 2000 (as amended) and the DPDP Act, 2023 simultaneously?
Answer: Digital news platforms and social media intermediaries face a dual regulatory burden that requires navigating often overlapping and sometimes conflicting obligations under both the IT Act, 2000 (particularly the Intermediary Guidelines of 2021) and the DPDP Act, 2023. Under the IT Act, they must comply with due diligence requirements, including proactive content moderation, grievance redressal with strict timelines, and in some cases, traceability of message originators, to retain safe harbour immunity. Simultaneously, under the DPDP Act, they are classified as data fiduciaries, requiring them to obtain granular consent for data processing, ensure data minimization, provide rights of access and erasure to users, and implement strong security safeguards. A key challenge lies in reconciling content moderation mandates with data privacy rights. For instance, if a user requests erasure of their personal data under the DPDP Act, but that data is part of a post that has been reported under the IT Act’s grievance mechanism, the platform must balance legal preservation requirements with the right to erasure. Additionally, the IT Act’s emphasis on data retention for law enforcement purposes may conflict with the DPDP Act’s storage limitation principle. Operational costs also rise significantly, as platforms need sophisticated systems for consent management, data governance, and compliance reporting. For smaller digital news outlets, these overlapping requirements may create unsustainable compliance costs, potentially stifling innovation and diversity in the digital media space.
Question 5: How does the DPDP Act, 2023 address the issue of children’s data, and what implications does this have for digital media services and platforms popular among minors?
Answer: The DPDP Act, 2023 introduces stringent protections for children’s data, defined as personal data of individuals under the age of 18. Section 9 of the Act mandates that data fiduciaries must obtain verifiable consent from a parent or legal guardian before processing any child’s personal data. Furthermore, it prohibits tracking, behavioral monitoring, or targeted advertising directed at children, and bars processing that could harm the well-being of a child. For digital media services popular among minors, such as social media platforms, streaming services, gaming apps, and educational websites, these provisions have profound implications. Platforms must implement robust age-gating and age-verification mechanisms to identify underage users, a technically and privacy-sensitive challenge. They must also redesign their data processing activities to eliminate profiling and targeted advertising for children, which may disrupt revenue models built on engagement and personalized content. Platforms cannot use data in ways that may negatively affect a child’s mental health, which requires careful curation of content and algorithms. Compliance demands significant investment in child-friendly design, parental consent management systems, and ongoing monitoring to prevent inadvertent processing of children’s data. Failure to adhere can result in severe penalties, making it imperative for media platforms to prioritize child data protection as a core component of their operational and ethical framework.
Disclaimer: The content shared in this blog is intended solely for general informational and educational purposes. It provides only a basic understanding of the subject and should not be considered as professional legal advice. For specific guidance or in-depth legal assistance, readers are strongly advised to consult a qualified legal professional.