NHRC issues notices over alleged DPDP Act violations by AI, social media, edtech platforms

The Confluence of Privacy and Human Rights: NHRC’s Proactive Stance on DPDP Act Violations

In a landmark development that underscores the evolving landscape of digital rights in India, the National Human Rights Commission (NHRC) has recently issued comprehensive notices to several key government ministries. This action stems from alleged violations of the Digital Personal Data Protection (DPDP) Act, 2023, by Artificial Intelligence (AI) systems, social media giants, and Educational Technology (EdTech) platforms. As a Senior Advocate observing the intersection of technology and constitutional mandates, I view this move not merely as a regulatory hurdle for tech companies, but as a significant step toward recognizing data privacy as an inseparable facet of human dignity and the Right to Life under Article 21 of the Indian Constitution.

The bench, led by NHRC member Priyank Kanoongo, initiated these proceedings following a detailed complaint rooted in an investigative report by the think tank ASIA (Alliance for Sustainable & Inclusive Africa/Asia). The notices have been dispatched to the Ministry of Electronics and Information Technology (MeitY), the Ministry of Education, and the Ministry of Communications, with copies marked to the Ministry of Home Affairs. This multifaceted approach suggests that the NHRC perceives the breach of data privacy not as an isolated technical glitch, but as a systemic threat affecting education, national security, and civil liberties.

Deconstructing the Allegations: AI, Social Media, and the Vulnerable EdTech Sector

The crux of the complaint involves the pervasive and often opaque manner in which digital personal data is harvested and processed. AI platforms, which rely on gargantuan datasets for training, have frequently been accused of “scraping” personal information without explicit or informed consent. In the Indian context, the DPDP Act 2023 was specifically enacted to curb such practices. However, the report by ASIA highlights that many AI deployments continue to operate in a legal gray zone, often prioritizing algorithmic efficiency over the “Right to be Forgotten” or the “Right to Correction.”

The EdTech Predicament

Perhaps the most sensitive area touched upon by the NHRC is the EdTech sector. During and after the pandemic, India saw a meteoric rise in digital learning platforms. While these platforms have democratized access to education, they have also become massive repositories of data concerning minors—one of the most vulnerable demographics. The DPDP Act mandates “verifiable parental consent” for processing the data of children. The allegations suggest that several EdTech companies have bypassed these safeguards, using aggressive tracking mechanisms to profile students for commercial gains, thereby infringing upon their right to a safe and private developmental environment.
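The "verifiable parental consent" requirement can be pictured as a simple gate that a compliant platform must clear before processing a minor's data. The sketch below is purely illustrative — the DPDP Act prescribes no particular implementation, and every identifier here (`is_child`, `may_process`, the age threshold constant) is my own shorthand, not statutory language:

```python
from datetime import date

# The DPDP Act, 2023 defines a "child" as an individual below 18 years of age.
CHILD_AGE_THRESHOLD = 18

def is_child(dob: date, today: date) -> bool:
    """Return True if the data principal is under 18 on `today`."""
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age < CHILD_AGE_THRESHOLD

def may_process(dob: date, parental_consent_verified: bool, today: date) -> bool:
    """Illustrative gate: a child's data may be processed only with
    verifiable parental consent; adults consent for themselves
    (verified elsewhere in a real system)."""
    if is_child(dob, today):
        return parental_consent_verified
    return True
```

The allegation, in these terms, is that several EdTech platforms never evaluate the gate at all — they process first and treat consent as a formality.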

Social Media and Algorithmic Bias

Social media platforms remain the primary battleground for data sovereignty. The NHRC’s involvement highlights concerns regarding how these platforms utilize personal data to manipulate user behavior through “dark patterns” and opaque algorithms. When a platform processes data to influence a user’s psychological profile or political leaning without transparent disclosure, it moves beyond a simple privacy violation into the realm of a human rights violation, affecting the freedom of thought and expression.

The Legal Foundation: The DPDP Act 2023 and the Puttaswamy Doctrine

To understand the gravity of the NHRC’s notices, one must look at the legal bedrock upon which these claims are built. The Digital Personal Data Protection Act, 2023, was the culmination of years of judicial and legislative deliberation, following the historic 2017 Supreme Court judgment in Justice K.S. Puttaswamy (Retd.) v. Union of India. The Court unequivocally declared that privacy is a fundamental right. The DPDP Act was designed to provide a statutory framework to protect this right in the digital age.

Key Provisions Under Scrutiny

The NHRC’s inquiry likely focuses on several key pillars of the DPDP Act. First is the concept of the ‘Data Fiduciary’—the entity that determines the purpose and means of processing personal data. Under the Act, fiduciaries are held to a high standard of accountability. They must ensure that data is processed only for a “lawful purpose” for which the “Data Principal” (the individual) has given consent. The notices suggest that AI and social media companies may be failing in their duty to provide “clear and plain language” notices to users, a mandatory requirement under Section 5 of the Act.
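The purpose-limitation logic at the heart of these provisions can be sketched as a consent record bound to the specific purposes disclosed to the individual. Again, this is an illustrative model only — the Act mandates the principle, not any data structure, and names such as `ConsentRecord` and `can_process` are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """A data principal's consent, bound to purposes that were
    disclosed in "clear and plain language" at the time of consent."""
    principal_id: str
    purposes: set[str] = field(default_factory=set)
    withdrawn: bool = False

def can_process(record: ConsentRecord, purpose: str) -> bool:
    """A fiduciary may process data only for a purpose the principal
    actually consented to, and only while consent stands."""
    return not record.withdrawn and purpose in record.purposes

# A student consents to grading, not to advertising:
consent = ConsentRecord("student-42", {"exam-grading"})
```

On this model, repurposing the same data for, say, targeted advertising fails the check — which is precisely the kind of silent repurposing the complaint alleges.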

Furthermore, Section 9 of the Act specifically addresses the processing of personal data of children, prohibiting any processing that is likely to cause a “detrimental effect” on the well-being of a child. By issuing notices to the Ministry of Education, the NHRC is signaling its intent to scrutinize whether EdTech platforms are exploiting the data of young learners for targeted advertising or psychological profiling, which would be a direct contravention of the Act.

The NHRC’s Jurisdiction: Why a Human Rights Body?

A common question in legal circles is why the NHRC, rather than just the Data Protection Board (DPB), is taking such an active role. The answer lies in the intrinsic link between data and personhood. In the modern era, our digital footprints are extensions of our physical selves. When a state or a private corporation misuses data, it can lead to discrimination, loss of livelihood, or even physical threats through surveillance. Under the Protection of Human Rights Act, 1993, the NHRC has the mandate to intervene in matters where the state’s failure to regulate leads to a violation of constitutional rights.

By involving MeitY and the Ministry of Communications, the NHRC is questioning the adequacy of the current regulatory oversight. If the government fails to enforce the DPDP Act effectively, it can be seen as “state inaction,” which falls squarely within the NHRC’s investigative purview. This is a significant escalation; it moves the conversation from “regulatory compliance” to “human rights accountability.”

The Role of Government Ministries: A Multi-Departmental Challenge

The distribution of notices to multiple ministries reflects the complex nature of the digital ecosystem. Each ministry holds a piece of the puzzle, and the NHRC is demanding a cohesive response.

Ministry of Electronics and Information Technology (MeitY)

As the primary architect of the DPDP Act and the IT Rules, MeitY is responsible for the technical implementation and the establishment of the Data Protection Board. The NHRC is likely seeking clarity on why the enforcement mechanisms have not yet deterred the alleged violations by AI and social media platforms. The delay in fully operationalizing the Data Protection Board has created a regulatory vacuum that these platforms are allegedly exploiting.

Ministry of Education

The Ministry of Education’s involvement is critical for the EdTech sector. The NHRC wants to know what guidelines are in place to ensure that schools and private learning apps are not compromising student data. There is growing recognition that “Digital India” initiatives in education must be balanced with robust data-privacy protocols to prevent the commercial exploitation of young learners.

Ministry of Communications and Ministry of Home Affairs

The Ministry of Communications oversees the infrastructure through which data flows. Its role in ensuring data localization and secure transmission is paramount. Meanwhile, the Ministry of Home Affairs (MHA) is involved due to the national security implications. Data breaches and the unauthorized processing of Indian citizens’ data by foreign-linked AI or social media entities can pose significant risks to the internal security and sovereignty of the nation.

The ASIA Report: A Catalyst for Action

The report by the think tank ASIA appears to be a comprehensive dossier documenting systemic failures. While the specific details of the report are pending full public disclosure in court proceedings, it is understood to highlight “surveillance capitalism” practices. This involves the use of AI to predict and influence human behavior for profit without the individual’s knowledge. In the context of the DPDP Act, this is often done through “deceptive design” where consent is buried in lengthy, incomprehensible terms and conditions—a practice the Act explicitly seeks to end.

The ASIA report also likely touches upon the lack of transparency in AI training models. Large Language Models (LLMs) often ingest personal data from the public web, which may include sensitive personal information that was never intended for AI training. The NHRC’s notice asks the government to explain how the DPDP Act’s “right to erasure” is to be enforced once data has been absorbed into the opaque weights of a trained model.

Challenges in Enforcement and the Road Ahead

As a Senior Advocate, I anticipate several challenges in this legal journey. The DPDP Act is still in its nascent stages of implementation. The rules under the Act are being finalized, and the Data Protection Board is not yet fully functional as a quasi-judicial body. This transition period is fraught with uncertainty. Companies often argue that in the absence of finalized rules, they cannot be held strictly liable for every nuance of the Act.

However, the NHRC’s intervention reminds us that fundamental rights do not wait for “rules” to be framed. The constitutional right to privacy—though, as Puttaswamy itself held, subject to reasonable restrictions—is immediately enforceable. The tech giants—referred to as ‘Significant Data Fiduciaries’—have had a proactive duty to protect user data from the moment the Act was notified. The defense of “technical complexity” is unlikely to hold weight when the rights of millions of citizens are at stake.

The Necessity of Global Standards

India is not alone in this struggle. The European Union’s GDPR and the newly enacted AI Act serve as global benchmarks. The NHRC’s notices will likely force the Indian government to align its enforcement actions with these global standards, ensuring that India is not seen as a “data haven” where companies can operate with lower privacy safeguards than they do in the West.

Conclusion: A Call for Digital Accountability

The NHRC’s decision to issue notices over DPDP Act violations is a watershed moment for Indian digital jurisprudence. It signals to the global tech community that India is serious about its data sovereignty and the protection of its citizens’ rights. For AI developers, social media platforms, and EdTech firms, the message is clear: the era of unchecked data harvesting is over. Compliance is no longer an option; it is a human rights mandate.

As we await the responses from MeitY, the Ministry of Education, and other bodies, the legal community must remain vigilant. This case will likely set the precedent for how the DPDP Act is interpreted in the context of emerging technologies. It is an opportunity for India to lead the way in creating a digital ecosystem that is not only innovative and economically vibrant but also deeply rooted in the principles of fairness, transparency, and respect for human dignity.

For the common citizen, this action by the NHRC offers a glimmer of hope. It reaffirms that in the vast, complex world of algorithms and big data, the individual’s right to privacy remains paramount. As advocates, it is our duty to ensure that the law keeps pace with technology, ensuring that the “Digital” in Digital India always serves the “People” first.