Saturday, April 5, 2025

The ‘parental consent’ problem with the draft rules of the DPDP Act



Personal data is central to digital economies, enabling access to a range of services while also creating vulnerabilities. Data protection laws serve as safeguards against misuse, and recent regulations increasingly address the safety of children in digital spaces. India’s Digital Personal Data Protection Act, 2023 (DPDP Act) recognises that children under the age of 18 require enhanced protection for their personal data because of their limited capacity to navigate digital risks. To this end, the law mandates that platforms obtain verifiable parental consent before collecting a minor’s personal information. Though well-intentioned, this requirement raises concerns about unintended consequences that could negatively affect the rights, privacy, and security of both minors and adults.

Approach to obtaining parental consent

Unlike global practices that allow greater flexibility, Rule 10 of the draft rules under the DPDP Act outlines only two methods for obtaining parental consent on digital platforms. If parents are existing users, platforms can rely on their previously collected information for verifiable parental consent; if not, verification must be done through a digital locker service or another government-authorised entity. In addition, platforms must exercise ‘due diligence’ to confirm that anyone claiming to be a parent is an identifiable adult for legal compliance. However, the rule offers no clarity on how to establish the parent-child relationship. It also implies that service providers could be held liable if minors access their platforms without parental consent, a risk reinforced by the requirement to prevent children from accessing harmful content. One interpretation is that these requirements may compel platforms to verify the age of every user.

If digital platforms struggle to implement parental consent with certainty, they may resort to aggressive data collection to avoid liability. For instance, if a child’s preferred platform differs from the one the parent uses, the platform may require the parent to register first in order to comply with the rules. This forces parents onto platforms they do not wish to use and leads to unnecessary data collection, conflicting with the principles of data minimisation and purpose limitation. If parents choose not to register, they may have to provide consent through mechanisms such as DigiLocker, which could exclude users who are unable or unwilling to share such identification because of privacy concerns. Despite good intentions, the prescribed approach may restrict access to digital services, such as communication, entertainment, art, and gaming, for the very individuals it aims to protect, while putting parents’ sensitive data at risk of unnecessary disclosure.

The draft rules adopt a one-size-fits-all approach that does not take into account the hierarchy of risks across digital platforms. For example, high-risk platforms may allow anonymous interaction that exposes minors to threats such as harassment, while engagement-driven services can push minors towards excessive screen time. In contrast, other platforms incorporate safeguards, such as not recommending harmful content, usage limits, and transparent data processing, that reduce digital risks for children. Yet the draft rules require all service providers to obtain parental consent through the prescribed modes, which depend on sensitive or government-authorised data. This broad requirement complicates compliance and raises costs, including application programming interface (API) integration, staff training, data collection, storage, and verification. These costs would ultimately shift to consumers, making services less accessible to minors.

A better way out

A more helpful approach would avoid prescribing particular methods, data sources, or technical solutions, in order to ensure both efficiency and privacy protection. This would enable innovation and allow service providers to implement parental consent, for example through third-party services, in a way that suits their context. Implementation should be guided by three key factors: the nature of the use case; the level of associated risk; and practical challenges such as cost, scalability, convenience, and privacy. For lower-risk activities, such as accessing general information, viewing non-sensitive content, or writing product reviews, simpler verification methods, including self-declaration, may be appropriate. In contrast, high-risk activities, including access to age-restricted content such as alcohol, online dating, and financial transactions, necessitate more robust verification, such as government-issued documents.

Have an independent assessment body

Regulators should also use complementary strategies such as privacy- and safety-by-design principles, which can make technologies safer for minors by design. This includes developing age-appropriate standards, disabling intrusive or risky design features (such as location tracking, private messaging, and commercial targeting), and ensuring that high privacy settings are the default. Service providers should be mandated to protect the youngest users within their stated age range by maintaining appropriate safety standards. To centre children’s best interests, an independent assessment body should be established to assess and mitigate, within a specified time frame, the risks associated with collecting and processing minors’ data. Clear rules on data use, transparent oversight, and strong enforcement are essential for accountability.

We need a parental consent framework that balances a platform’s needs, transparent mechanisms for trust, and incentives for innovation in child-friendly digital spaces.

Asheef Iqubbal is a technology policy researcher at CUTS International


