Introduction and Background
In today’s Information Age, a growing range of entities collect, share, and process children’s personal data for many different purposes. Unfortunately, this often happens in ways that are hard for children and their parents or guardians to notice or fully understand, raising concerns about the exploitation and misuse of children’s personal data. While it is essential for children to participate in the digital, data-driven world so they can enjoy rights such as freedom of speech, privacy, and access to information, it is equally important to regulate how their personal data is handled to ensure their safety and security.
Enter the Digital Personal Data Protection Act of 2023 (DPDP Act), a significant milestone in securing India’s digital future. This act has far-reaching implications for a wide range of stakeholders. These stakeholders can be grouped into two main categories: Data Principals (individuals whose personal data is involved, including children) and Data Fiduciaries (those who determine how personal data is processed, either individually or jointly).
Challenges with Age of Consent under the DPDP Act
The DPDP Act lays out a structured framework for managing children’s personal data by data fiduciaries. It all starts with verifying the accuracy of the data and confirming the user’s age. If the user is under 18 years old, specific measures come into play. Data fiduciaries must obtain verifiable consent from the child’s parent or guardian. Once it’s established that the user is indeed a child under 18, there are limitations on how their data can be used, including restrictions on tracking their online activities. However, implementing age verification and parental consent mechanisms can be challenging in practice, mainly due to the lack of clarity in the Act’s procedures.
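The framework described above is essentially a decision flow: check age, require verifiable parental consent for users under 18, and restrict tracking of children. A minimal sketch of that flow is below; the `User` class and `may_process_data` function are purely illustrative assumptions, not part of the Act or any real compliance API.

```python
from dataclasses import dataclass

@dataclass
class User:
    age: int
    parental_consent_verified: bool = False  # verifiable consent from parent/guardian

def may_process_data(user: User, involves_tracking: bool) -> bool:
    """Illustrative age-gating check loosely modeled on the DPDP Act's framework."""
    if user.age >= 18:
        return True  # adult: ordinary consent rules apply
    if not user.parental_consent_verified:
        return False  # child: verifiable parental consent is required first
    if involves_tracking:
        return False  # tracking of children's online activities is restricted
    return True

# Example: a 15-year-old with verified parental consent, no tracking involved
print(may_process_data(User(age=15, parental_consent_verified=True),
                       involves_tracking=False))  # True
```

Even this toy version makes the practical difficulty visible: everything hinges on `age` and `parental_consent_verified` being trustworthy, which is exactly where the Act is silent.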
- Ambiguity in the Law: For now, the Act offers only a broad framework signaling the legislature’s intent to regulate such processing; specific rules and criteria are expected later. Crucially, the Act does not define or explain what “verifiable” consent means. Policymakers should outline best practices while weighing the costs businesses face in adopting or modifying existing models. The government is already developing DigiLocker as a mechanism to verify the identity of parents and their children online, and various privacy-enhancing technologies already collect identification information from children only after obtaining verifiable consent from parents. The government could therefore explore enhancing these services to help data fiduciaries comply with this section.
- Definition of “Detrimental Effect”: The Act also lacks a definition or explanation of the phrase “detrimental effect.” While the phrase is broad enough to cover a wide range of behavior, it may create confusion and regulatory uncertainty. Without further guidance, data fiduciaries will decide for themselves which activities could have “detrimental” effects, potentially resulting in either overly restrictive data processing or compliance avoidance sheltered by the unclear language. To address these concerns, policymakers should publish a list of factors for determining “detrimental effect” and limit data fiduciaries’ latitude in classifying activities. This is especially important given the forward-looking nature of the provision.
- Comparatively High Age Limit: There’s a noticeable gap in age limits that define who qualifies as a “child” in relevant data protection provisions worldwide. Several other countries have significantly lower age limits (e.g., 13 in the UK, 16 in the EU, and 13 in the US) compared to India’s 18 years.
- Children’s Access to Better Data-Driven Services: Requiring parental consent for every website can limit children’s opportunities, forcing them to seek parental consent even for basic online activities like internet searches, creating email accounts, or exploring online content. This could hinder efforts by both the government and private organizations to bridge the digital divide and provide equitable digital access. While data control is essential, equal access should also be a priority.
- Using Consent Restrictions Effectively: While consent and age-based restrictions are common tools for protecting children’s personal data from abuse and misuse, it’s worth examining the type of opportunities that data processing offers. We should ensure that users, including children, have opportunities that promote their individual and common interests.
- Implementation Challenges: Provisions like these often face implementation challenges, particularly concerning verifiable consent requirements. These requirements can be easily bypassed by providing false age information or logging into networks as a parent or guardian. Asking for consent on every website can lead to “consent fatigue” as users are constantly bombarded with consent requests, making it difficult to thoroughly review and understand each one. This could go against the intention of obtaining informed consent from parents and guardians, potentially reducing safety for children.
- Role of Consent Managers: The DPDP Act of 2023 introduces a unique stakeholder known as “consent managers” in the data ecosystem. These managers act as a central point of contact for data principals, including children and their parents or guardians, to give, manage, review, and withdraw consent through a dedicated platform. This legislative innovation empowers data principals to keep track of and manage their consent to data fiduciaries. This could be especially helpful for parents or guardians who can’t keep track of all the sites they’ve inadvertently given consent to for data processing. However, this also raises privacy concerns.
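Conceptually, a consent manager is a single ledger through which a data principal can give, review, and withdraw consent across many fiduciaries. The sketch below is a hypothetical illustration of that idea; the Act defines the consent-manager role but prescribes no technical interface, so every name here is an assumption.

```python
class ConsentManager:
    """Toy consent ledger: one record per (principal, fiduciary) pair."""

    def __init__(self):
        self._ledger = {}  # (principal, fiduciary) -> consent record

    def give(self, principal: str, fiduciary: str, purpose: str) -> None:
        self._ledger[(principal, fiduciary)] = {"purpose": purpose, "active": True}

    def withdraw(self, principal: str, fiduciary: str) -> None:
        record = self._ledger.get((principal, fiduciary))
        if record:
            record["active"] = False  # withdrawal is recorded, not erased

    def review(self, principal: str) -> dict:
        """Everything this principal has ever consented to, active or not."""
        return {f: r for (p, f), r in self._ledger.items() if p == principal}

# A parent reviewing consents given on a child's behalf (hypothetical names)
cm = ConsentManager()
cm.give("parent-of-child", "edtech.example", "account creation")
cm.give("parent-of-child", "news.example", "cookie tracking")
cm.withdraw("parent-of-child", "news.example")
print(cm.review("parent-of-child"))
```

Note that even this minimal design concentrates a complete map of a family’s online activity in one place, which is precisely the privacy concern the paragraph above raises.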
- Children’s Right to Privacy and Right to Be Heard: While these regulations are crucial for ensuring the safety and security of children, constantly requiring parental or guardian permissions for services that involve personal data processing may compromise children’s privacy. It could also exclude them from processes and decision-making, which often happens when service providers process data. This approach overlooks the fact that children of different ages have varying levels of maturity and technological competency. For instance, children nearing eighteen years of age often have a sense of agency and autonomy, along with a desire to protect their privacy and exercise their freedom of speech. Standardizing consent requirements across age groups may infringe on their freedom of expression and create a potentially exclusionary environment, as supported by empirical studies.
- Concerns with Verifying Age and Consent: The Act offers no guidance on verifying parental consent, which makes authenticity hard to ensure. The responsibility falls even on websites that do not directly collect data (such as news or entertainment sites), many of which nonetheless gather data indirectly through tracking methods like cookies. Moreover, many websites, particularly social media platforms, already prohibit users below a certain age (often 13-16), yet many children supply false information to gain access. This raises doubts about how effectively websites can verify user age across diverse audiences. Confirming age and consent authenticity online is complex without specific technological measures: services aimed exclusively at users under 18 may manage, but platforms serving a wide age range will struggle. Identity-verification tools like Aadhaar could help, but they risk over-collection of data, including birthdate and address, which violates data minimization principles. Such disproportionate data gathering could increase risks for children instead of safeguarding them.
Way Forward
Considering the concerns outlined above, we propose the following potential actions for policymakers to effectively realize the provisions of the DPDP Act:
1. Adopting a risk-based approach: Reevaluate the age limit specified in the Act, relying on empirical studies to assess the risks and potential harms associated with data processing for different age groups below 18.
2. Child-friendly communication of notice and consent procedures: Ensure that consent-related language is child-friendly, age-appropriate, and straightforward. Regulators can use graphics and visuals to convey these concepts effectively to younger audiences.
3. Mandating Impact Assessments & Internal Audits: Require Data Protection Impact Assessments (DPIAs) even for activities that seem exempted by the provision. Also, mandate comprehensive self-assessment and verification measures for data fiduciaries. This includes establishing an internal auditing mechanism to regularly review consent and parental verification processes, ensuring the use of the most effective and least intrusive technology by data fiduciaries.
Follow Kriti Singh on:
LinkedIn: https://www.linkedin.com/in/kriti-singh29/
Follow Vaishnavi Sharma on:
LinkedIn: https://www.linkedin.com/in/vaishnavi-sharma-4808b6171/