
An Analysis of ‘Harm’ defined under the draft Personal Data Protection Bill, 2018


Update: This post is based on the provisions of the Personal Data Protection Bill, 2018. A new Personal Data Protection Bill 2019 was introduced in the Lok Sabha in the Winter Session of the Parliament in December 2019. However, this post is not affected as the provisions it is based on have remained unchanged in the new Bill.

The previous post in this series discussed the nature of harm in the digital ecosystem as it is currently understood in scholarship in the area. India’s draft Personal Data Protection Bill 2018 (the draft Bill) proposes a user-protection framework aimed at protecting users from harms and also proposes a definition of harm. This post examines the definition of harm in the draft Bill and its effect on other provisions of the draft Bill.

Our analysis leads us to caution against making provisions for user protection contingent on the occurrence of harm in the draft Bill since this can make the protections completely ineffective in practice. In addition, the ambiguity in the definition could spill over into the connected provisions introducing uncertainties for consumers, regulators and providers alike.

Indians are rapidly increasing their participation in the digital economy. We consume the highest volume of data per smartphone globally, according to some industry estimates (Ericsson, 2019). We also generate enormous amounts of data (close to 40,000 petabytes) every month (ETTech, 2018). It is crucial to complement this rapid digitisation of the economy with adequate and appropriate safeguards for consumers to achieve the welfare gains associated with the digital economy.

The previous post in this series discussed the potential for users to be exposed to serious harms such as identity theft and fraud if their data is not processed securely. Bad experiences arising from these harms can dissuade users from using digital services and hinder their access to the formal financial system, which is digitising at a fast pace (Medine, 2016). Given this background, this post will examine the definition and usage of the term “harm” in India’s draft Personal Data Protection Bill 2018 (the draft Bill). First, the definition of harm under the draft Bill is reviewed. This is followed by an analysis of its effect on other provisions in the draft Bill. The post concludes by examining the implications of this arrangement on the overall effectiveness of the proposed regulatory regime.

1. An analysis of the definition of harm in the draft Bill

Harm is generally understood as anything which worsens the condition of a person. Lawmakers narrow this broad notion to make it context-specific and actionable for regulation. The unique nature of digital harms makes them harder to define, and scholarship on data protection and regulators across jurisdictions are divided on how to define harm (IAPP, 2014).

Harm is defined under section 3(21) of the draft Bill as (Ministry of Electronics & Information Technology, 2018):

“Harm includes –

(i) bodily or mental injury;

(ii) loss, distortion or theft of identity;

(iii) financial loss or loss of property;

(iv) loss of reputation, or humiliation;

(v) loss of employment;

(vi) any discriminatory treatment;

(vii) any subjection to blackmail or extortion;

(viii) any denial or withdrawal of a service, benefit or good resulting from an evaluative decision about the data principal;

(ix) any restriction placed or suffered directly or indirectly on speech, movement or any other action arising out of a fear of being observed or surveilled; or

(x) any observation or surveillance that is not reasonably expected by the data principal.”

Though we commend the draft Bill for attempting a definition of harm, it appears that this definition could undermine the effectiveness of the proposed regulatory regime.

Harm, in the draft Bill, is caused when a data principal* suffers from one or more of the ten outcomes listed in section 3(21). The use of the word “includes” in the definition indicates that the definition is not limited to the ten outcomes which are listed in the section (Commercial Taxation Officer, Udaipur v Rajasthan Texchem Ltd, 2007).

Such a definition is called an “inclusive definition”, which must be interpreted in line with the rule of ejusdem generis, a Latin phrase meaning “of the same kind”. This rule of legal interpretation applies where a definition includes a list of items, and indicates that other items “of the same kind” as those in the list may also fall within the definition.

This approach relies on a common thread running through the items listed in such an inclusive definition, such that they all belong to the same class of objects (Legal Information Institute, n.d.). The rule helps in identifying the common thread in the listed items and in deciding whether an unlisted item belongs to the same class (and is therefore part of the definition) or not (Legal Information Institute, n.d.). For example, if a law defines a term as “including apples, mangoes, pineapples and bananas,” an interpreter can infer that the definition also includes other fruits like watermelons and pomegranates.

The definition of harm in the draft Bill includes a list of ten outcomes, but no guidance on how to interpret this definition and expand it. There is also no common thread across these categories. The definition in the draft Bill covers a variety of harms including:

  • all kinds of physical, mental and emotional injury,
  • all kinds of injury to property or reputation,
  • all kinds of interference with constitutional rights, and
  • other distinct outcomes which do not strictly fall into any of the other types.

These different types of harms are not clearly connected to each other by an underlying principle or trend.

Further, there is no indication that these harms are linked to the misuse of personal data. For instance, any discriminatory treatment is considered harmful under the definition (Dvara Research, 2018). This creates ambiguity in understanding whether negative outcomes from the legitimate use of personal data can be treated as harm. This open-ended construction would make it impossible to identify which outcomes must be treated as harm.

We raised similar concerns in our response to the draft Bill (Dvara Research, 2018). In the absence of clearly articulated criteria of inclusion and exclusion in the definition of harm, the thorny issue of identifying harm will remain unresolved. It will be hard for users to identify and demonstrate that they have been harmed by the providers. Regulators will be uncertain about when and how they must intervene and provide redressal. Likewise, providers will also find it hard to design their practices in a manner that avoids causing harm, if harm itself is not clearly defined. As a result, the proposed regulatory framework risks losing its effectiveness.

2. Interlinkages with other provisions in the draft Bill

The draft Bill provides for eleven transparency and accountability measures to help the future Data Protection Authority (DPA) track data fiduciaries’** compliance with their obligations. More importantly, they help the DPA identify when its intervention is required to protect data principals from harm.

However, some key obligations in the draft Bill are triggered only if “harm” has occurred. Given the shortcomings in the definition of harm, this could create uncertainty about whether these provisions apply.

In particular, we discuss the link between harm and two key provisions related to personal data breach notification and grievance redress.

a) The implication for data breach notification obligations: Under section 32 of the draft Bill, data fiduciaries are obliged to notify the DPA of any breach of personal data which they are processing if the breach has harmed or is likely to harm data principals. Breach notifications are an important measure in the aftermath of a breach, since consumers may need to take action to protect themselves from dangers arising from the loss or misuse of their breached personal information.

The requirement of “harm” as a pre-condition for breach notification in the draft Bill is problematic for several reasons. First, it requires the data fiduciary to make a subjective assessment of whether a breach is likely to cause harm (Dvara Research, 2018). Second, this means that even large-scale breaches would not require notification if no “harm” has yet manifested. Finally, owing to the broad and vague definition of harm in the draft Bill, data fiduciaries will find it difficult to ascertain whether harm has occurred.

b) The implication for grievance redress: Under section 39 of the draft Bill, data principals can seek redress as a result of a violation of provisions of the draft Bill. However, this provision also has a pre-condition that users must have suffered harm or be likely to suffer harm before they can seek redress.

This construction precludes users from raising grievances when (Dvara Research, 2018):

i. a violation of the draft Bill has taken place without corresponding harm, or

ii. harm has been caused but there is no violation of the draft Bill.

This is problematic for two main reasons. First, since harms from the misuse of personal data are not easily discernible (Prasad, 2019), expecting users to identify that they have been harmed and to trace the harm to a particular entity can impose near-impossible conditions for accessing grievance redress. Second, users need a thorough understanding of the draft Bill, their rights and their obligations to know if a provision has been violated or if they have been harmed. This imposes a high burden of proof on users, who may be unable to grasp these provisions because of limited literacy or education (Dvara Research, 2018).

The draft Bill must not link violation and harm for grievance redress. Users must be able to seek redress when they suffer harm or are likely to suffer harm without having to prove a violation of the draft Bill. Similarly, they must be able to seek redress when a provision is violated even when there is no prospect of harm. Claims based on harm and claims based on violations of the draft Bill must be independent of each other (Dvara Research, 2018).

3. Conclusion: Implications for the effectiveness of the proposed regulatory regime

The definition of harm in the draft Bill suffers from several weaknesses. Yet, fifteen provisions in the draft Bill are currently predicated on the definition of harm. Making harm a pre-condition for triggering provisions in the draft Bill is very problematic. Data fiduciaries may defer their obligations under the draft Bill until they are certain that harm has occurred or that it is likely to occur. Data principals may either defer or refrain from enforcing their claims under the draft Bill. This could lead to instances where harms are ignored and left unremedied until they cause damage to the user.

This situation can be avoided by ensuring that protections for users in the draft Bill do not pivot on the occurrence of harm. Instead, data fiduciaries and regulators should make best efforts to ensure that personal data is not processed or stored in a manner that can harm users. Data fiduciaries’ obligations should not depend on proof of the existence of “harm”, a concept that is still evolving in the context of data protection. The regulatory framework should actively enable users’ protection rather than raise barriers to effective data protection by making data fiduciaries’ obligations contingent on unclear and evolving concepts like “harm”.


  • Bhattacharya, A. (2018, June 21). India’s internet penetration is actually way lower than you’d think. Retrieved December 11, 2018, from Quartz India:

  • Commercial Taxation Officer, Udaipur v Rajasthan Texchem Ltd, Appeal (C) 177 of 2007 (Supreme Court of India January 12, 2007).

  • Committee of the Experts under the chairmanship of Justice Srikrishna. (2018). Free and Fair Digital Economy. Government of India.

  • Dayal, A. (2012, November 15). Inside Law: How Defamation Works in India. Retrieved from The Wall Street Journal:

  • de Laat, P. B. (2017, November 12). Algorithmic Decision-Making Based on Machine Learning from Big Data: Can Transparency Restore Accountability? Retrieved from Springer Link:

  • Dvara Research. (2018, October 10). Our response to the draft Personal Data Protection Bill, 2018. Retrieved December 13, 2018, from Dvara Research:

  • Dvara Research. (2018). The Data Protection Bill, 2018. Retrieved December 13, 2018, from Dvara Research:

  • Ericsson. (2019). Ericsson Mobility Report. Fredrik Jejdling.

  • ETTech. (2018, August 17). Data generated in India throws a massive opportunity for startups: Amitabh Kant. Retrieved December 11, 2018, from ETtech:

  • IAPP. (2014, April). The Evolving Nature of Consumer Privacy Harm. USA. Retrieved from

  • Kaka, N., Madgavkar, A., Kshirsagar, A., Gupta, R., Manyika, J., Bahl, K., & Gupta, S. (2019). Digital India: Technology to transform a connected nation. McKinsey Global Institute.

  • Legal Information Institute. (n.d.). Ejusdem Generis. Retrieved from Cornell Law School:

  • Medine, D. (2016, November 15). Making the Case for Privacy for the Poor. Retrieved December 11, 2018, from CGAP:

  • Ministry of Electronics & Information Technology. (2018, July 27). Personal Data Protection Bill, 2018. Retrieved January 16, 2019, from Ministry of Electronics & Information Technology:

  • Moneycontrol. (2017, August 17). DATA STORY: How India has turned into the world’s second largest online market. Retrieved December 11, 2018, from Moneycontrol:

  • Newman, L. (2018, December 7). The Wired Guide to Data Breaches. Retrieved May 22, 2019, from Wired:

  • Nielsen. (2018, September 26). Average Indian Smartphone User Spends 4X Time on Online Activities as Compared to Offline Activities. Retrieved December 26, 2018, from Nielsen:

  • Prasad, S. (2019, March). Defining “Harm” in the digital ecosystem. Retrieved from Dvara Research:

  • PTI. (2019, March 06). Internet users in India to reach 627 million in 2019: Report. Retrieved from The Economic Times:

  • Raghavan, M. (2018, January 15). Before the Horse Bolts. Retrieved December 13, 2018, from Dvara Research:

  • Singh, D. (2019, January 22). How will next billion users coming online use Internet? Retrieved from Business Today:

  • Solove, D. J., & Citron, D. K. (2016). Risk and Anxiety: A Theory of Data Breach Harms. Retrieved from SSRN:

  • Srikara, P. (2019, May 6). Defining “Harm” in the digital ecosystem. Retrieved from Dvara Research:

  • * A term introduced in the draft Bill to refer to individuals whose personal data is processed under the draft Bill.

  • ** The entities which decide the purpose and means of processing personal data under the draft Bill.
