The following is the first part of a two-part series that undertakes an analysis of the Account Aggregator system. These pieces were written as part of the blog series on Account Aggregators by NALSAR Tech Law Forum.
The Reserve Bank of India (RBI) released Master Directions on Non-Banking Financial Companies – Account Aggregators (Master Directions) in September 2016, and licences for India’s first Account Aggregators (AAs) were issued last year. From these guidelines and related documents, we understand that the purpose of an Account Aggregator (AA) is to collect and share:
- consumers’ financial information (FI) with their consent,
- by securely intermediating between entities requesting their data (the Financial Information Users (FIUs)), and
- entities who hold and share the consumers’ data (the Financial Information Providers (FIPs)).
Given that the AA infrastructure is aimed at harnessing the value of consumers’ personal data, does it sufficiently protect them during the data-sharing process? We will consider some answers in this two-part blog series. In this post, we consider the motivations for AAs, and specifically look at the consumer protection concerns that arise if consent becomes the main strategy for user protection in a data-sharing infrastructure.
1. The motivation for the Account Aggregator system: Breaking down data siloes
The key motivation for AAs appears to be breaking down silos of data and enabling encrypted sharing of data between firms, with the consent of the consumer. The appeal of AAs is that they can provide one-tap access to information for financial service providers. This can drive down the supply-side costs of disbursing credit. These costs include transaction costs relating to customer identification, data gathering and due diligence, as well as the related costs of staff and offices needed to undertake data-gathering activities (George & Sahasranaman, 2013). Building this kind of infrastructure for sharing consumer data could, to a certain extent, reduce the costs that physical documentation and manual customer data collection add to every additional unit of credit disbursed (i.e. to every loan). This cost saving could apply to all financial products, since most require some degree of customer verification and due diligence documentation.
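The scale of these supply-side costs can be made concrete with some back-of-the-envelope arithmetic. The sketch below uses the transaction-cost percentages reported by the Rangarajan Committee (2008) and cited later in this piece; the loan size is the report’s upper bound for small credit accounts, and the share of cost attributable to data gathering is a purely hypothetical assumption.

```python
# Illustrative arithmetic only. The transaction-cost rates come from the
# Rangarajan Committee figures cited in this piece; DATA_GATHERING_SHARE
# is a hypothetical assumption, not a reported number.

LOAN_AMOUNT_INR = 25_000  # upper bound for "small credit accounts" in the report

# Transaction cost of credit as a fraction of the loan amount, as observed
# at two banks (C. Rangarajan Committee on Financial Inclusion, 2008).
TXN_COST_RATE = {"Central Bank of India": 0.1295, "ICICI Bank": 0.0862}

# Hypothetical: suppose 40% of that cost stems from physical documentation
# and customer data collection that an AA could digitise.
DATA_GATHERING_SHARE = 0.40

for bank, rate in TXN_COST_RATE.items():
    cost = LOAN_AMOUNT_INR * rate
    potential_saving = cost * DATA_GATHERING_SHARE
    print(f"{bank}: transaction cost = INR {cost:,.0f}, "
          f"potential AA saving = INR {potential_saving:,.0f}")
```

Even under these rough assumptions, the per-loan saving is material, which is why the cost argument for AAs is so prominent.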
2. So AAs could drive down costs in the chain of financial services supply. What is in it for consumers?
2.1. Potential benefits: Swifter services & lower costs
From the consumers’ perspective, the swifter sharing of their information may enable quicker service delivery. The financial benefit for consumers will depend on whether entities pass on any cost savings they make, or use the data effectively to provide more suitable products and services.
Institutions seeking to use the AA system will need to invest resources to enable the kind of data-sharing that is envisioned. Additionally, AAs themselves will need to be built securely and effectively. The incentive for entities to participate in this system will depend on the quality of the data as well as how secure the AA ecosystem is. To protect consumers whose data is being shared, it is also important to consider whether the architecture envisioned in the AA is sufficiently accountable, privacy-protecting and consumer-friendly.
2.2. Potential risks: Consent is a good first step, but recognised to be an ineffective tool of empowerment for consumers when it comes to data sharing
The AA system adopts a system of obtaining user consent before data-sharing. This system is called the Data Empowerment and Protection Architecture (DEPA).
The DEPA framework, developed by Indiastack, allows users to control the sharing and usage of their personal data amongst various services and entities. The framework also maintains the necessary safeguards for the privacy of the data, i.e. the data is used and shared only to the extent that the user allows, for a stipulated purpose. The framework is designed to be transparent, and all transactions are made traceable and auditable (Indiastack, n.a.).
From the description above, it appears that the DEPA framework seeks to put individuals on notice that their data is being requested. It then asks for their granular consent on whether particular information records about them can be shared or not shared. This form of permission-based data sharing is known as the “Notice and Consent” model in data protection scholarship (Kemp, 2017). Under this approach, the consumer is provided with notice prior to their data being collected, informed of how it might be used in the future, and asked for consent to that use of their information.
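As a rough illustration of how such permission-based sharing might work in code, the sketch below models a granular, purpose-bound, time-limited consent record with an audit trail. The field names and checks are hypothetical assumptions for illustration, not the actual DEPA or ReBIT consent-artefact schema.

```python
# Hypothetical sketch of granular "Notice and Consent" as DEPA-style systems
# describe it. Field names and checks are illustrative assumptions only.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ConsentArtefact:
    user_id: str
    fiu_id: str        # entity requesting the data (Financial Information User)
    fip_id: str        # entity holding the data (Financial Information Provider)
    fi_types: list     # granular scope, e.g. ["DEPOSIT"] but not "MUTUAL_FUND"
    purpose: str       # the stipulated purpose the user agreed to
    expires_at: datetime               # consent is time-bound
    audit_log: list = field(default_factory=list)  # traceable and auditable

    def permits(self, fi_type: str, purpose: str, now: datetime) -> bool:
        """Allow sharing only if data type, purpose and validity all match."""
        allowed = (fi_type in self.fi_types
                   and purpose == self.purpose
                   and now < self.expires_at)
        # Every decision is logged, so sharing can be audited after the fact.
        self.audit_log.append((now.isoformat(), fi_type, purpose, allowed))
        return allowed

# Usage: a lender may read deposit data for underwriting, and nothing else.
consent = ConsentArtefact(
    user_id="user-1", fiu_id="lender-1", fip_id="bank-1",
    fi_types=["DEPOSIT"], purpose="credit-underwriting",
    expires_at=datetime(2030, 1, 1),
)
now = datetime(2025, 1, 1)
print(consent.permits("DEPOSIT", "credit-underwriting", now))      # True
print(consent.permits("MUTUAL_FUND", "credit-underwriting", now))  # False
```

The point of the sketch is that granularity, purpose limitation, expiry and auditability are all checks the architecture can enforce mechanically; whether the user’s initial “yes” was meaningful is a separate question, taken up next.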
Although this might appear to grant agency and autonomy to users, in practice the Notice and Consent model is known to have many failings: it fails to protect users’ interests and, in most cases, presents a false choice to users.
Cognitive limitations impair individuals’ decision-making about their personal data
Personal data is intangible, and its use results in benefits and harms which are not immediately apparent to users. Research is beginning to show that there are severe cognitive limitations that impair individuals’ ability to make informed and rational choices about the costs and benefits of consenting to the collection, use, and disclosure of their personal data (Solove, 2013). Acquisti (2004) has highlighted how immediate gratification bias, due to which individuals overvalue the immediate period as compared to all future periods, can cause individuals to make suboptimal privacy decisions. As a result of this bias, individuals have been shown to choose an immediate gain from data sharing (or to avoid the immediate costs of protecting data) while discounting the costs of possible future risks. They are unable to process the effect of cumulative risk over all future periods (Acquisti, 2004).
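A stylised numerical sketch can make this bias concrete. Following the description in the footnote to this piece (the present is overvalued, while all future periods are discounted uniformly), the hypothetical example below shows how a small immediate benefit from sharing data can outweigh cumulative future risks that, undiscounted, exceed it. All numbers are invented for illustration.

```python
# Stylised sketch of immediate gratification bias in a privacy decision.
# Per the footnote in this piece, the present period is overvalued and all
# future periods are discounted by a single uniform factor (beta).
# All numbers are hypothetical.

def present_biased_value(immediate, future_payoffs, beta=0.4):
    """Perceived value today: the immediate payoff at full weight, plus
    every future payoff scaled down uniformly by beta."""
    return immediate + sum(beta * p for p in future_payoffs)

# Sharing data yields a small instant benefit (say, a faster loan)...
benefit_now = 10.0
# ...but exposes the user to a small expected harm in each of 20 future periods.
risks = [-1.0] * 20

perceived = present_biased_value(benefit_now, risks)
actual = benefit_now + sum(risks)

print(f"perceived value: {perceived:.2f}")  # positive: the user consents
print(f"actual total:    {actual:.2f}")     # negative: the choice is suboptimal
```

Because the 20 small future harms are each scaled down, the user perceives a net gain and consents, even though the unweighted total is a net loss, which is exactly the cumulative-risk failure Acquisti describes.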
The threat of denial of service can make “taking consent” a false choice
A major structural issue today is the binary nature of the choice offered to consumers: they can either “agree” to the terms on which their data is collected under providers’ privacy notices, or disagree and be denied the service. This “take it or leave it” scenario leaves consumers with no real choice and shapes how individuals behave when agreeing to the terms under which their personal data is collected, processed and shared (Bailey, Parsheera, Rahman, & Sane, 2018).
3. Conclusion: DEPA must be supported by other data accountability features, and also become meaningful for non-smartphone users
The concept of Account Aggregators addresses a significant problem of financial data aggregation for individuals and small and medium enterprises alike. It also appears to take consumer data protection and privacy seriously for the first time, unlike previous large architectures. However, to be effective it must be supported by strong accountability systems and access controls that operate independently of consent. Relying solely on consent is unwise: a wealth of data protection and consumer protection thinking has shown that consent is necessary but not sufficient for data protection.
A broader concern is that a highly smartphone-dependent user interface such as DEPA’s may not address the needs of the substantial feature phone audience in India, who may not have access to reasonably good internet connections and electricity. As Account Aggregators gain traction in the financial community, deliberations about their impact on different demographics of the population become important, especially at a time of increased promotion of a more digital India. Consequently, it is important to improve consent interfaces for feature phones. This must be done with the knowledge that consent is a necessary but insufficient safeguard for users’ data.
One significant lesson from the last decade of building public data infrastructures in India is that such projects must carefully consider potential vulnerabilities that could expose people to harm if their data is compromised, misused or inaccurately recorded. We would do well to learn from our past experience with creating large public infrastructures that collate individuals’ personal information. The implementation of a new large data infrastructure must ensure that the data protection and privacy concerns raised in the past are not replicated, or worse, magnified (Omidyar Network, 2018; Khera, 2019; Sharma, 2019).
It is heartening to see that DEPA is one step towards doing so, but many more features would be necessary to ensure privacy and security by design. How does the AA infrastructure fare when analysed against Privacy By Design principles? We will analyse this in our next post.
In the second post of this series, we undertake an analysis of the technical standards and specifications present across publicly available documents on Account Aggregators. We map the technical standards to the seven principles of Privacy by Design (PbD) and deliberate on the privacy and data protection afforded by the architecture of AAs.
Bailey, R., Parsheera, S., Rahman, F., & Sane, R. (2018, December 11). Disclosures in privacy policies: Does “notice and consent” work? Retrieved from NIPFP Working Paper Series: https://www.nipfp.org.in/media/medialibrary/2018/12/WP_246.pdf
George, D., & Sahasranaman, A. (2013, April). Cost of Delivering Rural Credit in India. Retrieved from Dvara Research: https://dvararesearch.com/wp-content/uploads/2013/04/Cost-of-Delivering-Rural-Credit-in-India.pdf
Indiastack. (n.a.). ABOUT DATA EMPOWERMENT AND PROTECTION ARCHITECTURE (DEPA). Retrieved from Indiastack: https://indiastack.org/depa/
Kemp, K. (2017, August 22). Big Data, Financial Inclusion and Privacy for the Poor. Retrieved from Dvara Research: https://dvararesearch.com/2017/08/22/big-data-financial-inclusion-and-privacy-for-the-poor/
 According to section 3(1)(ix) of the Master Direction- Non-Banking Financial Company – Account Aggregator (hereafter, NBFC-AA Master Directions), financial information (FI) is defined as, “information in respect of the following with financial information providers: a) bank deposits including fixed deposit accounts, savings deposit accounts, recurring deposit accounts and current deposit accounts, b) Deposits with NBFCs c) structured Investment Product (SIP) d) Commercial Paper (CP) e) Certificates of Deposit (CD) f) Government Securities (Tradable) g) Equity Shares h) Bonds i) Debentures j) Mutual Fund Units k) Exchange Traded Funds l) Indian Depository Receipts m) CIS (Collective Investment Schemes) units n) Alternate Investment Funds (AIF) units o) Insurance Policies p) Balances under the National Pension System (NPS) q) Units of Infrastructure Investment Trusts r) Units of Real Estate Investment Trusts s) Any other information as may be specified by the Bank for the purposes of these directions, from time to time” (Reserve Bank of India, 2016).
 Section 3(1)(xii) of the NBFC-AA Master Directions.
 Section 3(1)(xi) of the NBFC-AA Master Directions.
 The Report of the Committee on Financial Inclusion, released in January 2008, identifies the cost of transactions for small credit accounts (loans of up to INR 25,000) as a percentage of the loan amount. Broadly, some of these cost components include (i) selection of applicants, (ii) carrying out post-sanction inspections, (iii) establishment costs and (iv) documentation costs. The transaction cost of credit as a percentage of the loan amount is found to be 12.95% and 8.62% in observations from Central Bank of India and ICICI Bank respectively (C. Rangarajan Committee on Financial Inclusion, 2008).
 Immediate gratification bias is closely related to the concept of Hyperbolic Discounting but differs slightly. Hyperbolic discounting discounts utilities from future periods more heavily the further away the period is, while immediate gratification bias states that individuals disproportionately overvalue the present period and all future periods are discounted uniformly.