By Laura Caron


It is nearly impossible to hear about digital financial services (DFS) without also hearing about big data. The term “big data” refers to datasets that are large, rapidly changing, and/or that cover a wide variety of information. These kinds of datasets have many applications in DFS, and at least 24 African fintechs use data analytics as a key part of their services or products. Data analytics offers attractive opportunities for those hoping to expand financial inclusion, but it also comes with many privacy and consumer protection risks. A few examples from Kenya and East Africa illustrate both the huge potential of new data analytics methods and the importance of incorporating consumer protection into these innovations.

Opportunities for Big Data

The use of big data opens up many opportunities for DFS providers to expand financial inclusion and improve both social and business outcomes. Several companies in Kenya have already begun using big data sources for tasks such as customizing products to suit different client segments and building alternative credit scoring methods for those without a credit history. These cases highlight the urgency of considering consumer data protection as technologies like these develop and spread rapidly.

Using Alternative Data for Credit Scoring

Fintechs working in Africa are using machine learning techniques to create alternative credit scoring methods for clients without a credit history or other existing financial records. To do this, companies collect and analyze non-traditional sources of client data, including social media, emails, and phone use records. One of the largest, Lenddo, founded in 2011 and now operating in more than 15 countries with over 5,000,000 users, draws on data including Facebook messages and likes, emails, social media connections, e-commerce transaction data, and psychometric data to create its LenddoScore product. (For more on Lenddo, see this case study from the IFC-Mastercard Data Analytics Handbook and this case study on a microfinance institution using Lenddo from FiDA.) Similarly, other Kenyan fintechs such as FarmDrive, Harvesting Inc, and Apollo Agriculture target smallholder farmers, using satellite data along with other non-traditional sources such as weather and mobile phone data to build credit scores based on yields, planting cycles, and other agronomic information (see this case study).

The credit scores generated by these services can be used by other financial institutions to evaluate risk and make lending decisions, and so they have a real impact on clients’ access to credit.
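As a concrete, purely illustrative sketch of how alternative data might feed a credit score: the feature names, weights, and bias below are invented for this example and are not drawn from Lenddo, FarmDrive, or any other provider’s actual model. A simple logistic model maps a handful of phone- and network-usage signals to an estimated repayment probability, scaled to a familiar score range:

```python
import math

# Hypothetical feature weights -- invented for illustration only,
# not taken from any real provider's scoring model.
WEIGHTS = {
    "monthly_mobile_topups": 0.08,   # regular airtime purchases
    "mobile_money_txns": 0.05,       # mobile-money transaction count
    "social_connections": 0.002,     # size of social network
    "avg_call_duration_min": 0.03,   # phone-use pattern
}
BIAS = -2.0

def credit_score(features: dict) -> int:
    """Map alternative-data features to a 0-1000 score via a logistic model."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    probability_of_repayment = 1.0 / (1.0 + math.exp(-z))
    return round(probability_of_repayment * 1000)

applicant = {
    "monthly_mobile_topups": 12,
    "mobile_money_txns": 30,
    "social_connections": 250,
    "avg_call_duration_min": 8,
}
print(credit_score(applicant))  # prints 769
```

In practice the weights would be learned from repayment outcomes rather than hand-set, but the shape is the same: non-traditional signals in, a risk estimate out, which is exactly why the accuracy and fairness of the input data matter so much.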

Risks of Big Data

However, these methods are not without serious pitfalls for providers and clients. The use of data analytics carries risks including privacy violations, unauthorized disclosure or sharing of information, misuse of data, discrimination, and client misunderstanding of how data are collected and used.

The current state of consumer data protection is lacking: CGAP interviewed 26 DFS providers and found that most did not have a data retention plan and had little motivation to tell their customers how their data would be used. CGAP also highlights the harms that can come with a lack of data protection and privacy: 17% of Tanzanian DFS users reported having lost money to fraud, including through data breaches and other scams. Data protection is an area where government and provider policies will need to evolve to focus on privacy rights and consumer protection.

As outlined in a World Bank discussion note on this topic, financial consumer protection in the big data context must be approached from many angles, both external and internal to the fintechs generating or using data analytics. For the success of both company and client, fintechs should consider the importance of transparency around how data is collected and used, fair treatment and anti-discrimination protections, mobility and interoperability between DFS providers, and privacy and data protection. These protections include ensuring that consent is well informed, storing data securely, and ensuring the accuracy and reliability of the data used in decision-making, as well as giving clients the ability to access their own data and request corrections to inaccurate data about themselves. Without these kinds of protective measures, clients may be at risk of having their data used against them, such as in cases of discrimination, or of privacy violations, such as through a data breach or a misunderstanding of how their data will be shared.

Addressing the Risks of Big Data

Fintechs are working on ways to incorporate consumer protection into their user experiences. In Tanzania and Kenya, First Access and Lendable, both signatories to the Investor Guidelines, have taken important strides in opening up these issues for discussion and study. These cases can serve as examples for investors or providers concerned about the risks of big data.

First Access: Designing Informed Consent

First Access is a fintech company offering a software platform for lenders to digitize their manual credit origination and analysis. Through better data collection and automation, First Access enables lenders to reach low-risk customers faster. In Tanzania, where the company enabled lenders to use mobile data for credit scoring, First Access partnered with CGAP to explore ways to incorporate consumer protection, data privacy, and transparency into the loan origination process. In a qualitative study, they explored clients’ responses to different ways of communicating data policies when allowing First Access to use mobile phone records to build a credit score. They found that clients cared both about what data would be collected and how it would be used. Many clients were satisfied with receiving a simple SMS message stating that their mobile phone records would be used for the loan decision and would not be shared. However, many also said they would call for more information, read a fact sheet, or request even more details by SMS. These findings demonstrate actionable and effective ways to incorporate greater transparency into the sign-up process.

First Access believes that relationship-driven lending, as opposed to complete automation, can help reduce risk for lenders and borrowers alike across many types of credit products. Accordingly, the company helps incumbent lenders grow faster by providing user-friendly tools that allow each staff member to accelerate lending while maintaining strong relationships. This model allows institutions to offer lower interest rates than those generally available through completely remote or automated lending, while expanding the capacity of their teams.

Lendable and FMO: Workshops on Data Protection

A signatory to the Guidelines for Investing in Responsible Digital Financial Services, African fintech Lendable has worked with FMO, the Dutch development bank, to host a series of workshops on data privacy and consumer protection, two of which took place this past fall: one in Nairobi and one in Lagos. During these workshops, the speakers pointed out key starting places for fintechs looking to improve consumer protection policies. They considered the role of regulation, including understanding current draft bills on data protection in Kenya and advocating for the development of new policy in Nigeria. They also worked through actions that providers can take in designing their interfaces, such as letting clients view their own records and information, setting time limits on data shared with third parties, and making sure that data integrity is protected by only allowing authorized users to access it. Building from these workshops, Lendable aims to continue the conversation on consumer protection and data privacy by encouraging fintechs to comment on draft consumer protection legislation in Kenya.
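Two of the workshop recommendations, time limits on data shared with third parties and restricting access to authorized users, can be sketched in code. The following is a hypothetical illustration, not Lendable’s or any provider’s actual implementation; the class, requester, and field names are invented:

```python
import time

# Hypothetical sketch: a data-sharing grant that names one third party,
# covers an explicit set of fields, and expires automatically.
class DataSharingGrant:
    def __init__(self, third_party: str, fields: set, ttl_seconds: float):
        self.third_party = third_party
        self.fields = fields
        self.expires_at = time.time() + ttl_seconds

    def allows(self, requester: str, field: str) -> bool:
        """A read succeeds only for the named party, a granted field,
        and before the grant expires."""
        return (
            requester == self.third_party
            and field in self.fields
            and time.time() < self.expires_at
        )

# Share one field with one party for one hour.
grant = DataSharingGrant("credit-bureau", {"repayment_history"}, ttl_seconds=3600)
print(grant.allows("credit-bureau", "repayment_history"))  # True: in scope, not expired
print(grant.allows("credit-bureau", "call_records"))       # False: field never granted
print(grant.allows("advertiser", "repayment_history"))     # False: unauthorized requester
```

The design choice worth noting is that the check runs on every read, so access ends when the time limit does, rather than relying on the third party to delete data it already holds.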


DFS providers must recognize the importance of data protection for building trust, expanding their client base, and reducing risk. To do this, they should offer transparency about what data is collected and why, make sure consumers can access and understand their own data, and keep data private and secure. Policies that companies have tried include:

  • Offering information about data collection and protection in a simple SMS to consumers, with more details easily accessible by a phone line or alternative source.
  • Clearly stating and enforcing internal policies to make sure only authorized users have access to data and that employees understand privacy and security practices.
  • Allowing consumers to view their own data so they can better understand what is collected about them and ensure accuracy of information.

Laura Caron is a Scholar in the School of Foreign Service at Georgetown University.  She studies International Political Economy, with a particular interest in development.

This post is part of a series to broaden investor-partner collaboration and harness ongoing experiences from co-founding, current and prospective Signatories of the Investor Guidelines.