CFI Fellow Patrick Traynor, Associate Professor in the Department of Computer and Information Science and Engineering at the University of Florida, explains his research on the privacy and security of data in mobile lending applications.

We have all seen privacy policies before: sign up for a credit card and you receive a pamphlet with tiny print detailing your bank’s particular policy. Create an account with an online service and you will get a link to something similar, too. These policies are supposed to provide consumers with detailed information about which pieces of their data will be stored, how they might be used, with whom they can be shared, and how they will be protected. Privacy policies are now mandatory for financial institutions in developed nations, and here in the United States we are protected by laws such as the Gramm-Leach-Bliley Act (also known as the Financial Services Modernization Act of 1999).

Unfortunately, the reality of such policies is often not so clear. Many of these policies are written by attorneys with the sole intention of later being read by other attorneys. That means that, in some cases, even highly educated individuals without a law degree may not be able to fully understand what they are reading. What chance does the average consumer have of understanding such policies?

You would think that consumers would be up in arms. But let’s be honest – most people have never actually read these privacy policies, let alone tried to understand them. Have you?

So then why is it important to examine the state of privacy policies?

Let me first offer some insight into the role of studies like ours, and then some comments on why privacy policies for digital credit matter.

Protection by a Community of Experts

Outside of some minor fixes (e.g., replacing windshield wipers, changing the oil, etc.), I have never performed major repair work on my car. There are good reasons for this. First, modern cars contain literally thousands of moving parts and as many as 100 computer processors. I simply do not own the tools, nor do I possess the expertise necessary to diagnose and fix problems in a modern vehicle. And on all but the rarest of occasions, I have not even read the owner’s manual. So how has my vehicle stayed in such excellent running condition for years?

Quite simply, we all rely on an army of experts so that we do not have to worry about the details of our cars’ operations. That army encompasses dealerships and independent mechanics, insurance regulators, and the National Transportation Safety Board (NTSB) in the United States, all working to ensure both the correct operation of all car components and the safest possible failure modes (i.e., when things go wrong, manufacturers select options that protect passengers).

The same holds true for privacy policies. As with my car, the ideal would be for such policies to be written so that everyone could easily understand them and make informed decisions based on their contents. But the reality is that, in their current state, it takes an army of experts to look through such policies to determine whether or not they are reasonable.

Does that mean that privacy policies should be written without the consumer in mind? Absolutely not. Ultimately, we believe that privacy policies should be accessible to all – after all, the information they provide need not be as complicated as the inner workings of my car. The good news is that privacy policy readability has dramatically improved across industries over the past 20 years in the United States. The question remains whether the spate of online credit entities will build on these lessons or repeat past mistakes. That’s one part of what our research project is trying to measure.

Importance of Sound and Transparent Privacy Policies and Security Practices

Privacy policies have become particularly salient for financial inclusion because of the promise held out by innovative lenders that creditworthiness assessments using data from new sources (e.g., historical location information, the applications running on the customer’s phone, spending habits on a mobile money account, etc.) can potentially open access to credit for ‘thin file’ customers lacking a traditional credit score. Thus, as electronic activities generate new data trails, it is important that this potentially powerful information be handled appropriately.

Many companies providing online credit decline to say which data they use or how they make credit decisions, arguing that such processes are intellectual property that must be protected from competitors. Such opacity carries great potential risk for consumers. For instance, if the online credit provider is the victim of a data breach, consumers may have little idea of the extent to which their data (including records of their everyday behavior) may be exposed. Any insight into these behind-the-scenes decisions is therefore helpful to users trying to make informed decisions about their data.

A thorough privacy policy may also serve as an indirect indicator of an institution’s security practices. Because data storage is inexpensive, there is great temptation to record as much data about customers as possible and keep it for the entire life of the business. However, such behavior dramatically increases the data security risks for consumers over time. A detailed privacy policy provides evidence that the online credit provider has thought specifically about what type of data is necessary to keep, and how it intends to store that data so as to minimize risks for its clients. Moreover, explicit policies about data retention, how and with whom data can be shared, and mechanisms for user choice (i.e., “opting out”) make clearer to customers and experts alike the protections a provider has intentionally put in place to guard information.

So, how clear are online credit providers about how they protect data? Do they typically discuss such information with their prospective clients? These are features we intend to measure in this project by conducting close analyses of privacy policies throughout the digital lending sector.

And with the steady influx of online credit providers in recent years, this research is increasingly pertinent. My University of Florida research team’s recent findings, presented in our upcoming paper “Regulators, Mount Up! Analysis of Privacy Policies for Mobile Money Services” (to be published at the 2017 USENIX Symposium on Usable Privacy and Security (SOUPS)), show that privacy policies are woefully inadequate in the mobile money space. Examples include policies as short as 68 words and documents not written in the language spoken by those they are intended to serve. Clearly, there is work to be done.

Regardless of our future findings in this space, I want to reiterate our research team’s core belief in the transformative power of online credit. We believe that such beneficial services can and should be delivered with security and privacy considerations that are at least as good as those in the traditional financial space.

The final report of this research will go into detail about the findings in this space, together with a list of policy recommendations on how to improve privacy policies. Learn more here.

