Why Predictive Analytics Needs Due Process
Researchers suggest a new framework to regulate the “fairness” of analytics processes. What could this mean for your organization?
Online retailers can derive detailed insights about their customers using data and analytics. Now, brick-and-mortar stores can get similar insights: whether someone who walked by the store came in, how long they stayed, where they went inside the store, and even by what path. Predictive analytics can then be applied to determine what resources a store might need, optimal layouts, or what shoppers might be likely to purchase.
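To make the mechanics concrete, here is a minimal, hypothetical sketch of how such foot-traffic analytics could work: in-store Wi-Fi sensors log a timestamp each time they detect a device, and visits and dwell times are inferred by grouping each device's pings. The device IDs, timestamps, gap threshold, and the `dwell_times` helper are all illustrative assumptions, not any vendor's actual system.

```python
from datetime import datetime, timedelta

# Hypothetical sensor log: (device_id, timestamp) recorded each time
# a Wi-Fi sensor inside the store detects a device probe.
pings = [
    ("device_a", datetime(2014, 1, 6, 10, 2)),
    ("device_a", datetime(2014, 1, 6, 10, 9)),
    ("device_a", datetime(2014, 1, 6, 10, 31)),
    ("device_b", datetime(2014, 1, 6, 10, 3)),
]

def dwell_times(pings, gap=timedelta(minutes=15)):
    """Group each device's pings into visits, splitting on long gaps,
    and return the duration of every visit."""
    by_device = {}
    for device, ts in sorted(pings, key=lambda p: p[1]):
        by_device.setdefault(device, []).append(ts)

    durations = []
    for timestamps in by_device.values():
        start = prev = timestamps[0]
        for ts in timestamps[1:]:
            if ts - prev > gap:  # a long silence starts a new visit
                durations.append(prev - start)
                start = ts
            prev = ts
        durations.append(prev - start)
    return durations

# device_a: one 7-minute visit, then a second brief visit; device_b: one brief visit
print(dwell_times(pings))
```

Aggregating such visit durations across thousands of devices is what yields the store-level reports (traffic, dwell time, repeat visits) described above.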
One such analytics provider, Euclid, has about 100 customers, including Nordstrom and Home Depot, and has already tracked about 50 million devices (your smartphone and mine) in 4,000 locations, according to The New York Times.
Euclid, which has a prominent “Privacy” tab on its homepage, supplies retailers with reports culled from aggregated, anonymized data. While it does capture data from cell phones that could identify an individual, it doesn't actually use that data to pinpoint individuals (yet). To drive this point home, and to reassure, one would assume, jittery retailers worried about creeping out customers with Minority Report-like predictive technology, Euclid has worked with the Future of Privacy Forum to develop the Mobile Location Analytics Code of Conduct, a self-regulatory framework for consumer notification.
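Anonymization of this kind is commonly done by replacing the raw device identifier with a salted one-way hash before it is stored, so reports can count repeat visits without keeping the identifier itself. The sketch below is a generic illustration of that idea, not Euclid's actual scheme; the salt value and the `anonymize` helper are assumptions.

```python
import hashlib

SALT = b"rotate-this-regularly"  # hypothetical secret salt, not any vendor's scheme

def anonymize(mac_address: str) -> str:
    """Replace a device's MAC address with a salted one-way hash so that
    the same device maps to the same token without storing the raw ID."""
    # NOTE: hashing alone is weak anonymization; the space of possible MAC
    # addresses is small enough to brute-force if the salt ever leaks.
    return hashlib.sha256(SALT + mac_address.encode("utf-8")).hexdigest()

print(anonymize("a1:b2:c3:d4:e5:f6"))  # same device -> same token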
But, say researchers, these types of frameworks, not unlike the White House's Consumer Privacy Bill of Rights, do not go far enough to protect individuals from the potential harm of predictive algorithms in an era of big data, particularly as their use expands beyond retail into law enforcement, health care, insurance, finance, and human resources.
In a recent research paper, Kate Crawford, principal researcher at Microsoft Research, visiting professor at the MIT Center for Civic Media, and senior fellow at the NYU Information Law Institute, and Jason Schultz, associate professor of clinical law at the NYU School of Law, propose a new way to address predictive privacy harms: procedural data due process, which would determine, legally, the fairness of an algorithm.