Data has exploded throughout every facet of our world, giving organizations and companies more access to personal information than ever before.
While that information has proven to be beneficial in countless ways, it can also be taken advantage of. Consumers entrust companies with more information than they realize, so it is up to those organizations to treat that data with the respect and privacy it deserves.
It’s time to talk about the integrity of data and to ask: is the use of this data moral?
A Step in the Right Direction
The EU has already started to develop a response to regulate big data with the General Data Protection Regulation (GDPR). GDPR is focused on issues of data privacy and the rights of the consumer. As the growing processing power of the latest technology continues to transform data into an ever-more valuable resource, the question of who owns data is becoming an increasingly thorny and complex one.
Today, we see an ever-growing list of use cases where data ownership matters enormously, yet where the complexities make a conclusive answer very hard to reach.
Consider this example. A few years back, Target built an analytical model for predicting pregnancy. A data scientist identified 25 products that, when purchased together, indicated a woman might be pregnant. From a business perspective, this is great information for a company to have on hand. It means Target could send personalized promotions and transform this likely-pregnant individual into a solid customer for years.
But, as Target learned the hard way, this use of big data could also lead to inadvertent exposure of private information. A man walked into a Target store furiously clutching coupons that had been mailed to his teenage daughter, congratulating her on her pregnancy and offering discounts on diapers. She actually was pregnant, and that was unwelcome news to this grandfather-to-be. It was a public relations disaster for Target, and it raised serious questions about whether that data was theirs to use.
Beyond the question of data ownership lies the question of responsibility for the data you own. How do you ethically analyze sensitive data? Let’s look at a few more examples.
The Preexisting Condition Conundrum
Amid all the bickering and political backstabbing going on in the U.S. Congress today over healthcare, an interesting new question has been ignored: What should be done about preexisting health conditions that are only predicted? For example, what happens if an insurance company uses big data to build a predictive analytics model that determines a customer is likely to develop an illness or suffer a catastrophic event, such as cardiac arrest, and then uses that information to deny coverage? The discussion today focuses only on preexisting conditions we already know about, but looking forward, should companies have the power to preemptively decide a patient’s coverage?
Organ Donation Analyzed
Let’s consider something on the positive side of the spectrum: Using analytics to increase the effectiveness of organ donation. The United Network for Organ Sharing (UNOS), a Talend customer, is using data and algorithms to optimize the matching of patients with transplantable organs. The use of analytics allows physicians to match the history of the organ and other vital data about the organ with the history and vital data about patients so that they can make a better decision. In this case, the “data virtue” question is not about whether or not to use the technology — it’s how to best expand it to other areas of patient care.
We’re already starting to see a formal government response to these questions of data virtue in GDPR, and as the scope of what big data can unlock expands over the next few years, this topic will only become more critical. At the end of the day, we must remember that customers provide this information, sometimes deeply personal information, in exchange for something of value. When that exchange doesn’t hold up, we have a serious problem.