
  • May 19, 2015

Big Data and the Future of Privacy - paper review (Part 2 of 3)

Wes Prichard
Senior Director Industry Solution Architecture

This is part 2 of a review of a paper titled Big Data and the Future of Privacy, from the Washington University in St. Louis School of Law, in which the authors assert the importance of privacy in a data-driven future and suggest some of the legal and ethical principles that need to be built into that future.

Authors Richards and King identify four values that privacy rules should protect, which I will summarize here from my own perspective.


Identities are more than government ID numbers and credit card accounts. Social Security numbers and credit cards can be stolen, and while that is inconvenient and even financially damaging, the loss of them doesn't change who we are. However, when companies use data to learn more and more about us without limit, they can cross the boundaries we erect to control our own identity. When Amazon and Netflix give you recommendations for books, music, and movies, are they adapting to your identity or are they also influencing it? If targeted marketing becomes so pervasive that we live in bubbles where we only hear messages driven by our big data profiles, then is our self-determination being manipulated? As the authors state, "Privacy of varying sorts - protection from surveillance or interference - is what enables us to define our identities." This raises the question of whether there is an ethical limit to personal data collection, and if so, where that limit lies.


Knowledge is power, and data provides knowledge. Knowledge resulting from data collection can be used to influence and even control. Personal data allows the sorting of people, and sorting lies on a spectrum with profiling and discrimination. One possible use of data-driven sorting is price discrimination. Micro-segmented customer profiles potentially allow companies to charge more to those who are willing to pay more, because they can identify that market segment. Another ominous use of big data is to get around discrimination laws. A lender might never ask your race on a loan application, but it might be able to infer your race from other data points it has access to. We must be careful that the use of big data does not undermine our progress toward equality.


As pointed out earlier, the sharing of personal data does not necessarily remove an expectation of privacy. Personal privacy requires security on the part of those who hold data in confidence. We provide personal information to our medical providers, banks, and insurance companies, but we also expect them to protect that data from disclosures we don't authorize. Data collectors are obligated to secure the data they possess with multiple layers of protection that guard it from both internal and external attack.


Privacy promotes trust. When individuals are confident their information is protected and will not be misused, they are more apt to share. Consider doctor/patient confidentiality and attorney/client privilege. These protections promote the trust that enables an effective relationship between the two parties. Conversely, when companies obtain information under one set of rules and then use it in another way, combining it with other data in ways the consumer did not expect, trust is diminished. Trust is earned through transparency and integrity.

See Part 3 for a three-pronged approach to protecting privacy.

Link to Part 1
