Posted on October 28, 2013 at 7:40 PM
by Craig Klugman, Ph.D.
Every day you perform web browser searches, pay for items with a credit card, visit websites, pass through automated toll lanes, use your cell phone, navigate via GPS, read news online, renew your driver’s license, sign up for insurance, and much more. Each interaction leaves a trail of where you have been, what you have done, and what you are interested in.
The result of all this information is a new field that is being called “Big Data.” According to Lisa Arthur at Forbes, “Big data is a collection of data from traditional and digital sources inside and outside your company that represents a source for ongoing discovery and analysis.” Companies collect and value this data. According to the Wall Street Journal, this information is used to “enhance customer experience,” improve “process efficiency,” support the “development of new products or business models,” enable “more targeted marketing,” and drive “cost reduction.” Or in short, to find ways to make more money flow from your pocket into theirs.
McKinsey & Co reported in 2011 that big data was the next opportunity and risk for business. They stated that big data would change the way we do health care, government, retail, and manufacturing, and the way we use “personal-location data.” The report also identified a need for expertise in “deep analytical skills” to take advantage of all this information.
A recent Wall Street Journal report echoed the need for training people to analyze big data and for leaders to be able to make sense of and use the data. For example, in 2009, the Centers for Disease Control & Prevention started tracking physician prescriptions for antivirals and over-the-counter purchases to see where there may be flu outbreaks. This method is more accurate than the previous one—reviewing insurance claims.
Some pundits think that “big data” is nothing new, that companies have always collected information about clients and customers. What is new is that it is now much easier to collect information, to correlate it across sources, and to store far more of it. Others hold that Big Brother has finally shown up, 30 years later than predicted: No aspect of your life is not watched, collected, stored, and made available for analysis.
In the world of ethics, big data holds a host of questions and controversial issues. McKinsey & Co felt that several issues would have to be addressed before the power of big data could be harnessed including “privacy, security, intellectual property, and even liability.” Much of the “ethics” concerns seem to equate ethics with “legal issues.” The Wall Street Journal report adds risks of “financial exposure,” seeing patterns where none exist, and wasting money “chasing poorly defined problems or opportunities.”
A year ago I was sitting in a classroom during my reunion at Stanford University. At the front of the lecture hall were the director of a civil liberties group, a legal expert from a large internet company, and a law professor talking about, you guessed it, Big Data. The conversation concluded that yes, lots of data is being collected, but the tools to make sense of that information did not yet exist. In other words, you probably could not answer many questions with all of the information, nor could you even formulate what questions we should be asking. However, they cautioned, plenty of companies were working to overcome those problems.
With business journalists, pundits, and others looking at Big Data, its possibilities and dangers cannot be ignored. And universities are listening. Over 49 schools have launched 57 master’s programs in big data. My own university is considering no fewer than three master’s programs with different emphases in business, statistics, and computing. IBM created its own curriculum, and Forbes has crafted the term “data scientist.” Education administrators are working with businesses to see how they can provide educational experiences through degrees and certificates to meet the needs of the workforce. (The issue of whether education should be about how to do specific jobs rather than engaging intellects is a conversation for elsewhere.)
In the world of health and health care, the stakes for privacy, confidentiality, and potential harm to patients are even larger. While this revolution in business is going forward and new programs are developing that teach people how to collect, analyze, and manage the world of big data, there is a disconnect. The programs look at how to do big data, not whether we should do big data. As a society, should we reduce every thought and action to a data point to be analyzed for marketability? Many programs do not connect the action to the context, forgetting that these data points are human beings and that looking at someone’s widget purchases is different from tracking their prescriptions and visits to the doctor. Analysts and programmers need to understand the context of the people at the end of those data points and the real-life implications of misuse of data for real people. We need to look at data with the aim of improving the lives of people (not just corporate bottom lines).
The big data train has left the station, so now it’s up to us ethics educators to craft programs that will give analysts and managers the tools to think critically and ethically about their work. One such effort is the Nuffield Council on Bioethics’ new investigation into big data in medicine.
And as a human-being-in-the-world, it is up to all of us to decide how much is too much. For example, I avoid car insurance companies that offer a tracking device for my car (although new proposals would require such devices for purposes of taxing us for miles driven). On an everyday basis, many of us bury our heads in the sand, ignoring what is being collected or saying “there’s nothing I can do, privacy is dead.” But in reality: I vote, I shop, I write. Together, we could do a lot.