Guide to Building Data-Driven Organizations in the Public Sector

Best Practices in Privacy

Team 4: Martha Ramos, Philip Schlotter, Randolph Wilkins

Overview

Chapter Summaries

Privacy. What Will a Data-Rich Future Look Like? (Philip Schlotter)

Pentland’s chapter about privacy discusses potential problems with a data-rich future. Now that we are generating more and more data every day, who will be the keepers of the data? How will that data be used? Can the private sector be trusted with big data? Can the government be trusted with big data?

Pentland argues that there should be a new process for managing data, and he and his team are working to create that “New Deal on Data” (Pentland, 2015). Pentland lays out a model of data ownership, similar to owning real property, in which data owners have the right to possess their data, the right of full control over its use, and the right to dispose of or distribute it. Data keepers would be responsible for holding and managing your data, much as a bank or credit union holds your financial assets. If consumers are unhappy with how their data is being managed, they would have the option to remove their data from that data keeper and place it with another. In theory, data keepers would maintain or grow their business by keeping their customer satisfaction scores above a minimum level.
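
To make the ownership analogy more concrete, the short sketch below is our own illustration (not Pentland's actual system): it models a single personal data record with the rights described above and shows an owner moving it between two hypothetical keepers, the way a customer might switch banks.

```python
from dataclasses import dataclass

@dataclass
class PersonalDataRecord:
    """One person's data item, with the ownership rights described in the chapter."""
    owner_id: str
    payload: dict            # the data itself (e.g., location history, step counts)
    keeper: str = "none"     # which data keeper currently holds it
    can_use: bool = False    # whether the current keeper may use the data

    def place_with(self, keeper_name: str) -> None:
        """Right to distribute: the owner places the data with a keeper and consents to its use."""
        self.keeper = keeper_name
        self.can_use = True

    def withdraw(self) -> None:
        """Right to dispose: the owner pulls the data back from an unsatisfactory keeper."""
        self.keeper = "none"
        self.can_use = False


# A consumer switches keepers the way a customer might switch banks.
record = PersonalDataRecord(owner_id="user-123", payload={"steps": 8042})
record.place_with("KeeperA")
record.withdraw()            # unhappy with KeeperA's handling
record.place_with("KeeperB")
```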

Pentland also discusses the challenges of a data-rich system. Because the number of data points grows exponentially, conventional analysis tools cannot cope with the seemingly infinite variables. Pentland is working on a mapping tool similar to Google Maps, but instead of locations and traffic, the new map will track poverty, crime rates, and other social indicators.

Pentland states that one of his goals is “…to develop new ways of sharing data to promote greater civic engagement and exploration” (Pentland, 2015). The collected data can be used for self-improvement or, once aggregated, for the improvement of a community.

Personal data have been called the oil of the new economy (Pentland, 2015). The personal data we give away to use the most common applications is what allows smartphones and other services to personalize our experience. Have you ever searched for a new mattress on your phone, only to see a mattress advertisement in your news feed a few days later? Coincidence? No. As the price of using that cool new application on your phone for “free,” you allowed the application to learn a few things about you. You may be fine with getting some advertising for new mattresses or new clothing, but what happens when you search for medical terms? What if you have a serious disease that you are scared about? Would you still want a treatment or medication presented to you based on your search history and a manufacturer’s purchase of that data?

Pentland recommends an open personal data store (openPDS) that anonymizes your data while still allowing researchers or retailers to use it in the aggregate, without the ability to trace the data back to the person. I had to think about this for a bit to get a better understanding of what Pentland was describing, and I think of the process as similar to Apple Pay. When I make a purchase with my physical credit card, there is the potential for someone to capture my credit card number and use it for another purpose or sell it. With Apple Pay, I can wave my phone near the credit card terminal, and Apple Pay removes my real credit card number from the transaction and replaces it with another identifier valid for that transaction only. If a hacker captured that data, he or she could not use or sell it, because it would not work again. That anonymizing process is similar to how Pentland explains his openPDS. In Appendix 2, openPDS, Pentland does mention the possibility of reidentification, or deanonymization, of mobility data sets using only four spatiotemporal points. That one section makes me question the security of the entire process; I think more work is needed.
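
The tokenization idea behind that Apple Pay analogy can be sketched in a few lines of Python. This is our own simplified illustration of single-use tokens, not the actual openPDS implementation or Apple Pay's real protocol: the real identifier never leaves the store, and each transaction gets a throwaway token that is useless if intercepted.

```python
import secrets

class PersonalDataStore:
    """Keeps the real identifier private and hands out one-time tokens instead."""

    def __init__(self) -> None:
        self._real_ids = {}        # token -> real identifier, never exposed
        self._used_tokens = set()  # tokens that have already been redeemed

    def issue_token(self, real_id: str) -> str:
        """Issue a single-use token in place of the real identifier."""
        token = secrets.token_hex(16)
        self._real_ids[token] = real_id
        return token

    def redeem(self, token: str) -> bool:
        """A merchant or researcher presents the token; it works exactly once."""
        if token not in self._real_ids or token in self._used_tokens:
            return False           # forged or replayed tokens are rejected
        self._used_tokens.add(token)
        return True


pds = PersonalDataStore()
token = pds.issue_token("4111-1111-1111-1111")
print(pds.redeem(token))   # True  -- the first use succeeds
print(pds.redeem(token))   # False -- a captured token cannot be reused or resold
```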

Using Personal Data in a Privacy-Sensitive Way to Make a Person’s Life Easier and Healthier (Ramos)

This chapter brought to light the many ways that data collection can be used to better people's lives. It explored technology, software, and the development of algorithms that help provide a healthier and safer environment. The technology described includes monitoring systems, phones, and other tracking devices that have been used to help people break habits such as smoking, assist in eldercare by recognizing routines, and provide antitheft vehicle tracking. The chapter discusses the Privacy Act of 1986, which protects individuals' personal data from unauthorized access. It also explores three systems where the “safeguard of data could be critical such as in health-wellness incentive programs, pay-as-you-drive auto insurance, and prevention of domestic violence and caretaker abuse” (Eagle & Greene, 2014).

As insurance premiums continue to rise, many companies are beginning to use health incentive programs to reduce the number of claims employees file and, in turn, the cost of premiums. The programs are provided in one of two ways: through program participation, such as smoking cessation in exchange for a financial incentive, or through data collection, whether from sensors such as pedometers and smartphone applications or from employees logging hours of weekly activity, preventive care, or work-sponsored challenges. According to Eagle & Greene, the participation rate for these programs is close to 80%. As medical costs continue to rise and more sensors are created with tighter algorithms, consideration of federal legal requirements is necessary because those laws protect employees' rights regarding their private health information. The 2008 amendment to the ADA bans employers from asking medical questions unless they are job related; 2002 EEOC guidance requires that wellness programs remain voluntary and consistent with the ADA; and the 1996 HIPAA regulations allow employers to vary premiums, deductibles, and copays as part of an employer wellness program.
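
As a purely hypothetical illustration of the sensor-based variant, the sketch below converts a week of pedometer readings into a premium discount. The step threshold and dollar amounts are invented for this example and are not taken from any actual program described in the chapter.

```python
def weekly_discount(daily_steps: list[int]) -> float:
    """Convert one week of pedometer readings into a premium discount (dollars)."""
    active_days = sum(1 for steps in daily_steps if steps >= 8000)  # invented threshold
    if active_days >= 5:
        return 15.00   # met the weekly activity goal
    if active_days >= 3:
        return 5.00    # partial credit
    return 0.00


print(weekly_discount([9500, 8200, 4000, 12000, 8100, 9900, 3000]))  # 15.0
```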

Another area that could compromise an individual's privacy is pay-as-you-drive (PAYD) auto insurance. Many states and countries have adopted the practice; the state of California, for example, “pushed PAYD insurance legislation in 2009” (Eagle & Greene, 2014). Apart from reducing insurance premiums for some drivers, the legislation also provided incentives for people to drive less and thus protect the environment from harmful vehicle emissions. Initially, PAYD was based solely on odometer readings, but the Electronic Frontier Foundation asked that a bill be passed to include readings from electronic monitors to avoid fraud. As algorithms improve and insurance companies can track what neighborhoods you drive in, your driving habits, how much you drive, and so on, there is a fear of individuals' data being compromised and shared.
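
To make the pricing model concrete, the sketch below (our own illustration, with invented rates) shows the difference between odometer-only PAYD pricing and a telematics version that also prices driving behavior, which is exactly where the data-sharing concerns arise.

```python
def payd_premium(miles_driven: float,
                 base_fee: float = 25.00,
                 rate_per_mile: float = 0.05,
                 hard_braking_events: int = 0) -> float:
    """Monthly premium computed from odometer and (optionally) telematics data."""
    premium = base_fee + rate_per_mile * miles_driven
    premium += 0.50 * hard_braking_events   # behavior surcharge from an electronic monitor
    return round(premium, 2)


print(payd_premium(600))                          # odometer-only pricing: 55.0
print(payd_premium(600, hard_braking_events=12))  # telematics pricing: 61.0
```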

A final consideration is the installation of tracking devices on vulnerable people, such as the elderly with mental or physical limitations, children, and teens. Tracking systems are commonly used for the elderly, especially Alzheimer's patients, and they can also be installed in a vehicle without anyone's knowledge. However, reported abuse of elderly persons by caretakers is on the rise. In those situations, a tracking device would not allow a victim to escape, because the device would alert the abuser to their whereabouts. In one instance, an individual “sued FoxTrak Vehicle Tracking for aiding and abetting her abuser in an attack” (Eagle & Greene, 2014). This is a strong example of how data can be used negatively if placed in the hands of criminals.

This chapter provided a strong overview of the technology now available to make a person's life easier and healthier. However, there are factors to consider, such as the negative effects the data could produce if placed in the hands of people who want to use it to extort or abuse. Lawmakers should look carefully at protections for individual data to help reduce anxiety and the potential negative impacts on a person's life.

Discussion Questions (for Yellowdig)

  1. Would you have faith that the private sector or the public sector would gather your data and use it for a noble purpose?
  2. Would you want to be compensated for the use of your private data by private sector organizations?
  3. Would you be more willing to give your data to the government for free if it was for the betterment of the community?
  4. How do you feel about tracking devices placed on people with Alzheimer's? What are the pros and cons of using technology to track this data?
  5. What are your thoughts regarding the need for more legislation as it relates to the protection of an individual’s data? Do you believe more protection is needed?

References

  1. Pentland, A. (2015). Social physics: How social networks can make us smarter. Penguin. Chapter 10, "Data-Driven Societies"; Appendix 2, "openPDS."
  2. Eagle, N., & Greene, K. (2014). Reality mining: Using big data to engineer a better world. MIT Press. Chapter 2, "Using Personal Data in a Privacy-Sensitive Way."
  3. The Verge. (2018, March 28). Everything you need to know about the EU's new privacy law, GDPR. https://www.theverge.com/2018/3/28/17172548/gdpr-compliance-requirements-privacy-notice