
    PERSONAL DATA MANAGEMENT

    1. Your presence on social media

    Today we connect and share information on multiple social media platforms. Have you ever thought about who sees this information? Sure, your friends and family can see what you have been up to, but what about prospective colleges and employers? Your digital footprints are everywhere, and anyone (including colleges) can learn just about anything about you. Sometimes we share too much information, and sometimes we don't know who is collecting it. Our social media accounts project an image of who we are, and people then make judgments about us. Sometimes we are not in control of how that information is shared by others. Think about your presence on social media and how it can affect what people think about you.

    Resources

    Discussion questions

    1. What is social media? List different social media platforms.
    2. What are the benefits of engaging in social media? What are the drawbacks?
    3. Do you have a digital footprint on social media? If so, list the social media platforms that you engage in.
    4. What is the difference between active and passive digital footprints? Can you give examples of each? Do you know who is actively collecting data from you? Do you know who is passively collecting data from you? Make a list for each type of data collection.
    5. Consider your digital profile on the various social media platforms you engage in. How would prospective colleges and employers (correctly or incorrectly) make judgments about you based on only the limited information they have from social media? Do you think your digital profile on social media will hurt or help your college and career goals? Why or why not?
    6. Do you think it is important to limit your digital footprint? Why or why not?
    7. What can you do to “clean up” your social media profile? Should you?
    8. What steps can you take to protect your digital information?
    9. What would you have to do to get off the electronic grid so that you no longer share any information digitally? Is it worth your time and effort? Why or why not? Consider this image: http://imgur.com/gallery/zebhR .

    2. Tracking student physical activity in school

    Fitness tracking has become a standard part of any American initiative to improve physical fitness, and it is now synonymous with wearables like Fitbits and Apple Watches. These devices can track fitness information, location, and other health information. Oral Roberts University opened in 1965 with a rare fitness requirement for incoming students. In January 2016, the school announced it would require incoming freshmen to purchase and use a Fitbit, with requirements for steps per week and average heart rate. This data is stored in a secure database, to which school officials, including professors, have access. This sheet is intended to briefly introduce the issue, present some background resources, and provide sample discussion questions for students.

    Resources

    Discussion questions

    1. Does this case surprise you? Why or why not?
    2. Would this requirement be reasonable if it were an elementary school? Middle school? High school? How would you feel if your school had this requirement?
    3. Part of Oral Roberts University’s mission is to educate the whole student, and students enroll with full knowledge of the fitness and Fitbit requirement. Is the Fitbit requirement reasonable or unreasonable? Before Fitbits, students logged their activity in a journal. Does this change your answer?
    4. The University would be able to track student location, as well as fitness information, even while students are off campus or school is not in session. Does this change your opinion? The school has made clear that students are only required to log steps and heart rate information. Does this change your answer? While students are only required to log those two, they would have to opt out of location tracking on the models with that feature. How does this affect your response?
    5. Should all professors have access to the data from the Fitbits? Why or why not?
    6. It is probably only a matter of time until Oral Roberts’ database is hacked. Does this affect your answer?
    7. Oral Roberts University was already one of the healthiest universities in the United States before the Fitbit initiative. Given this information, do you think the Fitbit plan is necessary? Why or why not?

    3. Amazon Echo Look

    When you are choosing an outfit to wear, do you like getting feedback from other people? Do you wish someone would help you find the kinds of clothes you like faster, and make sure they fit perfectly? All these benefits are promised by Amazon’s Echo Look, a hands-free, voice-activated, Internet-connected device with a camera and two-way speakers. It takes pictures of you wearing different outfits and makes an automated judgment about which looks best. Amazon hopes to use the images users take to better understand the clothing they like, learn what looks best on them, and eventually make clothing fit specifically to an individual, on demand. Consumer advocates worry that the wide range of information recorded by the Look could be harmful to users.

    Resources

    Discussion questions

    1. Is this a device you would like to own? Why or why not?
    2. What are the advantages of this $200 tool for people? What are the disadvantages? How might this tool help you feel more ready to face the world?
    3. According to the promotional photos, for whom is this device intended? Women only? Women and men? Does the target audience change how you feel about the device?
    4. What might you give up for this device to work? How might you be rewarded for giving up that data?
    5. Whose standards determine whether you look good? You might think of how the system could privilege a certain cultural, ethnic, or stylistic group or even the “look” of particular designers. How will you know what data was used to train the algorithm into passing judgment on your fashion sense? Whose standards would you want judging you?
    6. What else gets recorded, as Amazon Look is on all the time? In the promotional photos, Look is placed in the bedroom. Does that make this device different from other automated home devices like the Amazon Echo, Apple HomePod, or Google Home, which are more likely to be placed in a home’s more public spaces?
    7. What could a company learn about you merely by “seeing” what you are wearing? In other words, what else might the camera be able to “see”?
    8. Professor Tufekci’s Twitter thread begins by unpacking her concerns about the Amazon Look but then connects those issues to other, broader issues about privacy. To what degree do you agree with her overall line of thinking? Does any part of her thread bring up new reactions or thoughts about privacy or protecting personal data?
    9. Some machine learning projects designed to auto-identify the objects in a photo do a better job of identifying Caucasian faces as human than African-American faces. What would the implications be if the Amazon Look works better with one skin type than another?
    10. How might you feel if the system were to tell you your favorite outfit doesn’t make the grade?
    11. What are the potential benefits/risks of this tool on someone’s self-esteem? Does the fact that the marketing materials show the Look being used only by females color your answer?

    4. Smart home devices in court

    New smart home devices like the Amazon Echo, Apple HomePod, and Google Home work by listening to what you say and transmitting that data to a third party. They actively record and transmit only when a keyword is used, but they are always passively listening for that keyword. This raises the possibility that audio recordings of illegal activity, captured without police involvement and without the explicit consent of the person being recorded, could be useful in a criminal investigation. The issue was recently brought to a head by a murder case in Arkansas, where Amazon refused a warrant for the recordings of the Echo sitting on the suspect’s counter. This sheet is intended to briefly introduce the issue, present some background resources, and provide example discussion questions for students.
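
    To make the active/passive distinction concrete, here is a minimal Python sketch of a wake-word loop. It is a toy simulation, not Amazon's actual implementation: a real device processes audio on the device itself, while this sketch simply scans a stream of already-transcribed words. All names (WAKE_WORDS, RECORDING_WINDOW, send_to_cloud) are hypothetical.

        # Toy wake-word loop: everything is "heard" passively, but only the
        # words following a wake word are captured and transmitted.
        WAKE_WORDS = {"alexa", "echo"}   # keywords that switch the device to active mode
        RECORDING_WINDOW = 5             # pretend we capture the next 5 words after the wake word

        def send_to_cloud(utterance):
            """Stand-in for transmitting captured audio to a third-party server."""
            print("[transmitted]", utterance)

        def listen(transcript):
            """Passively scan words; 'record' and transmit only after a wake word."""
            words = transcript.lower().split()
            i = 0
            while i < len(words):
                if words[i] in WAKE_WORDS:                        # active mode begins
                    captured = words[i + 1 : i + 1 + RECORDING_WINDOW]
                    send_to_cloud(" ".join(captured))
                    i += 1 + RECORDING_WINDOW
                else:
                    i += 1                                        # passive: heard, but not kept

        listen("what should we make for dinner alexa set a timer for ten minutes")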

    Resources

    Discussion questions

    1. Does this case surprise you, or were you familiar with it?
    2. Echos only begin “listening” when a keyword, also called a wake word, like “Alexa” or “Echo” is used. With this information, do you think there would be anything useful on an Echo in most cases? However, the voice recognition technology is not perfect and there are plenty of stories of Echos activating without the wake word. Does this change your answer?
    3. Should people expect to have private conversations around smart home devices like the Echo?
    4. Is data recorded and stored by automated devices like Echos different from a voicemail or electronic document created by the same person? Why?
    5. Police discovered from another device in the Arkansas home that there was an unusual level of water use around the time of the murder. Should data recorded by smart home devices be covered by the heightened legal expectation of privacy in the home?
    6. In 2016, the FBI requested that Apple break into a user’s iPhone during the investigation of the San Bernardino shooting. How is this case different? The same?
    7. Does this case change how you feel about the Echo? How?
    8. Sometimes, phones are banned in schools, but smartwatches with voice activation (like Apple Watch’s Siri) are increasingly common. Does this kind of technology concern or excite you? Does this change the kinds of things you would say in school (whether you are an educator or a student)? Support your argument with evidence.

    5. DNA mapping

    Over the past several years, the cost of mapping an individual’s DNA has dropped dramatically. Now, many companies offer genomic sequencing services for as little as $99. Some consumers participate to find out where their ancestors come from, others to check whether they have an increased likelihood of genetic diseases. Scientists argue that massive databases of wholly sequenced individuals will allow them to compare genomes and figure out which gene combinations indicate complex genetic diseases. They also tout the promise of personalized medicine, matching specific medicines and other treatments to an individual’s specific genetic makeup. But other scientists argue that the science is not yet precise, and privacy advocates worry about the implications of having access to such in-depth genetic data on individuals. What is the best choice for humanity as a whole?

    Resources

    Discussion questions

    1. How does the privacy risk of DNA testing weigh against the benefits from having large-scale databases?
    2. What do you think are the most compelling risks of donating your DNA for testing? What do you see as the most compelling rewards?
    3. Who should be deciding what happens with an individual’s DNA: The individual? Scientists? Service providers? The government? What factors did you consider in forming your response?
    4. Currently, the cost of genetic testing for health purposes can still be quite high, especially since most people need specially trained medical professionals to help them understand their results. If health insurance companies covered the cost of this testing and follow-up consultations, what would some of the possible implications be for consumers? Would it affect everyone the same way? If not, what might some of the differences be?
    5. What would the impact be if those with insurance were receiving genetic testing and those without insurance were not? What could the possible long-term consequences be? What do you think about those potential consequences?
    6. Visit the Harvard site at https://aboutmyinfo.org/index.html and test how easy or hard it is to identify you from some simple personal facts. What do you think the point of this site is? What do you conclude from having entered the data? How hard would it be to identify you from this data?

    6. When insurance gives you a fitness tracker

    Health insurers have begun using activity data collected from wearable devices to provide either cash incentives or deductible rebates to active individuals. With a goal of long-term cost savings through overall health improvement, the attraction for insurers (and the employers that underwrite so many Americans’ health coverage) is clear, but round-the-clock use of wearable technologies has privacy implications. Wearables have the potential to provide justification for denying coverage to individuals deemed inactive or unhealthy, or for increasing insurance rates based on the information collected. The Denver Post reports that employers are expected to incorporate more than 13 million devices into wellness programs by 2018. Currently, the emphasis is on using wearables to increase awareness of activity levels and health indicators, but is it a slippery slope from incentivizing healthy behavior to requiring it?

    Resources

    Discussion questions

    1. Do you wear a fitness tracker? If so, what motivated you to purchase it? How closely do you monitor your data and make decisions based on what the tracker tells you?
    2. Many corporate health strategies require intermittent monitoring, but insurers concerned with overall wellness have the potential to require round-the-clock wearable use. Do you feel corporations should be able to mandate activity outside the workplace? If you forget to charge your wearable, should you be penalized? What about other personal health decisions: should an employer have the right to ask you to wear a seat belt? Not to smoke? Not to drink alcoholic beverages? When and where does an employer have the “right” to control after-hours behaviors? Where would you draw the line in your own life?
    3. As the capabilities of wearable health devices improve and increasing quantities of customer data are accumulated, companies will likely gain the potential to diagnose illnesses through improved sensors and big data analysis. How could insurance companies deny coverage or set rates based on preexisting conditions detected by wearables?
    4. Could employees have granular control over the types of information, how long it is stored, and whether it is portable between insurers? Does it matter if you can opt in and out of individual tracking features?
    5. Do you think employers should be able to access raw fitness data? Should they be allowed to gamify workplace wellness by comparing employee metrics to recognize and promote healthy behaviors? Why or why not?
    6. Some auto insurers encourage drivers to install monitoring devices in their vehicles in exchange for a discount. (For an overview, see https://cars.usnews.com/cars-trucks/best-cars-blog/2016/10/how-do-those-car-insurance-tracking-devices-work .) These devices are formally known as telematics-based tracking, and insurance companies often promote them to potential customers by saying that safe driving deserves a reward in the form of insurance discounts. Once installed, telematics devices tap into a vehicle’s internal computer to capture data about speed, distance, time spent driving, braking actions, and/or location (a hypothetical example of such a record is sketched after these questions). Is providing health information different from providing driving information? How? What kind of security or privacy do you expect your driving data to have?
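
    As a companion to question 6, here is a hypothetical Python sketch of what a single telematics "trip record" might contain, based only on the data categories named in that question (speed, distance, time spent driving, braking actions, location). The field names and values are illustrative assumptions, not any insurer's actual schema.

        # Hypothetical telematics trip record: field names and values are
        # invented for illustration, based on the categories listed above.
        from dataclasses import dataclass
        from datetime import datetime

        @dataclass
        class TripRecord:
            start_time: datetime
            end_time: datetime
            distance_miles: float
            max_speed_mph: float
            hard_braking_events: int
            start_location: tuple   # (latitude, longitude)
            end_location: tuple

        trip = TripRecord(
            start_time=datetime(2017, 5, 1, 8, 5),
            end_time=datetime(2017, 5, 1, 8, 32),
            distance_miles=11.4,
            max_speed_mph=68.0,
            hard_braking_events=2,
            start_location=(42.28, -83.74),
            end_location=(42.33, -83.05),
        )
        print(trip)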

    7. Hiding from digital marketing

    After realizing that Internet corporations had intuited her engagement from her online interactions, sociologist Janet Vertesi decided to treat her pregnancy as an experiment, strategically using encryption and a variety of anonymizing techniques to mask her consumer participation in preparing for her baby’s birth. Vertesi discussed the difficulty, expense, and resulting anxiety of her decision to keep her pregnancy from marketers, and her account reveals much about the ubiquity of online tracking and the targeted advertising we experience both online and offline.

    Resources

    Discussion questions

    1. Many of the techniques that Vertesi adopted, from using Tor to encrypt online searching to buying gift cards and using cash rather than account-linked payment methods, are associated with criminal behavior rather than privacy. What are some other possible reasons for desiring privacy?
    2. Part of Vertesi’s experiment involved using gift cards rather than personal credit cards and having online purchases delivered to a locker rather than her home. She noted that many of her techniques to avoid notice were expensive. Does this suggest privacy has the potential to be a luxury good?
    3. Vertesi found that data about pregnant women has a value of $1.50 versus ten cents for the average individual. What other groups might be similarly attractive for online marketers? How would they be identified from online behaviors?
    4. Vertesi describes a reluctance to engage in social media for fear of inadvertently triggering a conversation about her pregnancy. Were you surprised that social media was a major source of information identifying mothers-to-be?
    5. Vertesi went to great lengths to hide a particular and time-limited personal event. Would such tactics be tenable over the long term? Why or why not? What other life events might inspire a similar need for privacy?

    8. ISP consumer data collection

    In 2015, the Federal Communications Commission (FCC) ruled that Internet Service Providers (ISPs) were utilities, and therefore under the purview of the FCC rather than the Federal Trade Commission (FTC). In October 2016, the FCC passed privacy regulations banning ISPs from sharing or selling data from consumers without express consent. Congress passed a bill overturning those regulations and preventing the FCC from creating “substantially similar” regulations in the future. On April 3, 2017, President Trump signed the bill into law.

    Resources

    Discussion questions

    1. What problem is this law trying to solve? Who benefits? Who does not? What motivated politicians to believe this was a valuable act to benefit the country? Did your elected officials vote for this law? What do you make of their decision?
    2. Does this ruling make you want to change your Internet browsing habits? Why or why not? Even without these rules, ISPs are mostly only able to see which websites you visit, not what you do on them. Does this change your opinion?
    3. How are these regulations more and less helpful for consumers? Companies? In general?
    4. The CEI article argues that the Wiretap Act, which disallows intercepting the contents of electronic communications without consent, also bars ISPs from monitoring metadata, like the websites you visit. Do you agree with this argument? Why or why not?
    5. Some ISPs have policies that allow you to opt out of data collection and sharing, but these opt-outs are often difficult to find and sign up for. Is it worth most consumers’ time to opt out? Why or why not?
    6. Can you think of a time in history when citizens shared information that seemed innocuous at the time but ended up having a devastating effect? For example, some people say the data on personal religion gathered by the German government years before Hitler rose to power made it easier to identify Jews later. Do you find this historical reference hyperbolic or prescient?
    7. The regulations that were overturned had not yet taken effect. Do you think that overturning these regulations will change how ISPs handle consumer data?
    8. The bill was first proposed on March 7, and sent before the Senate on March 15, spending only 8 days up for debate in committee. How do you think this affected the contents of the law?
    9. The regulations were overturned using the Congressional Review Act, which allows Congress to pass a resolution of disapproval of a regulation. Congress has passed thirteen such resolutions during the Trump presidency so far, compared to five during the Obama presidency, and just one during the Bush presidency. How does this affect your opinion of Congress and of this law?

    9. Encrypted data, privacy, and government access

    From ancient times, people have used codes, ciphers, wax seals, and other means to disguise information, ensure that no one but the sender has read a document in transit, or verify someone’s identity. Today, encryption is used to enable secure communications over the Internet in areas such as online banking and commerce, email, and more. The goal is to protect private information so even if it is intercepted, it will have little value to others.

    In addition to private companies’ use of encryption software to protect their devices and the data they transmit, millions of individual users of email and messaging services opt to use end-to-end encryption to protect their personal communications from hackers. But some lawmakers argue that encryption software should include an access key for law enforcement.
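
    The following Python sketch illustrates the basic idea: without the key, an intercepted message is just opaque bytes. It uses symmetric encryption from the third-party cryptography package as a simplified stand-in; real end-to-end messaging systems such as Signal add key-exchange protocols so that even the service provider never holds the key.

        # Simplified illustration of encrypted messaging using the third-party
        # `cryptography` package (pip install cryptography). This is symmetric
        # encryption, not a full end-to-end protocol.
        from cryptography.fernet import Fernet

        key = Fernet.generate_key()      # a shared secret known only to sender and recipient
        cipher = Fernet(key)

        message = b"Meet at the library at 4pm"
        token = cipher.encrypt(message)  # what an eavesdropper or a server would see

        print("Intercepted ciphertext:", token[:40], b"...")
        print("Decrypted with the key:", cipher.decrypt(token).decode())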

    Consider the sources below to better understand the benefits, risks, and potential legal concerns related to using end-to-end encryption for your digital communications.

    Resources

    Discussion questions

    1. Watch the YouTube video from the Wall Street Journal to get a refresher on how end-to-end encryption works. What is encryption? What is its value to society? What are the challenges it poses for law enforcement? In what ways are these challenges difficult to address?
    2. Some feel their messages have no sensitive content and so do not require encryption. Marlinspike’s blog post above quotes a law professor and a Supreme Court justice to argue that the complexity of U.S. law makes it nearly impossible to know when one may be in violation of a law or potentially implicated in criminal proceedings. Do you find this or any of Marlinspike’s arguments convincing? What, if any, reasons do you see for encrypting personal messages, regardless of their content?
    3. In the Wired profile of Marlinspike, he is quoted as saying, “I think it should actually be possible to break the law.” Part of his argument holds that positive social change requires experimentation with practices outside current norms. Are there other ways for radical ideas to develop into something that could be valuable to mainstream society? Have you ever known of a case in which someone broke a law in a way you thought was justified?
    4. While encryption can offer privacy to ordinary citizens, it also offers cover to criminal activity. The Burr-Feinstein bill is intended to require that private companies and individuals comply with court orders to decrypt data for law enforcement purposes. However, much encryption software is developed outside of the U.S., so our laws may not be sufficient to keep encryption out of the hands of criminals. Would you argue that there are still reasons to mandate backdoor access to encrypted devices? Consider the Pfefferkorn source. In your opinion, what are the strongest arguments for or against establishing backdoor access for law enforcement?
    5. In the Center for Democracy and Technology (CDT) reading above, CDT rejects the proposal before Congress to establish a National Commission on Security and Technology Challenges. Take a look at the concerns they raise and the recommendations they present. This is a nuanced issue about which much of the public is not well-informed. Imagine you were a reporter crafting a story on this topic. What are some key elements raised in the CDT statement that you would include in your report aiming to better inform the public?
    6. The Niskanen Center and the CDT, both organizations that advocate for civil liberties, have distinctly different takes on the proposal to establish a commission to address technology, privacy, and law enforcement, known as the McCaul Commission. Compare and contrast the sources from Niskanen and the CDT. How would you characterize their differences? Do you find one more convincing? Why?

    10. Protecting your rights through civic engagement

    It can be difficult to feel in control when you think about all the ways data is collected and used, both with and without your knowledge or direct consent. In addition to protecting your data through the choices you make about when and how to share it, you have the power to communicate your opinions about how information should be handled directly to your elected officials. Not only can you ask your officials to vote for or against a proposed law, but you can also recommend legislation you would like them to sponsor. If you do not believe it is the government’s place to make laws protecting privacy, you can communicate directly with the companies and other organizations that use your data, and use your power as a consumer to advocate for how you would like to see them use, or protect, it.

    Resources

    Discussion questions

    1. Do you think that the government should play a role in protecting consumer data privacy? Why or why not?
    2. Do you think that industry self-regulation is an effective way for us to protect personal data? Why or why not?
    3. Consider what you have learned about privacy, controlling your personal information, and information ethics. Is there a particular issue about which you feel strongly, either in support of corporate rights to collect and use individual information or about rights of individuals to data privacy?
    4. Consider the audience of your letter: an elected official. What kinds of arguments would appeal most to the things that are important to the recipient of your letter? How might you explain why you feel strongly about this particular topic? Is there some element of your topic that you feel impacts you on a personal level? Why?
    5. To whom might you address your concerns? Is it a state-, federal-, or corporate-level concern? Is there a lawmaker, government agency, or business leader for whom your comments would be most pertinent?
    6. Today, there are so many ways to communicate. Discuss the advantages or disadvantages of these modes of communication: a face-to-face visit; a letter; contact on Twitter or Facebook; electronic submission via an elected official’s website; a fax; or another method.
    7. Federal elected officials keep offices in their local area as well as in Washington, D.C. Which point of contact might be more effective: one call among many to the nation’s capital, or one call among fewer to a regional office?
    8. Try crafting a paragraph expressing your opinion to the pertinent recipient.

    11. What is a reasonable expectation of privacy?

    The Fourth Amendment to the Constitution protects citizens from “unreasonable search,” but as society changes, what is considered “unreasonable”? And as technology advances, is it clear what constitutes a “search”?

    A critical precedent was set by the Supreme Court in 1967 in Katz v. United States. Charles Katz was suspected of illegal gambling activities, and federal agents attached a listening device to the outside of a public phone booth to capture evidence. The Supreme Court threw out this evidence, ruling that Katz had a right to expect that his conversation was not being monitored. The agents would have needed to establish probable cause and obtain a warrant to legally record these conversations.

    Since 1967 the “reasonable expectation of privacy” has entered our national consciousness, but legally speaking, it’s a challenging standard to apply. Importantly, that right is forfeited with regard to information a citizen knowingly shares with a third party, such as an email service or a credit card company. The result is that it may be far from evident which of one’s communications are protected. Technological advances further muddy these waters, and new precedents are being set. In United States v. Jones (2012), the Supreme Court held that attaching a GPS device to a car in order to track its movements constitutes a search. And in 2014 the Court established that a review of a suspect’s smartphone is a search that requires a warrant, even after the suspect is arrested. (This is as opposed to other contents of an arrestee’s pockets, which can be searched once an arrest has been made.)

    Examine the sources below to consider: How do we establish what is a reasonable expectation of privacy? What new challenges to our definition of privacy are posed by technology? Why are rights to privacy important for our society, even if they may sometimes protect those who are breaking the law?

    Resources

    Discussion questions

    1. Freedom from unwarranted search and the privacy that provides were considered essential values and liberties by the framers of our Constitution. What connections do you see between legal protection of privacy and the basic principles which underlie our society? In our modern context in which so much is shared publicly, can we understand the value placed on privacy by our founders? Why or why not?
    2. One position on privacy holds that those who are not breaking the law have nothing to hide. Does privacy continue to have a valuable role in our lives today? Why or why not?
    3. In United States v. Jones (2012), Justice Sonia Sotomayor, writing a concurring opinion for the majority, wrote, “Awareness that the Government may be watching chills associational and expressive freedoms. And the Government’s unrestrained power to assemble data that reveal private aspects of identity is susceptible to abuse. [It may] alter the relationship between citizen and government in a way that is inimical to democratic society.” Do you agree? Why or why not?
    4. It is often suggested that we must balance liberties and security; in other words, the more open our society is, the more risk that someone might take advantage of that openness to do something criminal or dangerous. In the majority opinion on Arizona v. Hicks (1987), a case often referenced in the Apple and FBI dispute over turning over the phone data of the alleged San Bernardino shooter, Justice Antonin Scalia wrote, “...there is nothing new in the realization that the Constitution sometimes insulates the criminality of a few in order to protect the privacy of us all.” Under what circumstances do you think the promise of security is worth the sacrifice of freedoms?
    5. In Riley v. California and United States v. Wurie (both 2014), Chief Justice John Roberts, writing for a unanimous Court, stated, “Modern cell phones are not just another technological convenience. With all they contain and all they may reveal, they hold for many Americans ‘the privacies of life’ ... The fact that technology now allows an individual to carry such information in his hand does not make the information any less worthy of the protection for which the Founders fought. Our answer to the question of what police must do before searching a cell phone seized incident to an arrest is accordingly simple — get a warrant.” Now that you’ve read three opinions, how would you define the key aspects of an individual’s right to privacy with regard to cell phones?
    6. Your rights to privacy only cover content which you have not shared with a third party. Does this knowledge impact the choices you make, for example about what to purchase with a credit card or post on social media? What questions do you ask yourself before posting? How might you help students consider this?
    7. Consider the case of Apple v. FBI. Federal agents asked Apple to override its security software and enable law enforcement access to San Bernardino shooter Syed Rizwan Farook’s iPhone. Apple refused, stating that such a security work-around would set a dangerous precedent encroaching on civil liberties. The suit was dropped when the FBI gained access to the phone with the help of an independent contractor. Would you want the manufacturer of your smartphone to be giving away your personal information?
    8. The core dispute is over whether technology companies should be required to build “backdoors” into their software that allow warranted searches. Why might technology companies be unwilling to do so? What is the value of data encryption? Do work-arounds undermine that value?
    9. What do you think of the solution that was reached in Apple v. FBI? Once there is a search warrant, is it justifiable for U.S. law enforcement to use hacking to gain access to a device locked by encryption? Do you believe that technology companies should be compelled to assist in government access to the devices they produce?

    12. Intergenerational differences and data privacy: Generational shift or developmental stage?

    The popular press has suggested that Millennials and their younger siblings, Gen Zers, are relatively cavalier about protecting their data privacy online. But is this accusation really justified? Gen Z (birth years 1996 to the present) are cloud natives in addition to being digital natives (Center for Generational Kinetics 2016). They have grown up using cell phone apps and social media. More pragmatic and strategic users of social media, Gen Zers have experienced cyberbullying as well as the benefits and celebrity associated with their own content creation. Use of social media is suggested as a key differentiator for Gen Z, who are more concerned about online privacy than Millennials but less so than Gen Xers and Boomers.

    Gen Zers prefer fleeting and anonymous apps like Instagram and Snapchat, and they place their trust in social media influencers. They are more likely to share personal information based on the online recommendations of celebrities, organizations, and affinity groups. Controlling for differences in rates of social media use among Boomers, Gen Xers, and Gen Zers, data privacy concerns may not be so different. Boomers say they are more cautious about sharing personal information, but their online behavior may actually be very similar when on social media. Gen Z seems generally more cautious about most online data requests, but they are willing to share data in exchange for more relevant ads. How can educators help them understand how online “influencers” are leading them to sacrifice data privacy?

    Resources

    Discussion questions

    1. How can we explore students’ social media and their understanding of data privacy? Develop a lesson plan that either incorporates social media use, or uses scenario-based examples to encourage students to explore what they will share online.
    2. How does Gen Z’s comfort with online payment apps, like Venmo, change our obligation to teach financial literacy? What are the challenges?
    3. How do Gen Z’s interests in environmental and social justice causes affect their decisions to share personal information? Have you seen examples of this with school projects or fundraisers for causes they care about?
    4. The Murnane article suggests privacy concerns are disregarded when grades or school success are at stake. Do students see the association between apps that track homework habits or reading completion and deeper intrusions into their personal data? On a continuum, how do these intrusions rank against data collected on their health and shopping habits?
    5. As teachers, discuss your personal privacy concerns as Millennials, Gen Xers or Baby Boomers. On what kinds of apps or online destinations are you most likely to share information? How might these be shared and used with your students? Do students identify sites that use encryption when profiling users? Do you?
    6. If influencers’ endorsements convince Gen Zers to trust privacy protections more, ask teens to reflect on their top influencers of the past month. Consider dividing into categories like music, clothing, technology gadgets, etc.
    7. If research suggests Gen Z will trade personal data for more targeted ads, how can we help them become more aware of their consumer profile? In other words, when is it worth giving up your data? Do students measure the cost in terms of dollars, products, prestige, prominence, or something else?
    8. Review Facebook’s privacy changes (see first resource, page 117) from 2005 to 2010 with students, then have them read and review the current policy, using the “Privacy Basics” page for understanding. Review the “Photos” or “Likes and Comments” sections with a discussion group. Which parts of the policy are surprising? Why? If your conversation is a professional development activity with teachers, how might we promote greater understanding of data “leaks” with our students?

    13. Comparing United States and European Union approaches to privacy

    One way to assess the laws and regulations governing the collection and use of data about individuals in the United States is to compare our practices to those of another system. The European Union provides a case study for comparison. Although several factors distinguish the governance of the EU and the U.S., we share many values and democratic practices.

    In the U.S., laws have emerged over time to govern specific sectors, such as healthcare or banking. This enables more regulatory flexibility, but it can create loopholes or undermine equity in personal data security. In the EU, data protection is considered a fundamental right that applies regardless of sector or industry. This creates more uniformity, which can simplify implementation, but it has also raised concerns around freedom of the press, and critics argue that it stifles business and innovation.

    How far is too far to go in protecting individuals’ rights? What are the differences between these two contrasting systems, and what are the resulting benefits and drawbacks?

    Resources

    Discussion questions

    1. It is often argued that differing attitudes about privacy across the Atlantic can be traced to the events of World War II, when invasion of personal privacy played a pernicious role in totalitarian social control. In the article from NBCnews.com above, correspondent Bob Sullivan argues that the divergence in policy between the U.S. and the EU can be traced to another cultural difference: Europeans generally trust government more than corporations; Americans do not, and so do not empower the government to adequately regulate corporate practices with regard to personal data. Do you find either of these arguments compelling? Which do you think sheds more light on our present-day differences? What other reasons do you see for the variance in our regulatory systems and our attitudes toward personal data privacy?
    2. Many EU privacy laws stem from The European Union Directive on Data Protection of 1995, which sets out a basic philosophy privileging privacy. Regulations in the U.S. tend to be industry specific, with different laws applying, for example, to credit card companies than to health care providers. What do you see as the benefits and drawbacks of each of these regulatory approaches?
    3. Consider the controversial EU law popularly known as the Right to be Forgotten, by which individuals can file a request with a search engine to eliminate specific results from appearing when someone searches for their name. If private individuals have a right to request that search engines adjust their results, what are the implications for censorship and free access to information? Conversely, is it acceptable that Americans have no recourse with regards to inaccurate or damaging information that might be posted about them without their consent?
    4. U.S. privacy regulations are sometimes driven by court cases or action at the state level. For example, the 2003 data breach notification law passed in the state of California requires companies to tell consumers when their personal information has been lost or stolen. Similar laws have subsequently been enacted in other states and have resulted in about 90 million consumer notifications. Given that the U.S. population is estimated at approximately 325 million, 225 million of whom are adults, this means that approximately 40% of adult Americans have been notified of a breach involving their personal data. In your opinion, is the state- or case-driven model a good one to serve as a basis for U.S. policy? What problems could arise from this approach?
    5. Many policies in force at the EU level are not applied uniformly across member nations. Further, some policies are so restrictive that it’s common to violate them; consistent enforcement may not be feasible. Considering the challenges of establishing a single policy for multiple nations, what is the value of privacy recommendations at the EU level? Could inconsistent enforcement undermine the integrity of legal guidelines overall?
    6. The U.S. and EU have similar regulations designed to protect children. In the U.S., the Children’s Online Privacy Protection Act (COPPA) bans collection of personal data, such as that gathered by social networking sites, from people under 13. Based on your experience either with students or with young people in your personal life, how effective is COPPA? Does it limit exposure of young people to unauthorized data collection? What do you see as the weaknesses of this law? Does it go too far or not far enough? What challenges are inherent in regulating access by age in online spaces?

    14. Be strategic! Reading and understanding terms of service and privacy policies

    Have you ever tried reading a website’s privacy policy or terms of service? You know, the document you click to say that you have read before signing up for a service. According to the Pew Research Center, in 2014, 52% of Internet users believed that if a site had a privacy policy, it meant the site was keeping user data confidential. In fact, policies may state that anyone using the service gives permission for the owners to access, use, and sell personal data.

    But how are we supposed to understand those long, complex, and – let’s face it – boring documents? Here are some resources to give you a hand.

    Resources

    Discussion questions

    1. After reading (and trying out) the resources above, discuss which services you encountered. Make a list in small groups or as a class.
    2. What terms of service/privacy policies feel like they are in place to protect you? Do you think they do that well?
    3. What terms of service/privacy policies feel like they are in place to benefit the organization at your expense? Why do you feel that way?
    4. Which of the terms that benefit the organization more than the user do you feel are reasonable, given that you want particular services to exist? Which do you think are invasive or otherwise problematic for the user?
    5. Modeled on Bailey’s “How to Read a Privacy Policy,” make up a list of words you might search for (using the Ctrl+F keyboard shortcut) within a privacy policy or terms of service document to protect your own privacy. (For one way to automate this kind of search, see the sketch after this list.)
    6. The resource list above includes numerous tools for helping you understand what is happening with personal information you give to services you use. Which of them do you like best? Why?
    7. “Terms of Service; Didn’t Read” (ToS;DR) is a crowdsourced tool for helping the public understand terms of service. Because it is crowdsourced (meaning that it relies on volunteer contributions, not scheduled updates by paid staff), it is not always possible to update it immediately when an organization changes its terms. Find the page on ToS;DR for a service you use, and find that organization’s terms of service. What needs updating on the ToS;DR website? Create your own document with updated policies for your selected site to share with your class.
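
    As referenced in question 5, here is a minimal Python sketch that automates the Ctrl+F exercise: it scans a saved copy of a privacy policy for words that often signal data sharing. The keyword list and the filename policy.txt are assumptions you would adapt to your own list and document.

        # Scan a saved privacy policy (policy.txt) for keywords worth a closer read.
        KEYWORDS = ["third party", "affiliate", "sell", "share",
                    "advertis", "retain", "opt out"]

        with open("policy.txt", encoding="utf-8") as f:
            lines = f.readlines()

        # Print every line that mentions one of the keywords, with its line number.
        for number, line in enumerate(lines, start=1):
            lowered = line.lower()
            for word in KEYWORDS:
                if word in lowered:
                    print("line {}: matched '{}': {}".format(number, word, line.strip()))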

    15. What does Cambridge Analytica have about you?

    Cambridge Analytica has constructed a database of 230 million American adults, with up to 5,000 pieces of demographic, consumer, and lifestyle information for each. Information on file might include any or all of the following:

    • voting histories (meaning whether or not you voted; your actual voting decisions are known only to you)
    • age
    • income
    • debt
    • hobbies
    • criminal history
    • purchases and consumer history
    • religious leanings
    • health concerns
    • gun ownership
    • car ownership
    • home ownership

    Supplementing those data points is psychological information users may have shared through quizzes on social media. In both the 2016 Brexit vote and the U.S. presidential election the same year, Cambridge Analytica, a privately held company, made its data available to campaign organizers to facilitate virtual and geographic microtargeting. In the presidential election, datasets and sophisticated software algorithms were used to place 4,000 differentiated online ads varying in minute detail, including different headings, colors, captions, or the photo or video shown to a potential voter.
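
    To make the idea of microtargeting concrete, here is a toy Python sketch of how an ad variant could be matched to a voter profile. The profile fields, ad variants, and matching rule are invented for illustration and are not Cambridge Analytica's actual data or algorithm.

        # Toy ad-matching rule: pick the first ad whose theme matches a trait
        # in the voter's profile. Purely illustrative.
        AD_VARIANTS = [
            {"id": "ad-001", "theme": "gun ownership", "headline": "Protect your rights"},
            {"id": "ad-002", "theme": "debt", "headline": "Lower taxes, more take-home pay"},
            {"id": "ad-003", "theme": "health concerns", "headline": "A plan for affordable care"},
        ]

        def pick_ad(profile):
            """Return the first ad variant whose theme matches a trait in the profile."""
            for ad in AD_VARIANTS:
                if ad["theme"] in profile.get("traits", []):
                    return ad
            return AD_VARIANTS[0]   # fallback: a default creative

        voter = {"age": 47, "traits": ["debt", "home ownership"]}
        print(pick_ad(voter)["headline"])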

    Cambridge Analytica’s insights were also used by Trump campaign canvassers on the ground. An app identified the political views and personality types of the inhabitants of a home so that Trump canvassers, prepared with talking points tailored to the resident, rang only at the doors the app rated as receptive to his messages.

    Highly targeted marketing played a large role on behalf of conservative candidates and causes in 2016, but it is not exclusive to conservatives. The 2008 and 2012 Obama campaigns developed homegrown data collection systems to help deploy campaign and fundraising efforts as efficiently and effectively as possible.

    Resources

    Discussion questions

    1. The U.K. and Europe have strict privacy protections limiting the use of personal information, but U.S. data brokers have both broad access to local and state government records and to troves of consumer information available to any company or candidate who can afford them. Who might each approach benefit?
    2. Cambridge Analytica treats data points it has collected as proprietary information, voicing reluctance to reveal specifics in case its intellectual property is reverse-engineered. However, it has provided examples including mapping the types of music an individual listens to on Pandora to those using a particular Snapchat filter. What are some ways you can avoid cross-platform tracking?
    3. On the basis of an average of 68 Facebook “likes” by a user, it was possible to predict their race (95% accuracy), sexual orientation (88% accuracy), and Democratic or Republican party affiliation (85% accuracy) (Kosinski, Stillwell & Graepel, 2013). Will this change the way you interact with social networks?
    4. Cambridge Analytica’s model relies upon the OCEAN theory of personality — openness, conscientiousness, extroversion, agreeableness, and neuroticism. Are those traits static or could they change over time? Are there other traits that you think would help Cambridge Analytica better know a consumer?
    5. How could your quality of life improve by allowing companies to better tailor the communications you receive?
    6. According to the U.S. Census Bureau (census.gov/quickfacts), there were approximately 234 million adults in the United States in 2014. How does knowing that statistical benchmark help you gain understanding of the number of citizen profiles Cambridge Analytica has (approximately 230 million)? How does that knowledge change or reinforce your opinions on Cambridge Analytica’s scope or practices?
    7. Cambridge Analytica will sell its data, but the data is expensive. What impact might its data have on an election in which a very wealthy, well-funded candidate who can afford Cambridge Analytica’s data is pitted against one with fewer financial means?