Author(s):

  • Reijneveld, Minke D.

Abstract:

The General Data Protection Regulation (GDPR) will apply from May 2018. One of the many new societal developments it must deal with is the Quantified Self (QS): data collected about a person by apps that aim to improve his or her life. This article examines to what extent the tools and assumptions that underlie the creation of QS data influence an individual’s freedom, and to what extent the GDPR can contribute to the protection of that freedom.

The article finds that QS can restrict an individual’s internal and external freedom. QS suggests that everybody should meet a certain standard or group norm, which influences the choices individuals make. This internal restriction of freedom is largely unrecognised. A more familiar problem is the external restriction of freedom, which occurs when data are analysed by the QS app or by third parties. On the basis of these data, they can make assumptions about a person that limit the options available to that individual.

The GDPR protects certain elements of external freedom better than the EDPD does, mainly through its rules on data about health and its more stringent rules in general. The GDPR does not protect the internal aspect of freedom, although the possible risks of this internal restriction can be very serious.

Documentation:

https://doi.org/10.2966/scrip.140217.285

References:

[1] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L 119/1 (hereinafter ‘GDPR’).

[2] Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data [1995] OJ L 281 (hereinafter ‘EDPD’).

[3] See “GDPR Portal: Site Overview” available at http://www.eugdpr.org/ (accessed 5 December 2017).

[4] This will be on 25 May 2018.

[5] GDPR, rec. 7.

[6] Ibid., rec. 11.

[7] Ibid., rec. 10.

[8] Francesca Bignami, “Privacy and Law Enforcement in the European Union: The Data Retention Directive” (2007) 8(1) Chicago Journal of International Law 233-255, p. 233.

[9] According to the EU’s Treaty of Lisbon, the EU is required to accede to the ECHR (Consolidated Version of the Treaty on the Functioning of the European Union (TFEU) [2010] OJ C 83/47 art. 6). However, on 18 December 2014, the Court of Justice issued a negative opinion on the accession of the EU to the ECHR (Opinion 2/13 [2014]).

[10] Supra n. 8, pp. 241-242.

[11] TFEU supra n. 9, art. 16.

[12] International Covenant on Civil and Political Rights, adopted and opened for signature, ratification and accession by General Assembly resolution 2200A (XXI) of 16 December 1966; entry into force 23 March 1976, in accordance with Article 49.

[13] Francesca Bignami, “Transgovernmental Networks vs. Democracy: The Case of the European Information Privacy Network” (2005) 26(565) The Michigan Journal of International Law 807-870, pp. 813-819.

[14] EDPD, rec. 10.

[15] Gerrit Hornung, “A General Data Protection Regulation for Europe? Light and Shade in the Commission’s Draft of 25 January 2012” (2012) 9(1) SCRIPTed 64-81.

[16] Interinstitutional File 2012/0011 (COD), Presidency to the Council, 11 June 2015, 9565/15, Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) – Preparation of a general approach.

[17] See “GDPR Key Changes” available at http://www.eugdpr.org/key-changes.html (accessed 5 December 2017).

[18] Google was founded on 4 September 1998 by Larry Page and Sergey Brin.

[19] Facebook was launched on 4 February 2004 by Mark Zuckerberg and Eduardo Saverin.

[20] In 1999, six companies together created the Wireless Ethernet Compatibility Alliance, which branded the technology “Wi-Fi”. For more information, see: The Economist, “A brief history of Wi-Fi” (2004) The Economist, Technology Quarterly Q2, available at http://www.economist.com/node/2724397 (accessed 5 December 2017).

[21] Dominik Leibenger, Frederik Möllers, Anna Petrlic, Ronald Petrlic and Christoph Sorge, “Privacy Challenges in the Quantified Self Movement – An EU Perspective” (2016) 4 Proceedings on Privacy Enhancing Technologies 315-334.

[22] Ibid.

[23] Gary Wolf, “What is the Quantified Self?” [2011] available at http://quantifiedself.com/2011/03/what-is-the-quantified-self/ (accessed 5 December 2017). The Quantified Self Company provides users a community and it organises amongst other things meetings, conferences, forums, web content, and a guide.

[24] See “Quantified Self: self knowledge through numbers” available at http://quantifiedself.com (accessed 5 December 2017).

[25] Deborah Lupton, “Understanding the Human Machine” (2013) 32(4) IEEE Technology and Society Magazine 25-30, p. 25.

[26] Mario Ballano Barcena, Candid Wueest, and Hon Lau, “How Safe is Your Quantified Self?” (2014) Technical Report Symantec 1-38.

[27] Margreet Riphagen et al., “Learning Tomorrow: Visualising Student and Staff’s Daily Activities and Reflect on it” (2013) ICERI 2013 conference proceedings, p. 1.

[28] Melanie Swan, “The Quantified Self: Fundamental Disruption in Big Data Science and Biological Discovery” (2013) 1(2) Big Data 85-99, p. 85.

[29] Minna Ruckenstein and Mika Pantzar, “Beyond the Quantified Self: Thematic Exploration of a Dataistic Paradigm” (2015) 19(3) New Media & Society 1-18, p. 3.

[30] Supra n. 27, p. 2.

[31] Supra n. 28, p. 85.

[32] Supra n. 27, p. 3.

[33] See “track your mood & get anonymous support” available at http://moodpanda.com (accessed 5 December 2017).

[34] See “Everyone. Every run” available at https://runkeeper.com/index (accessed 5 December 2017).

[35] See “Weight loss that fits” available at: https://www.loseit.com/ (accessed 5 December 2017).

[36] Supra n. 28, p. 85; supra n. 29, p. 2.

[37] See “Strava” available at https://www.strava.com/ (accessed 5 December 2017).

[38] See for an overview of all possibilities of Strava: “Features” available at https://www.strava.com/features (accessed 5 December 2017).

[39] Ibid.

[40] See Mike Wehner, “Strava Begins Selling Our Data Points, and No, You Can’t Opt-out” [2014] engadget, available at https://www.engadget.com/2014/05/23/strava-begins-selling-your-data-points-in-the-hopes-of-creating/ (accessed 5 December 2017).

[41] Alan Westin, Privacy and Freedom (London: The Bodley Head, 1967) p. 23.

[42] Edward Eberle, “Human Dignity, Privacy, and Personality in German and American Constitutional Law” (1997) 4 Utah Law Review 963-1056, p. 964.

[43] Ibid., p. 965. This is also very clearly a Kantian idea.

[44] Daniel Solove, Understanding Privacy (Cambridge: Harvard University Press, 2008) p. 1.

[45] Supra n. 41, p. 7.

[46] Auto (αὐτο) means self and nomos (νόμος) means law.

[47] Paul Guyer (ed), The Cambridge Companion to Kant and Modern Philosophy (Cambridge: Cambridge University Press, 2006) p. 345.

[48] Rudolf Steiner, The Philosophy of Freedom (Rudolf Hoernlé tr, Sussex: Rudolf Steiner Press, 1916) p. 40.

[49] Ibid.

[50] Robert Johnson and Adam Cureton, “Kant’s Moral Philosophy” [2016] Stanford Encyclopedia of Philosophy, available at https://plato.stanford.edu/entries/kant-moral/ (accessed 5 December 2017).

[51] Immanuel Kant, Groundwork of the Metaphysics of Morals (Thomas Abbott tr, London: Longmans, Green and co, 1895) at 6:214.

[52] Immanuel Kant, Kritik der Praktischen Vernunft (Riga: Hartknoch, 1788) ch 1.

[53] Supra n. 51.

[54] Ibid., at 4:421.

[55] Ibid., at 4:429.

[56] Ibid., at 4:431.

[57] Supra n. 41, p. 7.

[58] Supra n. 44, p. 2.

[59] GDPR, art. 1(1).

[60] Ibid., art. 4(1).

[61] Ibid.

[62] Ibid., rec. 26.

[63] Ibid., art. 2(1).

[64] Ibid., art. 2(2)(c).

[65] Ibid., art. 3(1).

[66] Supra n. 21, p. 318.

[67] GDPR, art. 6.

[68] EDPD, art. 7.

[69] GDPR, art. 6.

[70] Claudia Quelle, “Not Just User Control in the General Data Protection Regulation. On Controller Responsibility and How to Evaluate Its Suitability to Achieve Fundamental Rights Protection”, in Anja Lehmann, Dianne Whitehouse, Simone Fischer Hübner, Lothar Fritsch and Charles Raab (eds) Privacy and Identity Management. Facing up to Next Steps (IFIP Summer School 2016, Berlin: Heidelberg, 2017) p. 4.

[71] GDPR, art. 4(11).

[72] See: GDPR, Recital 32.

[73] Supra n. 70, p. 4.

[74] GDPR, art. 7(4).

[75] GDPR, Recital 43; supra n. 70, p. 4.

[76] For example, the ePrivacy Directive does not currently require unambiguous consent, because the EDPD does not include the requirement of unambiguity in its definition of consent.

[77] See: GDPR, art. 8.

[78] See: Ibid., art. 9, with exceptions in 9(2)(a) and 9(2)(e).

[79] Ibid., art. 4(15).

[80] Supra n. 21, p. 318.

[81] GDPR, art. 9(2)(a).

[82] Supra n. 21, p. 318.

[83] Ibid., p. 317.

[84] Ibid. 

[85] See: GDPR, art. 5(1)(b).

[86] Often, this will be an individualised comparison: for example, only the running speed of other female users between 20 and 25 will be compared with your data.

[87] Supra n. 26, p. 10.

[88] Katleen Gabriels, “I Keep a Close Watch on this Child of Mine” (2016) 18(3) Ethics and Information Technology 175-184, p. 175.

[89] Edwin Locke and Gary Latham, “The Application of Goal Setting to Sports” (1985) 7 Journal of Sport Psychology 205-222, p. 206.

[90] Supra n. 26.

[91] GDPR, art. 6.

[92] See for example Harald Gjermundrød, Ioanna Dionysiou, and Kyriakos Costa, “PrivacyTracker: A Privacy-by-Design GDPR-Compliant Framework with Verifiable Data Traceability Controls” in Sven Casteleyn, Peter Dolog and Cesare Pautasso (eds) Current Trends in Web Engineering, (ICWE 2016 Workshops, Berlin: Springer International Publishing, 2016) pp. 3-15.

[93] See “MoodPanda Privacy Policy” available at http://moodpanda.com/privacy.aspx (accessed 5 December 2017).

[94] See “RunKeeper Privacy Policy” available at https://runkeeper.com/privacypolicy (accessed 5 December 2017).

[95] Ibid.

[96] Ibid. The only thing that is mentioned about other users is “to enable social-sharing, to find your friends, […] to allow you to communicate and interact with other users”.

[97] Supra n. 41, p. 13.

[98] Daniel Feldman, “The Development and Enforcement of Group Norms”, (1984) 9(1) The Academy of Management Review 47-53, p. 47; Richard Hackman, “Group influences on individuals” in Marvin Dunnette (ed) Handbook of Industrial and Organizational Psychology, (Chicago: Rand McNally, 1976) pp. 1455-1525.

[99] Supra n. 26, p. 6.

[100] Supra n. 25, pp. 27-28.

[101] Ibid.

[102] For example in Strava, you can view your performance after running or cycling and then choose whether or not you want to share this with your friends and followers.

[103] See for example supra n. 21, p. 316.

[104] Supra n. 89, p. 207.

[105] Ibid.

[106] Ibid.

[107] Supra n. 98, p. 49. For more information, see Daniel Katz and Robert Kahn, The Social Psychology of Organizations (New York: Wiley, 1978).

[108] Bart Schermer, “The Limits of Privacy in Automated Profiling and Data Mining” (2011) 27(1) Computer Law & Security Review, 45-52, p. 45.

[109] Supra n. 15, p. 69.

[110] Solomon Asch, “Effects of Group Pressures Upon the Modification and Distortion of Judgments” in Greg Swanson, Theodore Newcomb, and Edward Hartley (eds) Readings in Social Psychology (New York: Holt, Rinehart & Winston, 1952) pp. 393-401; John Turner, Social Influence (Milton Keynes: Open University Press, 1991).

[111] Matthew Hornsey, Louise Majkut, Deborah Terry and Blake McKimmie, “On Being Loud and Proud: Non-Conformity and Counter-Conformity to Group Norms” (2003) 42(3) The British Psychological Society 319-335.

[112] Ibid, p. 320.

[113] See for a good example the research done by Deutsch and Gerard in 1955, where participants were required to judge the length of two lines. Some respondents were instructed to give the wrong answer. The study suggested that the pressure to comply with the majority was very high for participants who were not aware that some respondents had been instructed to do so (see: Morton Deutsch and Harold Gerard, “A Study of Normative and Informational Social Influences Upon Individual Judgment” (1955) 51(3) Journal of Abnormal and Social Psychology 629-636). Various other studies have shown that people are generally unwilling to speak out against the majority.

[114] Deborah Lupton, “Food, Risk and Subjectivity” in Simon Johnson Williams, Jonathan Gabe and Michael Calnan (eds) Health, Medicine, and Society. Key Theories, Future Agendas (London: Routledge, 2000) pp. 205-217.

[115] Cristian Rangel, Steven Dukeshire and Letitia MacDonald, “Diet and Anxiety. An Exploration into the Orthorexic Society” (2012) 58(1) Appetite 124-132, p. 124.

[116] Guido Nicolosi, “Biotechnologies, Alimentary Fears and the Orthorexic Society” (2006) 2(3) Tailoring Biotechnologies 37–56.

[117] Frederik Zuiderveen Borgesius, Improving Privacy Protection in the Area of Behavioural Targeting (2015) available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2654213 (accessed 5 December 2017); Avi Goldfarb and Catherine Tucker, “Online Advertising, Behavioral Targeting, and Privacy” (2011) 54(5) Communications of the ACM 25-27.

[118] Frederik Zuiderveen Borgesius et al., “Should We Worry About Filter Bubbles?” (2016) 5(1) Internet Policy Review: Journal on Internet Regulation 1-16; Eli Pariser, The Filter Bubble: What the Internet is Hiding from You (London: Penguin Press, 2011).

[119] GDPR, art. 15(1)(h).

[120] Sandra Wachter, Brent Mittelstadt and Luciano Floridi, “Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation” [2017] International Data Privacy Law 76-99.

[121] The concepts of consent and purpose limitations have been discussed in a plethora of works. These include: Menno Mostert, Annelien Bredenoord, Monique Biesaart and Johannes van Delden, “Big Data in Medical Research and EU Data Protection Law: Challenges to the Consent or Anonymise Approach” (2016) 24 European Journal of Human Genetics 956-960; Beata Safari, “Intangible Privacy Rights: How Europe’s GDPR Will Set a New Global Standard for Personal Data Protection” (2017) 47(3) Seton Hall Law Review 809-848; Tal Zarsky, “Incompatible: The GDPR in the Age of Big Data” (2017) 47(2) Seton Hall Law Review 995-2012.

[122] Sander Voerman, “Health Coaches” in Linda Kool et al (eds) Sincere Support. The Rise of the E-coach, (The Hague: Rathenau Instituut, 2015) p. 41.

[123] Lisa Guerin, “Is Your Employee Wellness Program Legal?” available at http://labor-employment-law.lawyers.com/human-resources-law/wellness-programs-may-be-bad-for-employers-health.html (accessed 5 December 2017); Soeren Mattke et al., Workplace Wellness Programs Study (Final Report, Santa Monica, CA: RAND Corporation, 2013).

[124] Michelle Mello and Meredith Rosenthal, “Wellness Programs and Lifestyle Discrimination – The Legal Limits” (2008) 359 The New England Journal of Medicine 192-199.

[125] Ibid.

[126] Wolf Kirsten, “Making the Link between Health and Productivity at the Workplace —A Global Perspective” (2010) 48 Industrial Health 251-255, p. 254.

[127] See for example “Daimler Speicherte Heimlich Krankendaten” (2009) Der Tagesspiegel available at http://www.tagesspiegel.de/wirtschaft/neuer-datenskandal-daimler-speicherte-heimlich-krankendaten/1497042.html (accessed 5 December 2017).

[128] Marco Guazzi et al., “Worksite Health and Wellness in the European Union” (2014) 56(5) Progress in Cardiovascular Diseases 508-514.

[129] Ibid, p. 510.

[130] Supra n. 26.

[131] Ibid. 

[132] Paul Schwartz, “Property, Privacy, and Personal Data” (2004) 117(7) Harvard Law Review 2055-2128, p. 2058.

[133] The Economist, “China invents the digital totalitarian state” (2016) The Economist available at https://www.economist.com/news/briefing/21711902-worrying-implications-its-social-credit-project-china-invents-digital-totalitarian (accessed 5 December 2017).

[134] Ibid. 

[135] Primavera De Filippi, “Big Data, Big Responsibilities” (2014) 3(1) Internet Policy Review 1-12.

[136] Morgane Remy, “Personal Data: What if Tomorrow Your Insurance Company Controlled Your Lifestyle?” [2016] Multinationals Observatory.

[137] Sourya De and Daniel Le Métayer, “PRIAM: A Privacy Risk Analysis Methodology (Research Report)” [2016] RR-8876, Inria – Research Centre Grenoble, Rhône-Alpes (hal-01302541). For an example, see Tara Siegel Bernard, “Giving Out Private Data for Discount in Insurance” [2015] The New York Times available at https://www.nytimes.com/2015/04/08/your-money/giving-out-private-data-for-discount-in-insurance.html (accessed 5 December 2017).

[138] Supra n. 136.

[139] See for example: Rajindra Adhikari, Karen Scott, and Deborah Richards, “Security and Privacy Issues Related to the Use of Mobile Health Apps” (2014), paper presented at the 25th Australasian Conference on Information Systems, 8th-10th December 2014, Auckland, New Zealand, available at http://www.colleaga.org/sites/default/files/attachments/acis20140_submission_12.pdf (accessed 5 December 2017); Hamed Haddadi, Akram Alomainy and Ian Brown, “Quantified Self and the Privacy Challenge in Wearables” [2014] The IT Law Community; Deborah Lupton, “Quantified Sex: a Critical Analysis of Sexual and Reproductive Self-Tracking Using Apps” (2015) 17(4) Culture, Health & Sexuality 440-453.

[140] Bari Faudree and Mark Ford, “Security and Privacy in Mobile Health” [2013] CIO Journal available at http://deloitte.wsj.com/cio/2013/08/06/security-and-privacy-in-mobile-health/.

[141] Supra n. 132, p. 2055: “Personal information is an important currency in the new millennium”.

[142] “If you are not paying for it, you’re not the customer; you’re the product being sold” – by Andrew Lewis available at https://twitter.com/andlewis/status/24380177712 (accessed 5 December 2017).

[143] Tracey Caldwell, “The Quantified Self: a Threat to Enterprise Security?” (2014) 11 Computer Fraud & Security 16-20, p. 17.

[144] EDPD, art. 6(1)(b).

[145] Michael McCarthy, “Experts Warn on Data Security in Health and Fitness Apps” (2013) BMJ 347.

[146] EDPD, art. 17(1).

[147] Ibid., art. 16.

[148] Ibid., art. 17(1).

[149] Kuan Hon, “Data Security Developments under the General Data Protection Regulation” (2015) LexisNexis, World of IP and IT law.

[150] GDPR, art. 25.

[151] Ibid., art. 32.

[152] Charles Duhigg, “Psst, You in Aisle 5” [2012] New York Times, § 6 (Magazine) available at http://www.nytimes.com/2012/03/04/magazine/reply-all-consumer-behavior.html (accessed 5 December 2017) p. 30.

[153] Charles Duhigg, “How Companies Learn Your Secrets” [2012] New York Times available at http://www.nytimes.com/2012/02/19/magazine/shopping-habits.html (accessed 5 December 2017).

[154] Kate Crawford and Jason Schultz, “Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms” (2014) 55(1) Boston College Law Review 93-128, p. 98.

[155] Supra n. 153.

[156] Ibid.

[157] GDPR, art. 6(4).

[158] Nicolas Terry, “Protecting Patient Privacy in the Age of Big Data” (2012) 81(2) UMKC Law Review 385-415, p. 394.

[159] Ibid. 

[160] See for example: Ari Juels, “Targeted Advertising … and Privacy Too” in David Naccache (ed) Topics in Cryptology – CT-RSA 2001, Lecture Notes in Computer Science (Berlin: Springer, 2001); Catherine Tucker, “Social Networks, Personalized Advertising, and Privacy Controls” (2014) 51(5) Journal of Marketing Research 546-562; Hamed Haddadi et al., “Targeted Advertising on the Handset: Privacy and Security Challenges” in Jörg Müller, Florian Alt and Daniel Michelis (eds) Pervasive Advertising (London: Springer-Verlag, 2011) 119-137.

[161] Supra n. 154, p. 98.

[162] See for an article going in depth on the topic of consent: Daniel Solove, “Privacy Self-Management and the Consent Dilemma” (2013) 126 Harvard Law Review 1880-1903.

[163] Eve Caudill, Patrick Murphy, “Consumer Online Privacy: Legal and Ethical Issues” (2000) 19(1) Journal of Public Policy & Marketing 7-19.

[164] Simson Garfinkel and Gene Spafford, Web Security, Privacy, and Commerce (Sebastopol: O’Reilly Media, 2002).

[165] Cesare Bartolini et al., “Assessing IT Security Standards Against the Upcoming GDPR for Cloud Systems” [2015] Presentation at Grande Region Security and Reliability Day 2015.

[166] Omer Tene and Jules Polonetsky, “Privacy in the Age of Big Data. A Time for Big Decisions” [2012] Stanford Law Review, available at https://www.stanfordlawreview.org/online/privacy-paradox-privacy-and-big-data/ (accessed 5 December 2017).

[167] Paul Ohm, “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization” (2010) 57 UCLA Law Review 1701-1765; Arvind Narayanan and Vitaly Shmatikov, “Robust De-anonymization of Large Sparse Datasets” [2008] Proceedings of IEEE Symposium on Security & Privacy 111-125.
