The consequence of the data economy, with its growing appetite for personal data as a source of profit, is a significant loss of informational privacy, defined as the right of individuals to control the circulation of information related to them (Slattery & Krawitz, 2014). Informational privacy has long been recognized as a fundamental human right, with personally identifiable information serving as the legal threshold for the loss of anonymity or privacy (Schwartz & Solove, 2011).
The primary trade-off for the user is relinquishing control over their personal information in exchange for access to information and services, efficiency, and personalized experiences. Website owners then use this information to generate advertising revenue and thereby offset the costs of providing this access (Evans, 2009).
Advertisers and website owners justify this exchange by emphasizing the benefits users receive for giving up their data, such as relevant advertisements rather than randomized content of little interest. In the words of Evans (2009), given that a website is going to display an advertisement, consumers might prefer that the advertisement be relevant rather than not.
At first glance, the exchange might not seem very different from the one seen in traditional radio or television advertising. However, digital media has radically changed the flows of personal information (Nissenbaum, 2011), with mediated disruptions of an unprecedented scale and variety.
The mechanisms of data collection, as explained in part two of this paper, paired with the increasing value of this data and consumers' limited understanding of exactly which information they are disclosing about themselves, raise serious concerns about the fairness, legality, and ethics of these data collection practices (Warner & Sloan, 2012).
Current data collection technologies have recently become a source of concern for policymakers, public-interest advocates, and the media, who have raised awareness of the pervasive, manipulative, and surreptitious data collection practices of the current media landscape and their consequences for individuals and society as a whole (Nissenbaum, 2011), as in the widely publicized case of Facebook and Cambridge Analytica.
Due in part to these alarms being raised in the court of public opinion, data collection and behavioral advertising have attracted lawsuits and legislative inquiries (Evans, 2009), escalating to the passing of legislation such as the GDPR in Europe (Strycharz et al., 2021), state legislation in the USA, and Australian legislation (Slattery & Krawitz, 2014).
Unfortunately, as was made plainly visible by the recent congressional hearings in which social media owners were questioned about their online practices and closely scrutinized, the rapid advances in data processing technologies, paired with the slow pace of the legislative process and policymakers' lack of technological understanding, have rendered current legislation inadequate to keep up with the rapid pace of the industry.
While legislation such as the GDPR has been seen as a move towards establishing the user as the default owner of their personal data, confusion about its implementation and scope (Degeling et al., 2019), along with inevitable loopholes, has allowed websites to appear compliant with current regulation while, in reality, deceiving and manipulating users into parting with their personal data.
The literature in the field of online privacy and security has identified a growing concern among internet users about their privacy, their personal data, and how it is used in the data market. A strong desire for control over personal data is identified, but also a lack of action to secure this control, a discrepancy Barth and de Jong (2017) name the privacy paradox, which stems from failures in risk-benefit analysis compounded by common cognitive biases and heuristics.
- Barth, S., & de Jong, M. D. (2017). The privacy paradox – Investigating discrepancies between expressed privacy concerns and actual online behavior – A systematic literature review. Telematics and Informatics, 34(7), 1038–1058. https://doi.org/10.1016/j.tele.2017.04.013
- Degeling, M., Utz, C., Lentzsch, C., et al. (2019). We Value Your Privacy … Now Take Some Cookies. Informatik Spektrum, 42, 345–346. https://doi.org/10.1007/s00287-019-01201-1
- Evans, D. S. (2009). The Online Advertising Industry: Economics, Evolution, and Privacy. The Journal of Economic Perspectives, 23(3), 37–60. https://doi.org/10.1257/jep.23.3.37
- Nissenbaum, H. (2011). A Contextual Approach to Privacy Online. Daedalus (Cambridge, Mass.), 140(4), 32–48. https://doi.org/10.1162/DAED_a_00113
- Schwartz, P., & Solove, D. (2011). The PII problem: Privacy and a new concept of personally identifiable information. NYU Law Review, 86, 1814.
- Slattery, R., & Krawitz, M. (2014). Mark Zuckerberg, the Cookie Monster: Australian privacy law and internet cookies. Flinders Law Journal, 16(1), 1–41.
- Strycharz, J., Smit, E., Helberger, N., & van Noort, G. (2021). No to cookies: Empowering impact of technical and legal knowledge on rejecting tracking cookies. Computers in Human Behavior, 120, Article 106750. https://doi.org/10.1016/j.chb.2021.106750
- Warner, R., & Sloan, R. H. (2012). Behavioral advertising: From “One-Sided Chicken” to informational norms. Vanderbilt Journal of Entertainment and Technology Law, 15(1), 49.
Note: The content of this blog post was written as part of the process of drafting my thesis' theoretical background, as a way to organize my ideas and clean them up in writing. The posts were temporarily taken down to protect the academic work's originality, and were reinstated after the publication of the thesis in the Swedish registry.