Mar 8, 2022 | Privacy & Security

Privacy Dark Patterns: Manipulating consent

Maria Arango Kure


Article Length: 711 words | Est. Reading Time: 4 minutes

In the fifth part of this series, I discuss Dark Patterns: the ways online platforms have found to stay within existing regulation while still manipulating and nudging users toward disclosing as much data as possible.

Warner & Sloan (2012) and Nissenbaum (2011) refer to the informed consent model of privacy, a combination of transparency and choice, that lies at the core of current privacy protection regulation. This model is presented as the ideal approach for users to regain control over their personal data, determining for themselves its best use and the cost-benefit trade-off that suits their needs.

To a certain extent, web visitors have some control over their private information. Web browsers allow users to reject all cookies; cookie notices are in place to let users, at least in theory, make an informed choice about which types of cookies they allow when visiting a site; and cookie policies are meant to inform and educate users about the full extent to which their data is processed, retained, and transferred once they agree to it being captured. Some people go through these steps to protect their privacy and prevent tracking, but as Kulyk et al. (2018) identify, almost all end up accepting the cookies. Ignorance and a lack of obtrusive safeguards facilitate the acceptance of tracking: many remain unaware of existing tracking technologies, or are at least unclear about how they work.

Both Warner & Sloan (2012) and Nissenbaum (2011) identify issues with the transparency and choice model as it is applied in current online practice. For the former, the issue lies in the lack of viable alternatives to data collection and the cost of declining pay-with-data transactions. The latter goes one step further, calling out the way these choices are framed and raising ethical concerns about how this framing and these costs subvert individuals' supposedly freely chosen transactions.

This is where the concept of Dark Patterns comes into play. The term, coined by Harry Brignull (2010), refers to interface designs that nudge users toward behavior that is against their best interests. Often seen in sales transactions, Dark Patterns can also be identified in cookie banners and notices and other privacy features of common websites (Hausner & Gertz, 2021), along with unnecessarily complex and confusing terms of service and privacy policies.

The objective of these Dark Patterns and confusing terms of service is to keep users from declining data collection by inhibiting their ability to fully understand or engage with their choices. Through Dark Patterns and other manipulative tactics, website owners and advertisers maintain the power dynamics of data transactions (Bornschein et al., 2020) in their own interest, incentivizing data collection for monetization.

Short of some recently proposed regulation in a few US states, Dark Patterns remain legal. They threaten autonomous and informed consent through manipulation in a way that challenges ethics, but one that would be difficult to prove in a court of law.

References

  • Bornschein, R., Schmidt, L., & Maier, E. (2020). The Effect of Consumers’ Perceived Power and Risk in Digital Information Privacy: The Example of Cookie Notices. Journal of Public Policy & Marketing, 39(2), 135–154. https://doi.org/10.1177/0743915620902143
  • Brignull, H. (2010). Dark Patterns – User Interfaces Designed to Trick People. https://darkpatterns.org/.
  • Hausner, P., & Gertz, M. (2021). Dark Patterns in the Interaction with Cookie Banners. arXiv:2103.14956v1
  • Kulyk, O., Hilt, A., Gerber, N., & Volkamer, M. (2018). “This Website Uses Cookies”: users’ perceptions and reactions to the cookie disclaimer. In: Proceedings 3rd European Workshop on Usable Security. https://doi.org/10.14722/eurousec.2018.23012
  • Nissenbaum, H. (2011). A Contextual Approach to Privacy Online. Daedalus (Cambridge, Mass.), 140(4), 32–48. https://doi.org/10.1162/DAED_a_00113
  • Warner, R., & Sloan, R. H. (2012). Behavioral advertising: from “One-Sided Chicken” to informational norms. Vanderbilt Journal of Entertainment and Technology Law, 15(1), 49–.

Note: The content of this blog post was written as part of the process of writing my thesis’ theoretical background, as a way to organize my ideas and clean them up in writing. The posts were temporarily taken down to prioritize the academic work’s originality, and have been reinstated after the publication of the thesis in the Swedish registry.
