Dark pattern

A dark pattern is "a user interface that has been carefully crafted to trick users into doing things, such as buying overpriced insurance with their purchase or signing up for recurring bills".[1][2][3] User experience designer Harry Brignull coined the neologism on 28 July 2010 with the registration of darkpatterns.org, a "pattern library with the specific goal of naming and shaming deceptive user interfaces".[4][5][6] More broadly, dark patterns supplant "user value...in favor of shareholder value".[7]

In 2021 the Electronic Frontier Foundation and Consumer Reports created a tip line to collect information about dark patterns from the public.[8]

Patterns

Bait-and-switch

Bait-and-switch patterns advertise a product or service that is free (or greatly reduced in price) but is wholly unavailable or stocked in very small quantities. After announcing the product's unavailability, the page presents similar products at higher prices or of lesser quality.[9][10]

Confirmshaming

Confirmshaming uses shame to drive users to act; for example, a website may word the option to decline an email newsletter in a way that shames visitors into accepting.[10][11]

Misdirection

Common in software installers, misdirection presents the user with a button styled like a typical continuation button. A dark pattern would show a prominent "I accept these terms" button asking the user to accept the terms of a program unrelated to the one they are trying to install.[12] Because users typically accept such terms by force of habit, the unrelated program can subsequently be installed. The installer's authors do this because the authors of the unrelated program pay for each installation procured this way. The alternative route through the installer, which lets the user skip the unrelated program, is displayed much less prominently,[13] or worded counter-intuitively (such as requiring the user to decline the terms of service).

Some websites that ask for information that is not required also use misdirection. For example, a user fills in a username and password on one page and, after clicking the "next" button, is asked for an email address with another "next" button as the only visible option.[14] This hides the option to proceed without entering the information. In some cases, the page shows the way to skip the step as a small, greyed-out link instead of a button, so that it does not stand out to the user.[15] Other examples include sites that offer a way to invite friends by entering their email addresses, to upload a profile picture, or to identify interests.

Confusing wording may also be used to trick users into formally accepting an option which they believe has the opposite meaning, for example a personal data processing consent button labelled "don't not sell my personal information".[16]
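The effect of such double negatives can be sketched in a few lines of code. This is an illustrative model only; the function and field names below are hypothetical and not taken from any real site's implementation:

```typescript
// Hypothetical sketch: how a double-negative label inverts the meaning
// of a consent checkbox. Ticking a box labelled
// "Don't not sell my personal information" actually PERMITS the sale,
// because the two negations cancel out.

type ConsentState = { sellPersonalInfo: boolean };

// Double-negative label: checked === allow selling.
function applyDoubleNegativeCheckbox(checked: boolean): ConsentState {
  return { sellPersonalInfo: checked };
}

// A plainly worded "Do not sell my personal information" control
// maps the same user action the opposite way.
function applyPlainCheckbox(checkedDoNotSell: boolean): ConsentState {
  return { sellPersonalInfo: !checkedDoNotSell };
}

// The identical action (ticking the box) yields opposite outcomes,
// which is exactly the confusion the pattern exploits.
console.log(applyDoubleNegativeCheckbox(true).sellPersonalInfo); // sale allowed
console.log(applyPlainCheckbox(true).sellPersonalInfo);          // sale blocked
```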

Roach motel

A roach motel or trammel net design provides an easy or straightforward path in but a difficult path out.[17] Examples include businesses that require subscribers to print and mail an opt-out or cancellation request.[9][10]

Research

In 2016 and 2017, researchers documented social media anti-privacy practices that rely on dark patterns.[18][19] In 2018 the Norwegian Consumer Council (Forbrukerrådet) published "Deceived by Design", a report on deceptive user interface designs used by Facebook, Google and Microsoft.[20] A 2019 study of 11,000 shopping websites identified 1,818 instances of dark patterns in total, grouped into 15 categories.[21]

Under the European Union General Data Protection Regulation (GDPR), all companies must obtain unambiguous, freely given consent from customers before they collect and use ("process") their personally identifiable information. A 2020 study found that "big tech" companies often used deceptive user interfaces to discourage their users from opting out.[22]

Legality

Bait-and-switch is a form of fraud that violates US law.[23] In the European Union, the GDPR requires that a user's informed consent to the processing of their personal information be unambiguous, freely given, and specific to each use of that information. This is intended to prevent users from being made to unknowingly accept all data processing by default, which would violate the regulation.[24][25][26][27][28]

In April 2019, the UK Information Commissioner's Office (ICO) issued a proposed design code for the operations of social networking services when used by minors, which prohibits using "nudges" to draw users into options that have low privacy settings. This code would be enforceable under the GDPR.[29]

On 9 April 2019, US senators Deb Fischer and Mark Warner introduced the Deceptive Experiences To Online Users Reduction (DETOUR) Act, which would make it illegal for companies with more than 100 million monthly active users to use dark patterns when seeking consent to use their personal information.[30]

In March 2021, California adopted amendments to the California Consumer Privacy Act that prohibit the use of deceptive user interfaces having "the substantial effect of subverting or impairing a consumer's choice to opt-out".[16]

References

  1. ^ Campbell-Dollaghan, Kelsey (21 December 2016). "The Year Dark Patterns Won". CO.DESIGN. Retrieved 29 May 2017.
  2. ^ Singer, Natasha (14 May 2016). "When Websites Won't Take No For An Answer". The New York Times. Retrieved 29 May 2017.
  3. ^ Nield, David (4 April 2017). "Dark Patterns: The Ways Websites Trick Us Into Giving Up Our Privacy". Gizmodo. Retrieved 30 May 2017.
  4. ^ Brignull, Harry (1 November 2011). "Dark Patterns: Deception vs. Honesty in UI Design". A List Apart. Retrieved 29 May 2017.
  5. ^ Grauer, Yael (28 July 2016). "Dark Patterns Are Designed to Trick You, and They're All Over the Web". Ars Technica. Retrieved 29 May 2017.
  6. ^ Fussell, Sidney (2 August 2019). "The Endless, Invisible Persuasion Tactics of the Internet". The Atlantic.
  7. ^ Gray, Colin M.; Kou, Yubo; Battles, Bryan; Hoggatt, Joseph; Toombs, Austin L. (2018). "The Dark (Patterns) Side of UX Design". Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems - CHI '18. New York, New York, USA: ACM Press: 1–14. doi:10.1145/3173574.3174108. ISBN 9781450356206. S2CID 5082752.
  8. ^ "Coalition Launches 'Dark Patterns' Tip Line to Expose Deceptive Technology Design" (press release) (19 May 2021). Electronic Frontier Foundation. Retrieved 27 May 2021.
  9. ^ a b Snyder, Jesse (10 September 2012). "Dark Patterns in UI and Website Design". Envato Tuts+. Retrieved 29 May 2017.
  10. ^ a b c Brignull, Harry. "Types of Dark Patterns". Dark Patterns. Retrieved 29 May 2017.
  11. ^ "UX Dark Patterns: Manipulinks and Confirmshaming". UX Booth. Retrieved 2 November 2019.
  12. ^ "Terms of service for McAfee in μTorrent installer". 2017. Retrieved 13 October 2018.
  13. ^ Brinkmann, Martin (17 July 2013). "SourceForge's new Installer bundles program downloads with adware". Retrieved 13 October 2018. ... The offer is displayed on the screen, and below that a gray decline button, a green accept button ...
  14. ^ "Why do we need email addresses to create Reddit accounts now?". 2017. Retrieved 13 October 2018. ... you can skip it by leaving it blank.
  15. ^ Schlosser, Dan (5 June 2016). "LinkedIn Dark Patterns". Retrieved 13 October 2018. ... you need to find the tiny "Skip this step" link at the bottom right to proceed. Moreover, the link is placed outside of the blue box which ostensibly contains all relevant info or controls. ...
  16. ^ a b Vincent, James (16 March 2021). "California bans 'dark patterns' that trick users into giving away their personal data". The Verge. Retrieved 21 March 2021.
  17. ^ Brignull, Harry (29 August 2013). "Dark patterns: Inside the interfaces designed to trick you". The Verge. Retrieved 29 May 2017.
  18. ^ Bösch, Christoph; Erb, Benjamin; Kargl, Frank; Kopp, Henning; Pfattheicher, Stefan (1 October 2016). "Tales from the Dark Side: Privacy Dark Strategies and Privacy Dark Patterns". Proceedings on Privacy Enhancing Technologies. 2016 (4): 237–254. doi:10.1515/popets-2016-0038. ISSN 2299-0984.
  19. ^ Fritsch, Lothar (2017). Privacy dark patterns in identity management. Gesellschaft für Informatik, Bonn. ISBN 978-3-88579-671-8.
  20. ^ Moen, Gro Mette; Ravna, Ailo Krogh; Myrstad, Finn (2018). Deceived by Design – How tech companies use dark patterns to discourage us from exercising our rights to privacy (report). Consumer Council of Norway / Forbrukerrådet.
  21. ^ Mathur, Arunesh; Acar, Gunes; Friedman, Michael J.; Lucherini, Elena; Mayer, Jonathan; Chetty, Marshini; Narayanan, Arvind (November 2019). "Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites". Proc. ACM Hum.-Comput. Interact. 3 (CSCW): 81:1–81:32. arXiv:1907.07032. Bibcode:2019arXiv190707032M. doi:10.1145/3359183. ISSN 2573-0142. S2CID 196831872.
  22. ^ Human, Soheil; Cech, Florian (2021). Zimmermann, Alfred; Howlett, Robert J.; Jain, Lakhmi C. (eds.). "A Human-Centric Perspective on Digital Consenting: The Case of GAFAM". Human Centred Intelligent Systems. Smart Innovation, Systems and Technologies. Singapore: Springer. 189: 139–159. doi:10.1007/978-981-15-5784-2_12. ISBN 978-981-15-5784-2. S2CID 214699040.
  23. ^ Title 16 of the Code of Federal Regulations § 238
  24. ^ "Understanding 'trust' and 'consent' are the real keys to embracing GDPR". The Drum. Retrieved 10 April 2019.
  25. ^ "Facebook and Google hit with $8.8 billion in lawsuits on day one of GDPR". The Verge. Archived from the original on 25 May 2018. Retrieved 26 May 2018.
  26. ^ "Max Schrems files first cases under GDPR against Facebook and Google". The Irish Times. Archived from the original on 25 May 2018. Retrieved 26 May 2018.
  27. ^ "Facebook, Google face first GDPR complaints over 'forced consent'". TechCrunch. Archived from the original on 26 May 2018. Retrieved 26 May 2018.
  28. ^ Meyer, David. "Google, Facebook hit with serious GDPR complaints: Others will be soon". ZDNet. Archived from the original on 28 May 2018. Retrieved 26 May 2018.
  29. ^ "Under-18s face 'like' and 'streaks' limits". BBC News. 15 April 2019. Retrieved 15 April 2019.
  30. ^ Kelly, Makena (9 April 2019). "Big Tech's 'dark patterns' could be outlawed under new Senate bill". The Verge. Retrieved 10 April 2019.
