Paper presented at the XX Brazilian Symposium on Human Factors in Computing Systems (IHC 2021).
On some shopping websites, such as those of airlines, when the customer is near the end of the purchase process for a product or service, with only the confirmation step left before being charged, the site automatically adds an extra product or service to the shopping cart (e.g., an extended warranty or additional insurance) that the user did not request and that raises the final price. If the user does not notice this additional item, they will end up paying more than expected. This kind of unethical practice has occurred for decades both on and off the internet, and it has been attracting growing attention from researchers in Computer Science and especially in Human-Computer Interaction (HCI) [35].
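The checkout mechanism described above can be sketched in a few lines of code. This is a hypothetical illustration, not the implementation of any real store: all names and prices (`Item`, `Cart`, `checkout`, the insurance add-on) are invented for this sketch. Prices are in cents to avoid floating-point issues.

```python
# Hypothetical sketch of the "sneak into basket" deceptive pattern:
# an optional add-on is pre-selected during checkout, so the final
# charge exceeds the price the user agreed to unless they opt out.
from dataclasses import dataclass, field

@dataclass
class Item:
    name: str
    price_cents: int

@dataclass
class Cart:
    items: list = field(default_factory=list)

    def total_cents(self) -> int:
        return sum(item.price_cents for item in self.items)

INSURANCE = Item("Travel insurance", 4990)

def checkout(cart: Cart, addon_preselected: bool = True) -> int:
    # The deceptive step: the add-on is opt-out, not opt-in.
    # The user must notice it and remove it to avoid the charge.
    if addon_preselected:
        cart.items.append(INSURANCE)
    return cart.total_cents()

cart = Cart([Item("Flight ticket", 50000)])
print(checkout(cart))   # 54990 cents, not the expected 50000
```

The honest version of this flow would simply default `addon_preselected` to `False`, making the extra service an explicit choice.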
Although deceptive patterns are now heavily exploited on the internet, they have existed for a long time. A notorious example is companies that offer free trial periods for their products in exchange for customer registration (e.g., a magazine subscription): when signing up, the person is required to provide their credit card information, and when the free period ends, the company does not warn the customer and automatically charges the subscription to the card, regardless of whether the customer wants it.
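The trial-to-subscription mechanism above can also be sketched as code. This is a hypothetical illustration under invented names and values (`TrialSubscription`, the 30-day trial, the fee); it is not any real company's billing system.

```python
# Hypothetical sketch of the "forced continuity" pattern: a free trial
# that requires a credit card up front and silently converts into a
# paid subscription when the trial expires.
from datetime import date, timedelta

class TrialSubscription:
    def __init__(self, card_number: str, start: date,
                 trial_days: int = 30, monthly_fee_cents: int = 2990):
        self.card_number = card_number      # card demanded at sign-up
        self.trial_end = start + timedelta(days=trial_days)
        self.monthly_fee_cents = monthly_fee_cents
        self.charges = []

    def bill(self, today: date) -> int:
        # The deceptive step: no notification is sent before the trial
        # ends, and no explicit consent is requested to start charging.
        if today > self.trial_end:
            self.charges.append(self.monthly_fee_cents)
            return self.monthly_fee_cents
        return 0

sub = TrialSubscription("4111...", start=date(2021, 1, 1))
print(sub.bill(date(2021, 1, 15)))   # still in trial: charges 0
print(sub.bill(date(2021, 2, 15)))   # trial over: silently charges 2990
```

A non-deceptive design would require the customer to confirm the subscription at the end of the trial, or at least notify them before the first charge.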
Deceptive patterns also exist in the physical world. Figure 2 illustrates the example of a park bench that prevents a person from lying down, or two people from sitting close together, because a metal bar has been installed in the middle of the bench. The design of this bench is a deceptive pattern because it favors the interests of third parties (e.g., the city government), who do not want people lying on park benches, at the expense of citizens who might want to lie down or sit next to someone else. It is an example of oppressive design [27], also known as unpleasant or hostile design [28,29], because it predetermines who may use the bench and how. Hostile design has been linked to aporophobia, the hatred of the poor [56]. A tired person looking for a place to rest, a couple on a date, or a mother who wants to sit next to her children will all be forced to behave differently from what they expected or wished.
Figure 2. Park bench with an installed metal bar, an example of a physical deceptive pattern.
We can therefore understand deceptive patterns as a sociotechnical phenomenon: they are deliberately applied in technical artifacts, digital or physical, to benefit a third party at the expense of those who use the artifacts. To manifest themselves, deceptive patterns depend both on technical aspects, which operationalize them and give them form in a product or service, and on social aspects, which permeate the situated context in which their use occurs and acquires meaning.
Source: Image from Freepik
[1] ALEXANDER, Christopher. The Timeless Way of Building. New York: Oxford University Press, 1979.
[2] BECK, Kent; CUNNINGHAM, Ward. Using pattern languages for object-oriented programs. In: OOPSLA-87 Workshop on Specification and Design for Object-Oriented Programming. 1987.
[3] KOENIG, Andrew. Patterns and antipatterns. The patterns handbook: techniques,
strategies, and applications, v. 13, p. 383, 1998.
[4] BRIGNULL, Harry. Dark Patterns, 2010. Home page. Available at: <https://www.darkpatterns.org/>. Accessed: Jan. 10, 2021.
[5] CONTI, Gregory; SOBIESK, Edward. Malicious interface design: exploiting the user. In:
Proceedings of the 19th international conference on World wide web. 2010. p. 271-280.
[6] GRAY, Colin M. et al. The dark (patterns) side of UX design. In: Proceedings of the 2018
CHI Conference on Human Factors in Computing Systems. 2018. p. 1-14.
[7] MATHUR, Arunesh et al. Dark patterns at scale: Findings from a crawl of 11K shopping
websites. Proceedings of the ACM on Human-Computer Interaction, v. 3, n. CSCW, p.
1-32, 2019.
[8] LACEY, Cherie; CAUDWELL, Catherine. Cuteness as a ‘Dark Pattern’ in home robots. In:
2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
IEEE, 2019. p. 374-381.
[9] GREENBERG, Saul et al. Dark patterns in proxemic interactions: a critical perspective.
In: Proceedings of the 2014 conference on Designing interactive systems. 2014. p.
523-532.
[10] ZAGAL, José P.; BJÖRK, Staffan; LEWIS, Chris. Dark patterns in the design of games.
In: Foundations of Digital Games 2013. 2013.
[11] BHOOT, Aditi M.; SHINDE, Mayuri A.; MISHRA, Wricha P. Towards the Identification of Dark Patterns: An Analysis Based on End-User Reactions. In: IndiaHCI '20: Proceedings of the 11th Indian Conference on Human-Computer Interaction. 2020. p. 24-33.
[12] DI GERONIMO, Linda et al. UI dark patterns and where to find them: a study on mobile
applications and user perception. In: Proceedings of the 2020 CHI Conference on Human
Factors in Computing Systems. 2020. p. 1-14.
[13] KITCHENHAM, Barbara; CHARTERS, Stuart. Guidelines for performing systematic
literature reviews in software engineering. 2007.
[14] BASILI, Victor R.; ROMBACH, H. Dieter. The TAME project: Towards
improvement-oriented software environments. IEEE Transactions on software
engineering, v. 14, n. 6, p. 758-773, 1988.
[15] BÖSCH, Christoph et al. Tales from the dark side: Privacy dark strategies and privacy
dark patterns. Proceedings on Privacy Enhancing Technologies, v. 2016, n. 4, p.
237-254, 2016.
[16] KITCHENHAM, Barbara; CHARTERS, Stuart. Guidelines for performing systematic
literature reviews in software engineering. 2007.
[17] NOUWENS, Midas et al. Dark patterns after the GDPR: Scraping consent pop-ups and
demonstrating their influence. In: Proceedings of the 2020 CHI Conference on Human
Factors in Computing Systems. 2020. p. 1-13.
[18] TRICE, Michael; POTTS, Liza. Building dark patterns into platforms: How GamerGate
perturbed Twitter’s user experience. Present Tense: A Journal of Rhetoric in Society, v.
6, n. 3, 2018.
[19] CHROMIK, Michael et al. Dark Patterns of Explainability, Transparency, and User
Control for Intelligent Systems. In: IUI Workshops. 2019.
[20] WALDMAN, Ari Ezra. Cognitive biases, dark patterns, and the ‘privacy paradox’.
Current opinion in psychology, v. 31, p. 105-109, 2020.
[21] LUGURI, Jamie; STRAHILEVITZ, Lior. Shining a light on dark patterns. U of Chicago,
Public Law Working Paper, n. 719, 2019.
[22] CHIVUKULA, Shruthi Sai et al. "Nothing Comes Before Profit": Asshole Design in the Wild. In: Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems. 2019. p. 1-6.
[23] LEITÃO, Carla et al. Human Values in HCI: a challenge for the GrandIHC-BR. In:
Proceedings of the XVI Brazilian Symposium on Human Factors in Computing
Systems. 2017. p. 1-6.
[24] MAIER, Maximilian; HARR, Rikard. Dark design patterns: An end-user perspective. Human Technology, v. 16, n. 2, 2020.
[25] GRAY, Colin M. et al. End User Accounts of Dark Patterns as Felt Manipulation.
Proceedings of the ACM on Human-Computer Interaction, v. 5, n. CSCW2, p. 1-25,
2021.
[26] UTZ, Christine et al. (Un)informed consent: Studying GDPR consent notices in the field. In: Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security. 2019. p. 973-990.
[27] GONZATTO, Rodrigo Freese; VAN AMSTEL, Frederick MC. Designing oppressive and
libertarian interactions with the conscious body. In: Proceedings of the XVI Brazilian
Symposium on Human Factors in Computing Systems. 2017. p. 1-10.
[28] SAVIC, Selena; SAVICIC, Gordan. Unpleasant design. Designing out unwanted
behaviour. In: Proceedings of the 5th STS Italia Conference: A Matter of Design.
Making Society through Science and Technology. 2014. p. 975-988.
[29] ROSENBERGER, Robert. On hostile design: Theoretical and empirical prospects.
Urban Studies, v. 57, n. 4, p. 883-893, 2020.
[30] STAMPER, Ronald. A semiotic theory of information and information systems. In:
Invited papers for the ICL/University of Newcastle Seminar on Information. 1993.
[31] LANDIS, J. Richard; KOCH, Gary G. The measurement of observer agreement for
categorical data. Biometrics, p. 159-174, 1977.
[32] ACQUISTI, Alessandro et al. Nudges for privacy and security: Understanding and
assisting users’ choices online. ACM Computing Surveys (CSUR), v. 50, n. 3, p. 1-41,
2017.
[33] TVERSKY, Amos; KAHNEMAN, Daniel. The framing of decisions and the psychology of
choice. Science, v. 211, n. 4481, p. 453-458, 1981.
[34] boyd, danah. 2008. "Understanding Socio-Technical Phenomena in a Web2.0 Era."
MSR New England Lab Opening, Cambridge MA, September 22.
[35] MATHUR, Arunesh; KSHIRSAGAR, Mihir; MAYER, Jonathan. What makes a dark
pattern... dark? design attributes, normative considerations, and measurement methods. In:
Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems.
2021. p. 1-18.
[36] BARONI, Luiz Adolpho et al. Dark Patterns: Towards a Socio-technical Approach. In:
Proceedings of the XX Brazilian Symposium on Human Factors in Computing
Systems. 2021. p. 1-7.
[37] FOGG, Brian J. Persuasive technologies. Communications of the ACM, v. 42, n. 5, p. 26-29, 1999.
[38] GRAY, Colin M.; CHIVUKULA, Shruthi Sai. Ethical mediation in UX practice. In:
Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems.
2019. p. 1-11.
[39] CHIVUKULA, Shruthi Sai; GRAY, Colin M.; BRIER, Jason A. Analyzing value discovery
in design decisions through ethicography. In: Proceedings of the 2019 CHI Conference on
Human Factors in Computing Systems. 2019. p. 1-12.
[40] BORNING, Alan et al. SurveillanceCapitalism@CHI: Civil Conversation around a
Difficult Topic. In: Extended Abstracts of the 2020 CHI Conference on Human Factors in
Computing Systems. 2020. p. 1-6.
[41] NYSTRÖM, Tobias; STIBE, Agnis. When Persuasive Technology Gets Dark? In:
European, Mediterranean, and Middle Eastern Conference on Information Systems.
Springer, Cham, 2020. p. 331-345.
[42] MAIKE, Vanessa RML; BUCHDID, Samuel B.; BARANAUSKAS, M. Cecília C.
Designing natural user interfaces scenarios for all and for some: an analysis informed by organizational semiotics artifacts. In: International Conference on Informatics and
Semiotics in Organisations. Springer, Cham, 2015. p. 92-101.
[43] FERRARI, Bernardo et al. Design socialmente consciente de jogos: relato de uma
oficina prática para o entendimento do problema e prospecção de ideias. In: Anais do I
Workshop sobre Interação e Pesquisa de Usuários no Desenvolvimento de Jogos.
SBC, 2019. p. 11-20.
[44] MÄKELÄ, Ville et al. Acceptance and perceptions of interactive location-tracking
displays. In: Proceedings of the 8th ACM International Symposium on Pervasive
Displays. 2019. p. 1-7.
[45] FITTON, Dan; READ, Janet C. Creating a framework to support the critical consideration of dark design aspects in free-to-play apps. In: Proceedings of the 18th ACM International Conference on Interaction Design and Children. 2019. p. 407-418.
[46] CHANG, Daphne et al. Engineering information disclosure: Norm shaping designs. In:
Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems.
2016. p. 587-597.
[47] ILANY TZUR, Naama; ZALMANSON, Lior; OESTREICHER-SINGER, Gal. The Dark Side of User Participation: The Effect of Calls to Action on Trust and Information Revelation. Available at SSRN 2814903, 2016.
[48] MOSER, Carol; SCHOENEBECK, Sarita Y.; RESNICK, Paul. Impulse buying: Design
practices and consumer needs. In: Proceedings of the 2019 CHI Conference on Human
Factors in Computing Systems. 2019. p. 1-15.
[49] AYDEMIR, Fatma Basak; DALPIAZ, Fabiano. A roadmap for ethics-aware software
engineering. In: 2018 IEEE/ACM International Workshop on Software Fairness
(FairWare). IEEE, 2018. p. 15-21.
[50] MONAHAN, Torin. Built to lie: Investigating technologies of deception, surveillance, and
control. The Information Society, v. 32, n. 4, p. 229-240, 2016.
[51] DAVIS, Janet. Design methods for ethical persuasive computing. In: Proceedings of
the 4th International Conference on Persuasive Technology. 2009. p. 1-8.
[52] NORVAL, Chris et al. Reclaiming data: Overcoming app identification barriers for
exercising data protection rights. In: Proceedings of the 2018 ACM International Joint
Conference and 2018 International Symposium on Pervasive and Ubiquitous
Computing and Wearable Computers. 2018. p. 921-930.
[53] LANDWEHR, Marvin; BORNING, Alan; WULF, Volker. The High Cost of Free Services:
Problems with Surveillance Capitalism and Possible Alternatives for IT Infrastructure. In:
Proceedings of the Fifth Workshop on Computing within Limits. 2019. p. 1-10.
[54] FANSHER, Madison; CHIVUKULA, Shruthi Sai; GRAY, Colin M. #darkpatterns: UX
Practitioner Conversations About Ethical Design. In: Extended Abstracts of the 2018 CHI
Conference on Human Factors in Computing Systems. 2018. p. 1-6.
[55] SANCHEZ-ROLA, Iskander et al. Can I opt out yet? GDPR and the global illusion of cookie control. In: Proceedings of the 2019 ACM Asia Conference on Computer and Communications Security. 2019. p. 340-351.
[56] WESTIN, Fiona; CHIASSON, Sonia. Opt out of privacy or "go home": Understanding reluctant privacy behaviours through the FoMO-centric design paradigm. In: Proceedings of the New Security Paradigms Workshop. 2019. p. 57-67.
[57] ROGERS, Yvonne et al. The Dark Side of Interaction Design. In: Extended Abstracts
of the 2020 CHI Conference on Human Factors in Computing Systems. 2020. p. 1-4.
[58] PAAY, Jeni; ROGERS, Yvonne. The Dark Side of Interaction Design: Nudges, Dark
Patterns and Digital Addiction: Panel Presented at OZCHI 2019. In: Proceedings of the
31st Australian Conference on Human-Computer-Interaction. 2019. p. 2-2.
[59] GRAY, Colin M.; CHIVUKULA, Shruthi Sai; LEE, Ahreum. What Kind of Work Do "Asshole Designers" Create? Describing Properties of Ethical Concern on Reddit. In: Proceedings of the 2020 ACM Designing Interactive Systems Conference. 2020. p. 61-73.
[60] NARAYANAN, Arvind et al. Dark Patterns: Past, Present, and Future: The evolution of
tricky user interfaces. Queue, v. 18, n. 2, p. 67-92, 2020.
[61] WIDDICKS, Kelly; PARGMAN, Daniel; BJÖRK, Staffan. Backfiring and favouring: how
design processes in HCI lead to anti-patterns and repentant designers. In: Proceedings of
the 11th Nordic Conference on Human-Computer Interaction: Shaping Experiences,
Shaping Society. 2020. p. 1-12.
[62] WIDDICKS, Kelly. When the Good Turns Ugly: Speculating Next Steps for Digital
Wellbeing Tools. In: Proceedings of the 11th Nordic Conference on Human-Computer
Interaction: Shaping Experiences, Shaping Society. 2020. p. 1-6.
[63] PINDER, Charlie. The anti-influence engine: Escaping the diabolical machine of
pervasive advertising. In: Proceedings of the 2017 CHI Conference Extended Abstracts
on Human Factors in Computing Systems. 2017. p. 770-781.
[64] CONTI, Gregory; SOBIESK, Edward. Malicious interfaces and personalization's
uninviting future. IEEE Security & Privacy, v. 7, n. 3, p. 64-67, 2009.
[65] CORNELIUS, Kristin B. Zombie contracts, dark patterns of design, and
'documentisation'. Internet Policy Review, v. 8, n. 2, p. 1-25, 2019.
[66] ADAR, Eytan; TAN, Desney S.; TEEVAN, Jaime. Benevolent deception in human
computer interaction. In: Proceedings of the SIGCHI conference on human factors in
computing systems. 2013. p. 1863-1872.
[67] SOE, Than Htut et al. Circumvention by design: Dark patterns in cookie consent for
online news outlets. In: Proceedings of the 11th Nordic Conference on
Human-Computer Interaction: Shaping Experiences, Shaping Society. 2020. p. 1-12.
[68] NORWEGIAN CONSUMER COUNCIL. Deceived by design: How tech companies use dark patterns to discourage us from exercising our rights to privacy. Norwegian Consumer Council Report, 2018.
[69] DAVIS, Fred D. Perceived usefulness, perceived ease of use, and user acceptance of
information technology. MIS Quarterly, p. 319-340, 1989.
[70] ZUBOFF, Shoshana. The age of surveillance capitalism: The fight for a human future at the new frontier of power. Profile Books, 2019.
[71] KAHNEMAN, Daniel. A perspective on judgment and choice: mapping bounded
rationality. American psychologist, v. 58, n. 9, p. 697, 2003.
[72] DECI, Edward L.; RYAN, Richard M. Intrinsic motivation and self-determination in
human behavior. Springer Science & Business Media, 2013.
[73] PRZYBYLSKI, Andrew K. et al. Motivational, emotional, and behavioral correlates of
fear of missing out. Computers in human behavior, v. 29, n. 4, p. 1841-1848, 2013.
[74] SALES, H. B. Standard Form Contracts. Mod. L. Rev., v. 16, p. 318, 1953.
[75] KESSLER, Friedrich. Contracts of adhesion: Some thoughts about freedom of contract.
Columbia Law Review, v. 43, n. 5, p. 629-642, 1943.
[76] BURKE, John JA. Contract as commodity: a non-fiction statutory approach. Statute L.
Rev., v. 21, p. 12, 2000.
[77] General Data Protection Regulation - GDPR. Home page. Available at: <https://gdpr-info.eu/>. Accessed: Mar. 20, 2022.
[78] FRIEDMAN, Batya; HENDRY, David G. Value sensitive design: Shaping technology
with moral imagination. Mit Press, 2019.
[79] FRIEDMAN, Batya; KAHN JR, Peter H.; BORNING, Alan. Value Sensitive Design and
Information Systems. The Handbook of Information and Computer Ethics, p. 69-101,
2008.
[80] FLANAGAN, Mary; NISSENBAUM, Helen. Values at play in digital games. MIT Press,
2014.
[81] STRONG, Edward Kellogg. The psychology of selling and advertising. McGraw-Hill Book Company, 1925.
[82] LANGER, Ellen J. The illusion of control. Journal of personality and social
psychology, v. 32, n. 2, p. 311, 1975.
[83] SCHEIBER, Noam. How Uber uses psychological tricks to push its drivers’ buttons. The New York Times, 2017. Available at: <https://www.nytimes.com/interactive/2017/04/02/technology/uber-drivers-psychological-tricks.html>. Accessed: Jan. 20, 2022.
[84] BIZER, George Y.; SCHINDLER, Robert M. Direct evidence of ending‐digit drop‐off in
price information processing. Psychology & Marketing, v. 22, n. 10, p. 771-783, 2005.