Gender Bender Bot - The Effect of (Not) Following Gender Stereotypes in Conversational Agent Design

Authors

Fabian Hildebrandt
Dresden University of Technology

Keywords:

Conversational Agents, Anthropomorphism, Avatar, Name, Gender, Stereotypes

Synopsis

This is a Chapter in:

Book:
Smart and Sustainable Applications

Print ISBN 978-1-6692-0006-2
Online ISBN 978-1-6692-0005-5

Series:
Chronicle of Computing

Chapter Abstract:
Conversational agents (CAs) are increasingly becoming a common presence in our daily lives (e.g., Alexa or ChatGPT). Research has shown that designing CAs to be humanlike (e.g., through social cues such as a human name or avatar) results in a higher perception of humanness, which increases users' service satisfaction. In this context, CAs are exclusively designed to portray stereotypical genders (e.g., combining a female name and avatar). To challenge this quasi-standard, a 2×2 experiment (male/female avatar × male/female name) with 262 participants was conducted to investigate the effect of gender-mixed CAs. Our results indicate that users of CAs with a stereotypical gender report higher service satisfaction and, for male CAs, a partially higher perception of social presence.
However, the results do not reveal any differences in perceived empathy and competence. Thus, it appears that users prefer stereotypically gendered CAs, which is in sync with current practice.

Cite this paper as:
Hildebrandt F. (2024) Gender Bender Bot - The Effect of (Not) Following Gender Stereotypes in Conversational Agent Design. In: Tiako P.F. (ed) Smart and Sustainable Applications. Chronicle of Computing. OkIP. https://doi.org/10.55432/978-1-6692-0005-5_12

Presented at:
The 2023 OkIP International Conference on Automated and Intelligent Systems (CAIS) in Oklahoma City, Oklahoma, USA, and Online, on October 2-5, 2023

Contact:
Fabian Hildebrandt
fabian.hildebrandt@tu-dresden.de

Published

January 27, 2024

Online ISSN

2831-350X

Print ISSN

2831-3496