
Graduate Student: 郭哲伶 (Kuo, Che-Ling)
Thesis Title: The Impact of AI Chatbot Voice Gender on Users' Perceived Value of Emotional Support
Advisor: 林彣珊 (Lin, Wen-Shan)
Degree: Master
Department: Institute of International Management, College of Management
Year of Publication: 2026
Academic Year of Graduation: 114
Language: English
Number of Pages: 58
Keywords: AI Chatbot, Voice Gender, Emotional Support, Empathy, Human-AI Interaction, Gender-Neutral Voice, User Perception
    As AI chatbots become increasingly integrated into everyday life, users treat them not only as tools for providing information but also expect emotional support from their interactions. This study investigates how a chatbot's voice gender (male, female, or neutral/genderless) affects users' perceived emotional support, and further examines whether user gender moderates this relationship.
    The study draws on Media Equation Theory, the Computers as Social Actors (CASA) paradigm, and the Stereotype Content Model (SCM) to analyze how gendered and gender-neutral voice cues shape emotional responses in human-AI interaction. An experimental design was adopted in which participants interacted with chatbots featuring different voice genders and then rated the degree of emotional support they perceived.
    The results show that chatbot voice gender has a significant effect on perceived emotional support, with the neutral-voice condition receiving the highest ratings; user gender, however, did not significantly moderate this relationship. Overall, the findings suggest that gender-neutral voice design may help strengthen the emotional connection between users and chatbots, and they offer practical and design implications for developing more inclusive and emotionally supportive AI systems.

    With the increasing integration of AI chatbots into daily life, users rely on them not only for information but also for emotional support. This study examines how chatbot voice gender (male, female, and unisex) influences users' perceived emotional support and whether user gender moderates this relationship. Drawing on Media Equation Theory, the Computers as Social Actors (CASA) paradigm, and the Stereotype Content Model (SCM), this research investigates how gendered and gender-neutral voice cues shape emotional responses in human–AI interaction. An experimental design was used in which participants interacted with chatbots with different voice genders and then evaluated their perceived emotional support. The results revealed a significant main effect of chatbot voice gender: the unisex voice condition received the highest perceived emotional support, while user gender did not significantly moderate this effect. These findings suggest that gender-neutral voice designs may enhance users' emotional connection with AI chatbots and provide important implications for the development of more inclusive and emotionally supportive AI systems.
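    The design described here lends itself to a standard factorial analysis. As a rough, hypothetical sketch (not the thesis's actual code or data), the moderation test can be expressed as a two-way ANOVA, where a significant voice-gender main effect combined with a non-significant voice x user-gender interaction would match the reported pattern:

```python
# Hypothetical illustration of the 3 x 2 between-subjects analysis the
# abstract describes: chatbot voice gender (male / female / unisex) and
# user gender as factors, perceived emotional support as the outcome.
# All names, sample sizes, and effect sizes here are invented for
# illustration; they are not the thesis's actual data or code.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(42)

# Simulate 7-point Likert-style ratings; the unisex condition is given a
# higher mean only to mirror the reported pattern of results.
condition_means = {"male": 4.2, "female": 4.4, "unisex": 5.1}
rows = []
for voice, mean in condition_means.items():
    for user_gender in ("male", "female"):
        for score in rng.normal(mean, 1.0, size=30).clip(1, 7):
            rows.append({"voice": voice,
                         "user_gender": user_gender,
                         "support": score})
df = pd.DataFrame(rows)

# Two-way ANOVA: the abstract's findings would correspond to a significant
# C(voice) main effect and a non-significant C(voice):C(user_gender)
# interaction (i.e., no moderation by user gender).
model = ols("support ~ C(voice) * C(user_gender)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```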

    ABSTRACT
    ACKNOWLEDGEMENTS
    TABLE OF CONTENTS
    LIST OF TABLES
    LIST OF FIGURES
    CHAPTER ONE INTRODUCTION
      1.1 Research Background
      1.2 Research Motivation
      1.3 Research Questions
      1.4 Research Scope and Limitations
    CHAPTER TWO LITERATURE REVIEW
      2.1 Definition and Theoretical Background
      2.2 Gendered Voice and User Perception
      2.3 Unisex or Gender-Ambiguous Voices
      2.4 Moderating Role of User Gender
      2.5 Summary of Research Gaps and Conceptual Framework
    CHAPTER THREE RESEARCH DESIGN AND METHODOLOGY
      3.1 Research Design
        3.1.1 Experimental Framework
        3.1.2 Experimental Variables
      3.2 Participants
        3.2.1 Sampling Method
        3.2.2 Demographic Characteristics
      3.3 Voice Stimuli Design
      3.4 Materials
      3.5 Procedure
      3.6 Data Analysis
    CHAPTER FOUR RESEARCH RESULTS
      4.1 Overview
        4.1.1 Sample Characteristics
      4.2 Descriptive Statistics
        4.2.1 Manipulation Check Results
        4.2.2 Descriptive Statistics of Perceived Emotional Support
        4.2.3 Reliability Analysis
        4.2.4 Voice Preference Distribution
      4.3 Effect of Chatbot Voice Gender on Perceived Emotional Support (RQ1 / H1–H2)
        4.3.1 Main Effect of Voice Gender
        4.3.2 Moderating Effect of User Gender (RQ2 / H3)
      4.4 Summary of Hypothesis Testing
    CHAPTER FIVE CONCLUSION AND SUGGESTIONS
      5.1 Overview
      5.2 Theoretical Implications
      5.3 Practical Implications
      5.4 Discussion
        5.4.1 Reinterpreting Gendered Voice Effects through Theory
        5.4.2 Absence of User Gender Moderation
        5.4.3 Ambiguity of Unisex Voice as a Potential Confound
      5.5 Limitations and Future Research
      5.6 Conclusion
    REFERENCES
    APPENDICES

    Abercrombie, G., Cercas Curry, A., Pandya, M., & Rieser, V. (2021). Alexa, Google, Siri: What are Your Pronouns? Gender and Anthropomorphism in the Design and Perception of Conversational Assistants.
    Bastiansen, M. H. A., Kroon, A. C., & Araujo, T. (2022). Female chatbots are helpful, male chatbots are competent? Publizistik, 67(4), 601–623.
    Baxter, D., McDonnell, M., & McLoughlin, R. (2018). Impact of Chatbot Gender on User’s Stereotypical Perception and Satisfaction.
    Bickmore, T., & Picard, R. (2005). Establishing and Maintaining Long-Term Human-Computer Relationships. ACM Transactions on Computer-Human Interaction, 12(2), 293–327.
    Borau, S., Otterbring, T., Laporte, S., & Fosso Wamba, S. (2021). The most human bot: Female gendering increases humanness perceptions of bots and acceptance of AI. Psychology & Marketing, 38(7), 1052–1068.
    Brandtzaeg, P., & Følstad, A. (2017). Why People Use Chatbots.
    Carrilho, M., Wagner, R., Costa Pinto, D., Gonzalez-Jimenez, H., & Akdim, K. (2025). The feeling skills gap: the role of empathy in voice-driven AI for service recovery. Journal of Business Research, 201, 115703.
    Cet, M., Obaid, M., & Torre, I. (2025). Breaking the Binary: A Systematic Review of Gender-Ambiguous Voices in Human-Computer Interaction.
    Chandra, M., Hernandez, J., Ramos, G., Ershadi, M., Bhattacharjee, A., Amores, J., Okoli, E., Paradiso, A., Warreth, S., & Suh, J. (2025). Longitudinal Study on Social and Emotional Use of AI Conversational Agent.
    Chin, H., Song, H., Baek, G., Shin, M., Jung, C., Cha, M., Choi, J., & Cha, C. (2023). The Potential of Chatbots for Emotional Support and Promoting Mental Well-Being in Different Cultures: Mixed Methods Study. J Med Internet Res, 25, e51712.
    Cuddy, A. J. C., Fiske, S. T., & Glick, P. (2008). Warmth and Competence as Universal Dimensions of Social Perception: The Stereotype Content Model and the BIAS Map. In Advances in Experimental Social Psychology (Vol. 40, pp. 61–149). Academic Press.
    Cutrona, C. E., & Suhr, J. A. (1992). Controllability of stressful events and satisfaction with spouse support behaviors. Communication Research, 19(2), 154–174. https://doi.org/10.1177/009365092019002002
    Duan, W., Li, L., Freeman, G., & McNeese, N. (2025). A Scoping Review of Gender Stereotypes in Artificial Intelligence.
    Eyssel, F., & Hegel, F. (2012). (S)he's Got the Look: Gender Stereotyping of Robots. Journal of Applied Social Psychology, 42.
    Fiske, S. T., Cuddy, A. J. C., Glick, P., & Xu, J. (2002). A model of (often mixed) stereotype content: Competence and warmth respectively follow from perceived status and competition. Journal of Personality and Social Psychology, 82(6), 878–902.
    Gessinger, I., Cohn, M., Cowan, B., Zellou, G., & Möbius, B. (2023). Cross-linguistic Emotion Perception in Human and TTS Voices.
    Goodman, K., & Mayhorn, C. (2023). It's not what you say but how you say it: Examining the influence of perceived voice assistant gender and pitch on trust and reliance. Applied Ergonomics, 106, 103864.
    Han, H., Lee, Y., Zhang, C., Lu, J., & Wang, L. (2025). He or she? A male-default bias in Chatbot gender attribution across explicit and implicit measures. Computers in Human Behavior Reports, 20, 100860.
    Helme-Guizon, A., Broyer, J., Bataoui, S., & Hakimi, M. (2024). He or she? Impact of the gender of well-being chatbots on user perceptions and intentions: A study of agency, communality and trust.
    Lee, E.-J., Nass, C., & Brave, S. (2000). Can computer-generated speech have gender? An experimental test of gender stereotype. Conference on Human Factors in Computing Systems - Proceedings.
    Lee, K., & Nass, C. (2003). Designing Social Presence of Social Actors in Human Computer Interaction. Computer Human Interaction 2003 (Vol. 5).
    McTear, M. (2020). Conversational AI: Dialogue Systems, Conversational Agents, and Chatbots. Synthesis Lectures on Human Language Technologies, 13, 1–251.
    Mooshammer, S., Etzrodt, K., & Weidmüller, L. (2025). Trust in gendered voice assistants—the special case of gender ambiguity. Publizistik.
    Mullennix, J., Stern, S., Wilson, S., & Dyson, C.-L. (2003). Social perception of male and female computer synthesized speech. Computers in Human Behavior, 19, 407–424.
    Nass, C., & Moon, Y. (2000). Machines and Mindlessness: Social Responses to Computers. Journal of Social Issues, 56, 81–103.
    Reeves, B., & Nass, C. (1996). The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places. Cambridge University Press.
    Sanjeewa, R., Iyer, R., Apputhurai, P., Wickramasinghe, N., & Meyer, D. (2024). Perception of Empathy in Mental Healthcare: An Experimental Study Using Voice-Based Conversational Agent Prototypes. JMIR Formative Research, 9.
    Scherr, S., Cao, B., Jiang, C., & Kobayashi, T. (2025). Explaining the Use of AI Chatbots as Context Alignment: Motivations Behind the Use of AI Chatbots Across Contexts and Culture. Computers in Human Behavior, 172, 108738.
    Steeds, M., Claudy, M., Cowan, B., & Suri, A. (2025). Do voice agents affect people’s gender stereotypes? Quantitative investigation of stereotype spillover effects from interacting with gendered voice agents. International Journal of Human-Computer Studies, 207, 103683.
    Sun, X., Shen, T., Jiang, Q., & Jiang, B. (2025). Research on the Impact of an AI Voice Assistant’s Gender and Self-Disclosure Strategies on User Self-Disclosure in Chinese Postpartum Follow-Up Phone Calls. Behavioral Sciences, 15, 184.
    Sutton, S. (2020). Gender Ambiguous, not Genderless: Designing Gender in Voice User Interfaces (VUIs) with Sensitivity.
    Tymburiba Elian, M., Masuko, S., & Yamanaka, T. (2022). Who are you talking to? Considerations on Designing Gender Ambiguous Voice User Interfaces.
    Yeon, J., Park, Y., & Kim, D. (2023). Is Gender-Neutral AI the Correct Solution to Gender Bias? Using Speech-Based Conversational Agents. Archives of Design Research, 36, 63–91.
    Zhang, Q., Yang, X. J., & Robert, L. (2025). Artificial intelligence voice gender, gender role congruity, and trust in automated vehicles. Scientific Reports, 15.
