| Graduate student: | 黃奕澤 Huang, Yi-Che |
|---|---|
| Thesis title: | 探討不同互動方式應用於防疫聊天機器人對用戶持續使用意圖之影響 Investigating the effect of different interaction methods on users' continuance intention toward epidemic prevention chatbots |
| Advisor: | 侯建任 Hou, Jian-Ren |
| Degree: | Master |
| Department: | Institute of Information Management, College of Management |
| Year of publication: | 2023 |
| Academic year: | 111 |
| Language: | Chinese |
| Pages: | 67 |
| Keywords (Chinese): | 防疫聊天機器人、社會反應理論、回覆方式、對話風格、虛擬形象 |
| Keywords (English): | Epidemic prevention chatbots, social response theory, response mode, conversational style, avatar |
Given the severity of the COVID-19 pandemic, people commonly look for information about the outbreak on social media and in newspapers and magazines. However, social media carries a large amount of misinformation, which poses a major challenge to the response efforts of disease control agencies. At the same time, 疾管家, the official chatbot of the Taiwan Centers for Disease Control, has had very limited impact since its launch in 2017, despite attracting ten million followers. Effectively improving how users interact with this chatbot would greatly strengthen the effectiveness of epidemic prevention chatbots.
This study therefore aims to improve users' willingness to use the chatbot through different interaction designs. The experiment comprises eight scenarios formed by crossing the chatbot's response mode (text-based or button-based replies), avatar (social or professional), and conversational style (social-oriented or task-oriented). By analyzing the interaction effects among these design factors, the study seeks to identify the interaction design best suited to users. The questionnaire includes items on perceived usefulness, perceived ease of use, social presence, and perceived persuasiveness, and a structural equation model is used to evaluate the effects of these constructs.
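To make the factorial design concrete, the following is a minimal Python sketch (not taken from the thesis) that enumerates the eight scenarios by fully crossing the three two-level factors; the factor and level names follow the abstract, while everything else is illustrative.

```python
# Minimal sketch: the eight experimental scenarios arise from fully crossing
# three two-level factors (a 2 x 2 x 2 between-subjects design).
from itertools import product

factors = {
    "response_mode": ["text", "button"],
    "avatar": ["social", "professional"],
    "conversational_style": ["social-oriented", "task-oriented"],
}

scenarios = [dict(zip(factors, levels)) for levels in product(*factors.values())]

for i, scenario in enumerate(scenarios, start=1):
    print(f"Scenario {i}: {scenario}")

assert len(scenarios) == 8  # 2 x 2 x 2 = 8 experimental conditions
```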
The results show that button-based replies and a professional avatar effectively increase users' perceived usefulness and perceived ease of use of the chatbot. Regarding conversational style, a social-oriented style produces a stronger sense of social presence. The study also finds an interaction effect between conversational style and avatar: when the two are congruent, users' perceived usefulness and social presence improve further. In addition, the survey results show that perceived usefulness, perceived ease of use, and social presence positively influence users' perceived persuasiveness, satisfaction, and continuance intention.
Amid the COVID-19 pandemic, the prevalence of misinformation on social media has posed challenges for disease control agencies. Despite its large following, the official epidemic prevention chatbot has been of limited effectiveness. Improving users' interaction experience with this chatbot is therefore crucial to enhancing its effectiveness in epidemic prevention.
Therefore, this study designs eight experimental scenarios by crossing the chatbot's response mode (text-based or button-based), avatar (social or professional), and conversational style (social-oriented or task-oriented). By analyzing the interaction effects among these design factors, the study seeks to identify the most suitable design for users. The survey includes items related to perceived usefulness, perceived ease of use, social presence, and perceived persuasiveness, and a structural equation model is used to evaluate the effects of these constructs.
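As a rough illustration of the structural-equation-model evaluation described above, the sketch below specifies the six constructs in lavaan-style syntax using the semopy library; the indicator names (pu1, sp2, ...) and the exact paths are assumptions made for illustration, not the thesis's actual measurement or structural model.

```python
# Hedged sketch of an SEM specification via semopy.
# Indicator names and the path structure are illustrative assumptions only.
import pandas as pd
import semopy

MODEL_DESC = """
# measurement model: each latent construct is measured by questionnaire items
# (PU = perceived usefulness, PEOU = perceived ease of use, SP = social presence,
#  PERS = perceived persuasiveness, SAT = satisfaction, CI = continuance intention)
PU =~ pu1 + pu2 + pu3
PEOU =~ peou1 + peou2 + peou3
SP =~ sp1 + sp2 + sp3
PERS =~ pers1 + pers2 + pers3
SAT =~ sat1 + sat2 + sat3
CI =~ ci1 + ci2 + ci3

# structural model (assumed paths among the constructs)
PERS ~ PU + PEOU + SP
SAT ~ PU + PEOU + SP
CI ~ PERS + SAT
"""

def fit_sem(survey: pd.DataFrame) -> pd.DataFrame:
    """Fit the model to survey data whose columns match the indicator names."""
    model = semopy.Model(MODEL_DESC)
    model.fit(survey)
    return model.inspect()  # parameter estimates with standard errors and p-values
```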
The findings of the study demonstrate that the use of button-based responses and a professional avatar effectively enhances the user's perceived usefulness and perceived ease of use of the chatbot. Additionally, employing a social-oriented conversational style contributes to a stronger sense of social presence. Furthermore, the study reveals an interaction effect between conversational style and avatar, indicating that aligning these factors improves the user's perceived usefulness and social presence. Moreover, the survey results further highlight the positive impact of perceived usefulness, perceived ease of use, and social presence on the user's perceived persuasiveness, satisfaction, and intention to continue using the chatbot.
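The reported conversational-style × avatar interaction could be probed with a factorial analysis such as the two-way ANOVA sketched below with statsmodels; the column names and the choice of test are assumptions for illustration, not necessarily the analysis used in the thesis.

```python
# Hedged sketch: two-way ANOVA for the conversational-style x avatar interaction
# on a perceived-usefulness score. Column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

def interaction_anova(df: pd.DataFrame) -> pd.DataFrame:
    """Expects columns: perceived_usefulness (numeric), style and avatar (categorical)."""
    model = ols("perceived_usefulness ~ C(style) * C(avatar)", data=df).fit()
    return sm.stats.anova_lm(model, typ=2)  # Type II sums of squares
```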
On-campus access: full text available from 2028-07-10.