
Author: Hsiao, Rou (蕭柔)
Thesis Title: Noopy as a partner or a tool? Control, trust, and emotions in human-robot interaction
Advisor: Chien, Wei-Chi (簡瑋麒)
Degree: Master
Department: Department of Industrial Design, College of Planning and Design
Year of Publication: 2021
Academic Year of Graduation: 109
Language: English
Pages: 85
Keywords: human-computer interaction, human-robot interaction, experience design, human trust in robot, desirability of control, social robot
    Nowadays, technology has become increasingly mature and ever closer to our daily lives. We often rely on machines and automated technologies to accomplish everyday tasks or to add enjoyment to life, for example using robot vacuums to clean the floor or playing with entertainment robots such as AIBO, the robotic pet dog developed by Sony in Japan. Situations in which we live or cooperate with robots are called "human-robot interaction". Good human-robot cooperation can have a positive influence on human life. We found that in human-robot cooperation, individual personality traits (such as the desirability of control) may affect people's trust in machines as well as their well-being (their emotional or psychological state).
    In this thesis, we designed an experimental framework to explore the relationships between humans' trust in robots (TR: Trust in Robots), desirability of control (DC: Desirability of Control), and emotional arousal (EA: Emotional Arousal) in a human-robot cooperative task. We also conducted a quantitative analysis of participants' control patterns (CP: Control Pattern). The framework comprises an introduction phase, a testing phase, and an analysis phase. From the TR, EA, and CP values obtained in the tests, we observed participants' emotional changes during the experiment and inferred how, for people with different desirabilities of control and under different control conditions, trust in robots relates to their emotions in human-robot interaction. We present the experimental results and an analysis of the preliminary findings, which provide a reliable foundation for further research on human trust in robots in human-robot interaction.
    We then applied the human-factors experiment described above to the design of a social robot, Noopy, and conducted a field observation in our laboratory with fellow students as users. Finally, drawing on the field observation and qualitative interviews, we discuss findings concerning people's control of social robots, anthropomorphism, emotional projection, and trust, and derive reliable implications for social robot design.

    Nowadays, technology has become increasingly mature and close to our lives. We use automation to accomplish daily work or to bring joy to life. For example, we use sweeping robots to help us clean the floor, or we play games with entertainment robots (like Sony's AIBO). In recent studies of human-robot interaction (HRI), researchers have shifted their focus from the usability of the objects to the positive experiences of the human subjects. While usability is a concept related to users' behavior, such as "usage" or "control", the focus on positive experience concerns humans' motivation to "cooperate" or "interact". From the experience design perspective, we are interested in whether one would like to control or to cooperate with a robot, and we found that this experience can differ from person to person. In this thesis, we construct a research framework and explore the correlations between humans' trust in robots (TR), humans' desirability of control (DC), and their emotional quality (EQ) in a human-robot cooperative task. Qualitative analyses were also conducted to identify participants' different control patterns (CP). By evaluating the resulting TR and DC, we infer that people with high DC trust robots more in the cooperative task, and that having more control increases humans' TR and EQ.
    In the second part of this project, inspired by the experimental study and the author's life experience, we designed Noopy, an interactive robot that has its own behavior and emotions and carries snacks to colleagues in the laboratory. Through field observations and interviews, we surveyed users' control patterns, emotional projection, and trust in robots. Finally, we found that direct control, anthropomorphism, and action feedback foster a positive social relationship between humans and robots, which can be applied in social robot design.
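The correlation analysis described above (relating DC, TR, and EQ questionnaire scores) can be illustrated with a minimal sketch. This is a hypothetical example, not the thesis's actual analysis code or data: the participant scores below are invented placeholders, and the `pearson` helper is assumed rather than taken from the study.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Placeholder questionnaire means for five hypothetical participants.
dc_scores = [3.1, 4.2, 3.8, 2.9, 4.5]   # Desirability of Control (DC scale)
tr_scores = [4.0, 5.1, 4.6, 3.7, 5.3]   # Trust in Robots (TPA scale)

r = pearson(dc_scores, tr_scores)
print(f"r(DC, TR) = {r:.3f}")
# A positive r would be consistent with the thesis's inference that
# high-DC participants report higher trust in the cooperative task.
```

In practice such an analysis would use an established statistics library (e.g. `scipy.stats.pearsonr`, which also reports a p-value), but the hand-rolled version makes the computation explicit.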

    Acknowledgments (致謝) i
    Abstract (摘要) ii
    SUMMARY iii
    TABLE OF CONTENTS iv
    LIST OF TABLES vi
    LIST OF FIGURES vii
    LIST OF SYMBOLS AND ABBREVIATIONS ix
    CHAPTER 1 INTRODUCTION 1
      1.1 What is Human-Robot Interaction (HRI)? 1
      1.2 Research Motivation 2
    CHAPTER 2 RELATED STUDIES AND PREVIOUS WORKS OF HUMANS' TRUST IN ROBOTS 4
      2.1 Humans' Trust in Robots 4
      2.2 Robots' Appearance Effects 5
      2.3 Robots' Behavior 6
        2.3.1 Emotional Arousal 6
        2.3.2 Robots' Imperfection 7
      2.4 Dynamics of Trust in Human-Robot Interaction 8
      2.5 Control 8
        2.5.1 Humans' Desirability of Control 8
        2.5.2 Human-Robot Relationships 9
    CHAPTER 3 A STUDY ABOUT HUMANS' DESIRABILITY OF CONTROL, TRUST, AND ROBOT'S PERFORMANCE 11
      3.1 The Task and the Robot System 11
      3.2 Measurement Tools 13
      3.3 Experiment Process 15
      3.4 Results and Discussions 17
        3.4.1 Task Performance Analysis 17
        3.4.2 Evaluation of TR and DC 19
        3.4.3 Evaluation of EQ, DC, and TR 22
        3.4.4 "Happiness" of the Robot 24
        3.4.5 Conclusions 24
    CHAPTER 4 APPLICATION AND DESIGN 27
      4.1 Prior Design Ideation 27
        4.1.1 Interview about the Food-Interaction in the Lab 27
        4.1.2 Shaping the Experience 28
        4.1.3 Noopy 32
      4.2 Robot's Appearance 34
      4.3 Prototyping 36
      4.4 Internal Structure and Robot System 44
      4.5 Noopy in the Field 49
        4.5.1 More Familiar, More Interactions and More Adorable 49
        4.5.2 Control Patterns with Direct Physical Contact 50
        4.5.3 Noopy is Likely a Pet in the Lab 51
        4.5.4 The Medium of Humans' Interaction 53
      4.6 Limitations 53
    CHAPTER 5 CONCLUSIONS 55
      5.1 Discussion 55
      5.2 Future Works 57
    REFERENCES 58
    Appendix A THE DESIRABILITY OF CONTROL SCALE (DC SCALE) 64
      A.1 The Original Version of the DC Scale 64
      A.2 The Traditional Chinese Translation of the DC Scale (on Google Form) 66
    Appendix B THE SCALES OF TRUST BETWEEN PEOPLE AND AUTOMATION 68
      B.1 The Original Version of the TPA Scale 68
      B.2 The Traditional Chinese Translation of the TPA Scale 69
    Appendix C THE RECORD FORM OF THE OBSERVATION TASK (THE ORIGINAL TRADITIONAL CHINESE VERSION) 71
    Appendix D THE CIRCUIT DIAGRAM OF THE ROBOT 72
    Appendix E AM I CONQUERING THE ROBOT? THE IMPACT OF PERSONALITY ON THE STYLE OF COOPERATION WITH AN AUTOMATIC SYSTEM 73


    Full-text availability: on campus, immediately available; off campus, immediately available.