
Graduate Student: 鄭喬瑋 (Cheng, Chiao-Wei)
Thesis Title: 機械手臂情緒表達系統 (Emotion Expressing System of Robotic Arm)
Advisor: 侯廷偉 (Hou, Ting-Wei)
Degree: Master
Department: College of Engineering - Department of Engineering Science
Year of Publication: 2018
Graduation Academic Year: 106 (2017-2018)
Language: English
Number of Pages: 24
Keywords (Chinese): 機械手臂情緒表達 (robotic arm emotion expression)
Keywords (English): Robotic Arm, Emotion Expressing
  • Abstract (translated from Chinese): RoboEmotion is a system that enables a robotic arm to express human emotions. In this study, six basic human emotions, happiness, sadness, fear, anger, surprise, and disgust, were designed and implemented on a six degrees of freedom (6-DoF) robotic arm. In a first experiment with fifty participants, users who were not told the mapping between arm movements and emotions distinguished the emotions with 75% accuracy (SD = 0.2). In a second experiment, participants were told the correct movement-emotion pairings and then rated each pairing on a 7-point Likert scale, giving an average score of 5.77 (SD = 1.31). We published RoboEmotion on PyPI (the Python Package Index); it comprises a low-level communication module, a structured arm movement definition format, and a high-level application programming interface for use by researchers and interaction designers. In this study, RoboEmotion was applied to both industrial and non-industrial arms, which validated the portability and extensibility of the package and also addressed the respective application scenarios of industrial and non-industrial arms.
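
    The abstract mentions a structured arm movement definition format. As a minimal illustrative sketch only, such a definition could be expressed as keyframes of joint angles and serialized to JSON; every field name and value below is a hypothetical assumption, not the schema defined in the thesis:

        # Hypothetical movement definition in the spirit of RoboEmotion's
        # structured format; all field names are invented for illustration.
        import json

        happy_movement = {
            "emotion": "happy",
            "keyframes": [
                # target angles for the six joints (degrees) plus a hold time
                {"joint_angles_deg": [0, 45, -30, 0, 60, 0], "duration_ms": 400},
                {"joint_angles_deg": [0, 60, -45, 0, 30, 0], "duration_ms": 400},
            ],
            "repeat": 2,
        }

        # One plausible on-disk encoding of such a definition is JSON.
        print(json.dumps(happy_movement, indent=2))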

    Abstract: RoboEmotion, a system that enables robotic arms to express human feelings, was designed and implemented in this study. Six basic emotions, Happy, Sad, Fear, Anger, Surprise, and Disgust, are proposed for the six degree-of-freedom robotic arm. In the first phase (N=50), we investigated the recognition rate of the emotion vocabulary. In the second phase (N=78), an evaluation of the level of appropriateness was performed using a 7-point Likert scale. The results showed that participants reached 75% accuracy (SD=0.2) in distinguishing among the six basic human emotions, and they rated the emotion vocabulary 5.77 (SD=1.31) on average. In addition, RoboEmotion was implemented as an extensible and modularized Python package, applied to and demonstrated with both a self-assembled robotic arm and an industrial one. The package is open source and published on the Python Package Index.
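
    As a rough sketch of how the high-level API of such a package might be driven, the self-contained stub below plays back named emotion movements; the class, method, and parameter names are assumptions for illustration, not the package's documented interface:

        # Minimal self-contained sketch of a RoboEmotion-style high-level API;
        # all names here are illustrative assumptions, not the published API.
        import time

        class EmotionArm:
            """Plays back named emotion movements as sequences of joint poses."""

            def __init__(self, movements):
                self.movements = movements  # emotion name -> list of keyframes

            def express(self, emotion):
                for angles_deg, duration_ms in self.movements[emotion]:
                    # A real implementation would stream the six joint angles
                    # to the arm through the low-level communication module.
                    print(f"{emotion}: move joints to {angles_deg}")
                    time.sleep(duration_ms / 1000)

        arm = EmotionArm({
            "happy": [([0, 45, -30, 0, 60, 0], 400), ([0, 60, -45, 0, 30, 0], 400)],
        })
        arm.express("happy")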

    Table of Contents:
    Chinese Abstract  i
    Abstract  ii
    Acknowledgements  iii
    List of Tables  vi
    List of Figures  vii
    1 Introduction  1
      1.1 Motivation  1
      1.2 Thesis Organization  2
    2 Related Work  3
      2.1 6-DoF Robotic Arm Manipulation and Control System  3
      2.2 Affective Computing in Robotics  3
    3 Design: Emotion Vocabulary  5
      3.1 Emotion Attributes  5
      3.2 Design Guidelines  5
        Aggressiveness  6
        Arousal  6
      3.3 RoboEmotion Arm Movements  6
    4 User Study: Evaluate the RoboEmotion Design  9
      4.1 Phase 1: Emotion Recognition  9
        Participants  9
        Procedure  9
      4.2 Phase 2: Appropriateness Ratings  10
        Participants  10
        Procedure  11
      4.3 Result and Discussion  11
        4.3.1 Recognition Rate  11
        4.3.2 Rating of Appropriateness  12
        4.3.3 Qualitative Feedback  12
    5 System Implementation  14
      5.1 Low-Level Communication Module  14
      5.2 Structural Representation of Arm Movements  15
      5.3 High-Level API  18
      5.4 Industrial Robotic Arm Implementation  18
    6 Conclusion and Future Work  20
      6.1 Conclusion  20
      6.2 Future Work  20
        6.2.1 Application Scenarios  20
          Psychotherapy  21
          Industrial Robotic Arms  21
        6.2.2 Multi-arm Expression  21
        6.2.3 Real-world Evaluation Study  22


    Full text available on campus: 2020-08-13
    Full text available off campus: 2020-08-13