| Field | Value |
|---|---|
| Student | 鄭景文 (Cheng, Ching-Wen) |
| Thesis Title | 基於二維情緒空間圖之機器人臉部情緒表達 (2D Emotion Space based Robot Facial Expression) |
| Advisor | 李祖聖 (Li, Tzuu-Hseng S.) |
| Degree | Master |
| Department | Department of Electrical Engineering, College of Electrical Engineering and Computer Science (電機資訊學院 電機工程學系) |
| Year of Publication | 2017 |
| Academic Year of Graduation | 105 (2016-2017) |
| Language | English |
| Number of Pages | 57 |
| Keywords (Chinese) | 機器人情緒表達 (robot emotion expression), 粒子群最佳化演算法 (particle swarm optimization), 非監督式類神經網路 (unsupervised neural network) |
| Keywords (English) | Robot facial emotion expressions, Particle Swarm Optimization (PSO), unsupervised neural network |
How a robot expresses its own emotions while communicating with people is an important issue. This thesis proposes a method for robot facial emotion expression based on a two-dimensional emotion space map: the Particle Swarm Optimization (PSO) algorithm generates the robot's basic emotions, and an unsupervised neural network then builds the two-dimensional emotion space to synthesize further emotions. An anthropomorphic facial model of the robot is built with the 3D modeling and rendering software Blender and serves as the robot's communication interface with people. The PSO algorithm learns the facial feature parameters of six basic emotions, namely happiness, anger, surprise, fear, sadness, and neutral, so that the robot can express its emotions through this facial model. Beyond these six basic emotions, human emotional expression is usually richer and more complex. Therefore, this thesis uses an unsupervised neural network to map the six basic emotions onto a two-dimensional plane, constructs a mesh space from these six emotion points, and feeds the mesh points back into the original neural network to obtain different facial expression features, thereby synthesizing more related, blended emotions and enriching the robot's emotional expression. Moreover, since the emotion expressed at each point of this two-dimensional space is related to that of its neighboring points, moving within the plane produces continuous changes in the robot's emotion. The experimental results show that the robot can recognize the facial expressions of the person it interacts with and imitate the changes in those expressions, and that during interaction it can recognize the person's expressions and speech and respond appropriately. The results demonstrate that the proposed robot facial expression system gives the robot the ability to express emotions and greatly improves its ability to interact with people.
Expressing a suitable emotion greatly affects the efficiency of communication. Therefore, how a robot expresses its emotion while interacting with humans has become an important research issue. This thesis proposes a robot facial emotion expression system based on an anthropomorphic facial model built with the 3D computer graphics software Blender. We define parameters for the facial structure, such as the position of the eyebrows, the openness of the mouth, and the size of the eyes. The system learns the facial parameters of six basic emotions, namely happiness, fear, anger, surprise, sadness, and neutral, with the Particle Swarm Optimization (PSO) algorithm. To enrich the facial emotion expression, the system then generates synthesized emotions with an unsupervised neural network, which maps the six basic emotions onto a two-dimensional plane and constructs a mesh space. Adjacent points in this space represent related emotions, so emotion transitions can be represented as shifts on the plane. Finally, two experiments are conducted. In the first experiment, the robot detects a person's facial emotions and imitates them. In the second experiment, the robot changes its facial emotion expression in real time while communicating with a person. Both experiments demonstrate that the robot can express rich and suitable facial emotions with the proposed system.
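The abstract states that the six basic expressions are obtained by learning the facial parameters of the Blender model with PSO. The thesis body is not reproduced on this page, so the following is only a minimal, generic PSO sketch in Python: the six-dimensional parameter vector, the bounds, and the fitness function (distance to a hypothetical reference parameter vector) are assumptions standing in for the thesis's evaluation of the rendered face.

```python
# Minimal Particle Swarm Optimization sketch for tuning a facial-parameter vector.
# The fitness function is a placeholder: it rewards closeness to an assumed target
# vector, standing in for the thesis's scoring of the rendered Blender expression.
import numpy as np

DIM = 6            # e.g., eyebrow height, eyebrow tilt, eye size, mouth openness, ...
N_PARTICLES = 30
N_ITER = 200
W, C1, C2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients (common defaults)

rng = np.random.default_rng(0)
target = rng.uniform(0.0, 1.0, DIM)  # hypothetical "happiness" reference parameters

def fitness(x):
    """Lower is better: distance to the reference expression (placeholder)."""
    return np.linalg.norm(x - target)

pos = rng.uniform(0.0, 1.0, (N_PARTICLES, DIM))
vel = np.zeros((N_PARTICLES, DIM))
pbest = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(N_ITER):
    r1, r2 = rng.random((N_PARTICLES, DIM)), rng.random((N_PARTICLES, DIM))
    vel = W * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1.0)   # keep parameters inside their valid range
    vals = np.array([fitness(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("learned facial parameters:", np.round(gbest, 3))
```

In the actual system, the fitness would presumably score how closely the rendered face matches a reference expression rather than a fixed target vector; the loop structure itself would be unchanged.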
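The abstract also describes mapping the six basic emotions onto a two-dimensional plane with an unsupervised neural network, synthesizing blended expressions from a mesh of intermediate points, and realizing emotion transitions as movement on the plane. As a rough illustration only, the sketch below swaps the unsupervised network for simple inverse-distance-weighted blending; the 2D coordinates and per-emotion parameter vectors are invented placeholders, not values from the thesis.

```python
# Sketch of synthesizing blended expressions on a 2D emotion plane.
# Inverse-distance weighting stands in for the thesis's unsupervised neural network
# that turns a 2D point back into facial parameters.
import numpy as np

# Hypothetical facial-parameter vectors for the six basic emotions
# (columns: eyebrow height, eyebrow tilt, eye size, mouth openness, mouth curve, jaw).
basic_params = {
    "happiness": np.array([0.6, 0.5, 0.6, 0.5, 0.9, 0.3]),
    "surprise":  np.array([0.9, 0.5, 0.9, 0.8, 0.5, 0.6]),
    "fear":      np.array([0.8, 0.3, 0.8, 0.4, 0.3, 0.4]),
    "anger":     np.array([0.2, 0.1, 0.7, 0.3, 0.2, 0.4]),
    "sadness":   np.array([0.3, 0.8, 0.4, 0.2, 0.1, 0.2]),
    "neutral":   np.array([0.5, 0.5, 0.5, 0.3, 0.5, 0.2]),
}
# Assumed 2D coordinates of the basic emotions (placeholder layout, roughly circumplex-like).
coords = {
    "happiness": ( 0.8,  0.4), "surprise": ( 0.3,  0.9), "fear":    (-0.6,  0.7),
    "anger":     (-0.7,  0.5), "sadness":  (-0.7, -0.5), "neutral": ( 0.0,  0.0),
}

def blend(point, power=2.0):
    """Facial parameters at a 2D point: inverse-distance-weighted mix of the basics."""
    point = np.asarray(point, dtype=float)
    weights, params = [], []
    for name, xy in coords.items():
        d = np.linalg.norm(point - np.asarray(xy))
        if d < 1e-9:                      # exactly on a basic emotion point
            return basic_params[name]
        weights.append(1.0 / d ** power)
        params.append(basic_params[name])
    w = np.array(weights) / np.sum(weights)
    return w @ np.vstack(params)

# Continuous emotion transition: walk along a straight path from sadness to happiness.
for t in np.linspace(0.0, 1.0, 5):
    p = (1 - t) * np.array(coords["sadness"]) + t * np.array(coords["happiness"])
    print(f"t={t:.2f} -> facial parameters {np.round(blend(p), 2)}")
```

Because every point's output is a mixture dominated by its nearest basic emotions, nearby points yield similar faces, and sliding a point along a path gives the continuous emotion change described in the abstract.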