| Author: | 王旭東 Wang, Hsu-tung |
|---|---|
| Title: | 可感知人機介面手勢模型之建構 (Modeling Situated Gestures for Perceptual Interfaces) |
| Advisor: | 鄭泰昇 Jeng, Tay-sheng |
| Degree: | Master |
| Department: | College of Planning and Design, Department of Architecture |
| Year of publication: | 2009 |
| Graduation academic year: | 97 (2008-2009) |
| Language: | English |
| Pages: | 94 |
| Chinese keywords: | gesture (手勢), human-computer interface (人機介面), natural interaction (自然互動) |
| English keywords: | contextualization, perceptual interface, gestural interaction |
In human-computer interaction design, gestures serve as a natural and intuitive channel of communication: in this new mode of interaction, the computer detects a person's body movements, interprets their meaning, and gives an appropriate response. Unlike conventionalized sign-language systems, however, gestures are ambiguous, since they are interpreted differently in different contexts. This study constructs a model for describing and interpreting gesture events in specific situations, and applies ethnographic research methods to analyze how three contextual factors (cognitive space, physical space, and social space) affect the interpretation of five categories of gestures in a particular setting, in order to validate the model's applicability. The field site was the undergraduate design review of the Department of Architecture at Tunghai University. From digital video recordings, 203 gesture samples were extracted, classified, and compared against the three contextual factors. The results show that cognitive space has the most significant influence on gestures, followed by social space, while the influence of physical space is limited to the difference between drawings and physical models as design media. The study summarizes how the interaction of these factors produces interpretations of gesture meaning; the findings revise the contextual content of the theoretical model and establish a database that serves as a knowledge base for describing gesture events.
A perceptual interface captures the user's natural behavior and acts appropriately on the message it conveys. Compared with well-established sign languages, the meanings of gestures are ambiguous and situation-dependent; their interpretation relies on external information. We propose a model of situated gestures whose meanings result from the association of a context with three dimensions: cognitive, social, and physical. The feasibility of the model was tested at a real site using the method of cognitive ethnographic research, with an occasion of design presentation and critique selected as the field site. A total of 203 gestural segments were selected from digital video data. We outlined the contextual factors and examined their impact on determining five types of gesture patterns. Cognitive space had a direct influence on all gesture types, while the social network and physical space determined merely the use of deictic gestures. The results were used to refine and revise the conceptual framework for processing the meanings of gestures. The reliability of the perceptual interface system rests on a commonsense database of contextual information, covering the project theme, dialogue modes, physical environment, social network, and individual actors. Future work builds on the accumulation of prior knowledge of gesture interpretation.