| Author: | 黃苡之 (Huang, Yi-Chih) |
|---|---|
| Thesis Title: | 二維結合三維用於地圖操作之手勢設計 (Integrating Surface and Motion Gesture Design on Map Operation) |
| Advisors: | 吳豐光 (Wu, Fong-Gong), 賴新喜 (Lai, Hsin-Hsi) |
| Degree: | Master |
| Department: | Department of Industrial Design, College of Planning and Design |
| Publication Year: | 2016 |
| Academic Year: | 104 |
| Language: | English |
| Pages: | 92 |
| Chinese Keywords: | 手勢設計 (gesture design), 動作手勢 (motion gesture) |
| Keywords: | gesture design, motion gesture, free-form gesture |
With innovations in human-computer interaction technology, many new input modes have emerged. Among them, gesture control is the most widespread, typically applied on touch panels such as mobile phones and tablets. Beyond touch screens, new gesture-operated devices have also appeared, such as interactive projectors, which are not limited by the physical size of a touch panel and can be operated with the fingers on a projected image of any size.

Gesture control falls into two types: surface gestures, performed by touching the device directly, and motion gestures, performed in three-dimensional space. Surface gestures are already mature, while motion gestures are still under development. This study explores gestures through Google Map operation. In the first stage of the experiment, candidate gestures were generated and the usable ones were selected; in the second stage, practical operation tests were conducted with those usable gestures. Finally, design principles for motion gestures are summarized.
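As a purely illustrative sketch, the surface/motion taxonomy described above can be modeled as a gesture-to-map-operation mapping. The gesture names and operation assignments below are hypothetical examples, not the gesture set derived in the study's experiments:

```python
from dataclasses import dataclass
from enum import Enum


class GestureType(Enum):
    """The two gesture classes distinguished in the abstract."""
    SURFACE = "surface"  # performed in direct contact with the device
    MOTION = "motion"    # performed freely in 3D space


@dataclass
class Gesture:
    name: str
    gtype: GestureType
    map_operation: str  # the map command the gesture would trigger


# Hypothetical gesture set; the thesis derives its actual set
# from user elicitation and operation testing.
GESTURE_SET = [
    Gesture("two-finger pinch", GestureType.SURFACE, "zoom out"),
    Gesture("two-finger spread", GestureType.SURFACE, "zoom in"),
    Gesture("one-finger drag", GestureType.SURFACE, "pan"),
    Gesture("palm push toward screen", GestureType.MOTION, "zoom in"),
    Gesture("hand swipe in mid-air", GestureType.MOTION, "pan"),
]


def gestures_for(gtype: GestureType) -> list[str]:
    """Return the names of all gestures of a given type."""
    return [g.name for g in GESTURE_SET if g.gtype is gtype]
```

Such a structure makes the two-stage procedure easy to express: stage one populates the gesture set, and stage two evaluates each entry against its assigned map operation.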