
Author: Huang, Yi-Chih (黃苡之)
Title: Integrating Surface and Motion Gesture Design on Map Operation (二維結合三維用於地圖操作之手勢設計)
Advisors: Wu, Fong-Gong (吳豐光); Lai, Hsin-Hsi (賴新喜)
Degree: Master
Department: Department of Industrial Design, College of Planning and Design
Year of publication: 2016
Graduation academic year: 104 (2015–2016)
Language: English
Number of pages: 92
Chinese keywords: 手勢設計 (gesture design), 動作手勢 (motion gesture)
Keywords: gesture design, motion gesture, free-form gesture
  • With the innovation of human-computer interaction devices, input has taken many new forms, among which gesture control is the most widespread; it is typically applied on touch panels such as mobile phones and tablets. Beyond the touch screen, new gesture-operated devices have also emerged, such as the interactive projector, which is not limited by the hardware size of a touch panel and lets users operate with their fingers on a projected image of any size.
    Gesture control falls into two types: surface gestures, performed by touching the device directly, and motion gestures, performed in three-dimensional space. Surface gestures are already mature, whereas motion gestures are still under development. This study explores gestures through Google Maps operation. In the first stage of the experiment, candidate gestures were generated and the usable ones selected; in the second stage, the selected gestures were tested in practical operation tasks. Finally, design principles for motion gestures are summarized.
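    In gesture elicitation studies of this kind, the first-stage selection of usable gestures is commonly ranked with the agreement score of Wobbrock et al. (2009), which appears in the reference list below. The following is an illustrative sketch only; the gesture labels and proposal data are hypothetical, not taken from the thesis:

    ```python
    # Agreement score per Wobbrock et al. (2009), "User-defined gestures
    # for surface computing": for one referent (target map function),
    # A = sum over groups of identical proposals of (group size / total)^2.
    # Higher A means participants converged on the same gesture.
    from collections import Counter

    def agreement(proposals):
        """proposals: list of gesture labels suggested for one referent."""
        n = len(proposals)
        return sum((count / n) ** 2 for count in Counter(proposals).values())

    # Hypothetical elicitation data for two Google Maps functions:
    zoom_in = ["pinch-out", "pinch-out", "pinch-out", "double-tap"]
    rotate = ["two-finger-twist", "wave", "circle", "two-finger-twist"]

    print(agreement(zoom_in))  # 0.625 -> strong consensus
    print(agreement(rotate))   # 0.375 -> weaker consensus
    ```

    A referent whose score falls near 1/n (every participant proposed a different gesture) would be a candidate for redesign rather than direct adoption.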

    摘要 (Chinese abstract), SUMMARY, ACKNOWLEDGEMENTS, TABLE OF CONTENTS, LIST OF TABLES, LIST OF FIGURES
    CHAPTER 1 INTRODUCTION
      1.1 Background
        1.1.1 Definition and type of gesture control
        1.1.2 Evolution of gesture control devices
        1.1.3 Design of gestures
      1.2 Purpose
      1.3 Scope
      1.4 Limitations
    CHAPTER 2 LITERATURE REVIEW
      2.1 Advantages and limitations of interface gestures
      2.2 Characteristics of gestures
        2.2.1 Surface gesture operation
        2.2.2 Motion gesture operation
      2.3 Gesture design
        2.3.1 Gesture design principles
        2.3.2 Gesture interface design
        2.3.3 Design of gestures
        2.3.4 Comfort of gestures
        2.3.5 How the environment of use affects gesture performance
    CHAPTER 3 METHODS
      3.1 Experimental purpose and architecture
      3.2 Experiment 1
        3.2.1 Collecting and selecting the functions and operations
        3.2.2 Creating motion gestures
        3.2.3 Selecting new gestures
      3.3 Experiment 2
        3.3.1 Participants
        3.3.2 Environment
        3.3.3 Procedures
        3.3.4 Setting of experimental items
    CHAPTER 4 RESULTS
      4.1 Subjective ratings of understanding degree, proficiency level, and frequency of use for the target functions
      4.2 Gesture types
        4.2.1 Subjective questionnaire results for the various gestures
      4.3 Practical operation test
        4.3.1 Results of the confusion test
        4.3.2 Single-gesture error rate in practical operation
        4.3.3 Single-gesture response time
        4.3.4 Gesture selection preference in the multi-gesture test
        4.3.5 Comparison of error rates in the single-gesture confusion test and in practical operation
        4.3.6 Comparison of single-gesture and multi-gesture error rates
        4.3.7 Gesture confusion in the practical operation test
      4.4 Analysis of comprehensive results
        4.4.1 Comparison of task familiarity and task error rate
        4.4.2 Classification of gesture motions
    CHAPTER 5 DISCUSSION
      5.1 Users' understanding degree, proficiency level, and frequency of use for the operation tasks
      5.2 Discussion of the score factors in the subjective questionnaire for the various gestures
      5.3 Comparison of error rates in the single-gesture confusion test and in practical operation
      5.4 Discussion of single-gesture response time
      5.5 Comparison of error rates in multi-gesture and single-gesture tasks
      5.6 Discussion of the causes of errors in practical gesture operation
      5.7 Discussion of gesture motion properties
    CHAPTER 6 CONCLUSION
      6.1 Factors influencing the difficulty of learning gestures
      6.2 Study suggestions and future development
    REFERENCES
    Appendix A Images of the motion gestures collected in Experiment 1
    Appendix B Subjective questionnaire

    Billinghurst, S. S., & Vu, K.-P. L. (2015). Touch screen gestures for web browsing tasks. Computers in Human Behavior, 53, 71-81. doi: 10.1016/j.chb.2015.06.012
    Boring, S., Baur, D., Butz, A., Gustafson, S., & Baudisch, P. (2010). Touch projector: mobile interaction through video. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Atlanta, Georgia, USA.
    Bragdon, A., Nelson, E., Li, Y., & Hinckley, K. (2011). Experimental analysis of touch-screen gesture designs in mobile environments. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada.
    Chaparro, A., Rogers, M., Fernandez, J., Bohan, M., Choi, S. D., & Stumpfhauser, L. (2000). Range of motion of the wrist: implications for designing computer input devices for the elderly. Disability & Rehabilitation, 22(13), 633-637.
    Choi, E., Kwon, S., Lee, D., Lee, H., & Chung, M. K. (2014). Towards successful user interaction with systems: Focusing on user-derived gestures for smart home systems. Applied Ergonomics, 45(4), 1196-1207.
    Dandekar, K., Raju, B. I., & Srinivasan, M. A. (2003). 3-D Finite-Element Models of Human and Monkey Fingertips to Investigate the Mechanics of Tactile Sense. Journal of Biomechanical Engineering, 125(5), 682-691. doi: 10.1115/1.1613673
    Forlines, C., Wigdor, D., Shen, C., & Balakrishnan, R. (2007). Direct-touch vs. mouse input for tabletop displays. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, San Jose, California, USA.
    Hinrichs, U., & Carpendale, S. (2011). Gestures in the wild: studying multi-touch gesture sequences on interactive tabletop exhibits. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada.
    Hornyak, T. (2013). FingerLink turns paper into touch screens. Retrieved from http://www.cnet.com/news/fingerlink-turns-paper-into-touch-screens/
    Ibañez, R., Soria, Á., Teyseyre, A., & Campo, M. (2014). Easy gesture recognition for Kinect. Advances in Engineering Software, 76, 171-180. doi: 10.1016/j.advengsoft.2014.07.005
    Johanson, B., Hutchins, G., Winograd, T., & Stone, M. (2002). PointRight: experience with flexible input redirection in interactive workspaces. Paper presented at the Proceedings of the 15th annual ACM symposium on User interface software and technology, Paris, France.
    Löcken, A., Hesselmann, T., Pielot, M., Henze, N., & Boll, S. (2012). User-centred process for the definition of free-hand gestures applied to controlling music playback. Multimedia Systems, 18(1), 15-31. doi: 10.1007/s00530-011-0240-2
    Lee, J. C. (2008). Hacking the Nintendo Wii Remote. IEEE Pervasive Computing, 7(3), 39-45. doi: 10.1109/MPRV.2008.53
    LEIA Display System. (2014). Retrieved from http://www.leiadisplay.com/
    Mac Basics: Multi-Touch gestures. (2013). Retrieved from http://support.apple.com/kb/HT4721?viewlocale=en_US&locale=en_US
    Nielsen, J. (1992). The Usability Engineering Life Cycle. Computer, 25(3), 12-22. doi: 10.1109/2.121503
    Nielsen, M., Störring, M., Moeslund, T., & Granum, E. (2004). A Procedure for Developing Intuitive and Ergonomic Gesture Interfaces for HCI. In A. Camurri & G. Volpe (Eds.), Gesture-Based Communication in Human-Computer Interaction (Vol. 2915, pp. 409-420): Springer Berlin Heidelberg.
    Plasencia, D. M., Joyce, E., & Subramanian, S. (2014). MisTable: reach-through personal screens for tabletops. Paper presented at the Proceedings of the 32nd annual ACM conference on Human factors in computing systems, Toronto, Ontario, Canada.
    Prasad, S., Kumar, P., & Sinha, K. P. (2014, August). A wireless dynamic gesture user interface for HCI using hand data glove. Paper presented at the 2014 Seventh International Conference on Contemporary Computing (IC3).
    Radhakrishnan, S., Lin, Y., Zeid, I., & Kamarthi, S. (2013). Finger-based multitouch interface for performing 3D CAD operations. Int. J. Hum.-Comput. Stud., 71(3), 261-275. doi: 10.1016/j.ijhcs.2012.07.004
    Rempel, D., Camilleri, M. J., & Lee, D. L. (2014). The design of hand gestures for human-computer interaction: Lessons from sign language interpreters. Int. J. Hum.-Comput. Stud., 72(10-11), 728-735. doi: 10.1016/j.ijhcs.2014.05.003
    Saffer, D. (2008). Designing Gestural Interfaces: Touchscreens and Interactive Devices: O'Reilly Media, Inc.
    Silpasuwanchai, C., & Ren, X. (2015). Designing concurrent full-body gestures for intense gameplay. International Journal of Human-Computer Studies, 80, 1-13. doi: 10.1016/j.ijhcs.2015.02.010
    SMART. (2014). SMART LightRaise™ 60wi interactive projector. Retrieved from http://education.smarttech.com/en/products/lightraise-interactive-projector
    Vismile. (n.d.). Fit for the future. Retrieved from http://www.vismile.com.tw/en/main.html
    Wellner, P. (1991). The DigitalDesk calculator: tangible manipulation on a desk top display. Paper presented at the Proceedings of the 4th annual ACM symposium on User interface software and technology, Hilton Head, South Carolina, USA.
    Wobbrock, J. O., Morris, M. R., & Wilson, A. D. (2009). User-defined gestures for surface computing. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Boston, MA, USA.
    Zhang, Z. (2012). Microsoft Kinect Sensor and Its Effect. IEEE MultiMedia, 19(2), 4-10. doi: 10.1109/MMUL.2012.24

    Full text: off-campus access available immediately