
Graduate Student: Su, Kuan-Yun
Thesis Title: A Study on Touchless Interactional Hand Gesture Interfaces Design for Streaming Video Platforms
Advisor: Hsiao, Shih-Wen
Degree: Master
Department: College of Planning and Design - Department of Industrial Design
Year of Publication: 2024
Graduation Academic Year: 112
Language: English
Number of Pages: 108
Keywords: Human-Computer Interaction, Streaming Video Platforms, Usability, Touchless Hand Gesture Interaction, User Experience Design
    This thesis investigates the application of touchless interactional interfaces on streaming video platforms. It first introduces the development and rise of streaming video platforms, together with future trends in home entertainment devices and formats. The literature review then examines the background of gesture interaction, including gesture types and their roles in human-computer interaction, as well as gesture sensing technologies, covering projection methods and control modes.
    The experimental work consists of two phases: a pre-test questionnaire and a usability experiment. To understand users' experiences and expectations regarding streaming video platforms and touchless gesture operation, a preliminary survey was conducted by questionnaire; its final section asked respondents to rate the importance of the ten usability heuristics. The findings of this phase served as the basis for the design of the usability experiment.
    The usability experiment evaluates the actual performance and feasibility of gesture interaction on streaming video platforms and its impact on user experience. Two interfaces were designed and paired with two operation modes, yielding four operation schemes: bottom control interface with gesture operation, bottom control interface with cursor operation, side control interface with gesture operation, and side control interface with cursor operation. A depth camera and a large display served as the hardware, simulating the scenario of filtering videos on a streaming platform through touchless gesture interaction. The experiment recorded participants' task completion times and error rates, together with subjective System Usability Scale (SUS) scores, and a fuzzy comprehensive evaluation was applied to assess the characteristics, strengths, and weaknesses of each combination. The results show that the bottom control interface with cursor operation is the best of the four schemes.
    In response to the growing use of streaming video platforms, simpler and more intuitive ways of interacting with home entertainment devices are bound to become a popular research topic in human-computer interaction. Through a two-phase experimental implementation, this study evaluates the feasibility of touchless interactional interfaces on streaming video platforms and provides a concrete research framework and methodology for achieving this goal.

    This study explores the usability of touchless interactional interfaces, specifically their application to streaming video platforms. It first introduces the development and rise of streaming video platforms, as well as future trends in home entertainment devices and formats. The literature review then covers the background of gesture interaction, including the types of gestures and their roles in human-computer interaction, as well as gesture sensing technologies, including projection methods and control modes.
    The experimental implementation of this study consists of two phases: a pre-test questionnaire and a usability experiment. To gain a deeper understanding of users' experiences and expectations regarding streaming video platforms and touchless interactional operations, this study conducts a preliminary survey through a questionnaire, whose final section asks respondents to rate the importance of the ten usability heuristics. The results of this phase serve as the basis for the experimental design of the usability experiment.
    The goal of the usability experiment is to evaluate the actual performance and feasibility of gesture interaction on streaming video platforms and to explore its impact on user experience. We designed two interfaces paired with two operation modes, resulting in four operation schemes: Bottom Control Interface with Gesture Operation mode, Bottom Control Interface with Cursor Operation mode, Side Control Interface with Gesture Operation mode, and Side Control Interface with Cursor Operation mode. A depth camera and a large display are used as hardware to simulate the scenario of selecting videos on a streaming platform through touchless gesture interaction. The experiment records participants' task completion times and error rates, as well as subjective evaluation scores from the System Usability Scale (SUS). A fuzzy comprehensive evaluation method is employed to assess the characteristics, strengths, and weaknesses of each combination. The results indicate that the Bottom Control Interface paired with the Cursor Operation mode is the best combination among the four schemes.
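    The two quantitative instruments named above can be sketched briefly. The following is a minimal illustration, not the thesis's actual implementation: the SUS scoring follows Brooke's (1996) standard procedure, while the fuzzy comprehensive evaluation uses a generic weighted-average operator; the weights, membership matrix, and responses shown are hypothetical.

```python
import numpy as np

def sus_score(responses):
    """SUS score (0-100) from ten 1-5 Likert responses (Brooke, 1996).
    Odd-numbered items are positively worded: contribution = response - 1.
    Even-numbered items are negatively worded: contribution = 5 - response.
    The summed contributions (0-40) are scaled by 2.5."""
    r = np.asarray(responses)
    contributions = np.where(np.arange(10) % 2 == 0, r - 1, 5 - r)
    return float(contributions.sum() * 2.5)

def fuzzy_comprehensive_eval(weights, membership):
    """Fuzzy comprehensive evaluation with the weighted-average operator
    M(*, +): b = w . R, normalized so the evaluation grades sum to 1.
    weights: importance of each criterion (any positive scale);
    membership: rows = criteria, columns = evaluation grades."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    b = w @ np.asarray(membership, dtype=float)
    return b / b.sum()

# Hypothetical illustration (not the thesis's data): one participant's
# SUS responses, and a 3-criterion x 3-grade membership matrix.
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
print(fuzzy_comprehensive_eval(
    [0.5, 0.3, 0.2],
    [[0.6, 0.3, 0.1],
     [0.4, 0.4, 0.2],
     [0.2, 0.5, 0.3]]))
```

    In a scheme comparison such as the one described here, the grade vector returned for each interface/mode combination can be ranked (for example, by its weighted grade score) to identify the best-performing scheme.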
    In response to the growing use of streaming video platforms, more convenient and intuitive home entertainment device interactions are bound to become a popular research topic in human-computer interaction. Through a two-phase experimental implementation, this study evaluates the feasibility of touchless interactional interfaces on streaming video platforms and provides a concrete research framework and methodology to achieve this goal.

    ABSTRACT (Chinese)
    SUMMARY
    ACKNOWLEDGEMENTS
    TABLE OF CONTENTS
    LIST OF TABLES
    LIST OF FIGURES
    CHAPTER 1 INTRODUCTION
      1.1 Research Background
        1.1.1 Media Development and Streaming Media
        1.1.2 Human-Computer Interaction for Home Devices
      1.2 Research Motivation
      1.3 Research Purpose
      1.4 Research Scope and Limitations
      1.5 Research Structure
    CHAPTER 2 LITERATURE REVIEW
      2.1 Overview of Literature Structure
      2.2 Definition and Classification of Gestures
        2.2.1 Communicative Gestures
        2.2.2 Manipulative Gestures
      2.3 Gestures and Human-Computer Interaction Interfaces
        2.3.1 Definition and Classification of Gestures in HCI
        2.3.2 Gesture Selection
        2.3.3 Gesture-Based Interface Design
        2.3.4 Gesture and Human-Computer Interaction Technology Applications
      2.4 Interface Design
        2.4.1 Interface Design Development
        2.4.2 Usability Research
    CHAPTER 3 RESEARCH METHODS
      3.1 Fitts' Law
      3.2 Usability Testing
      3.3 Fuzzy Comprehensive Evaluation
    CHAPTER 4 RESEARCH PROCESS AND EXPERIMENT
      4.1 Pre-Test Questionnaire
        4.1.1 Pre-Test Questionnaire Objectives
        4.1.2 Pre-Test Questionnaire Structure
        4.1.3 Pre-Test Questionnaire Results
        4.1.4 Conclusion of the Pre-Test Questionnaire Survey
      4.2 Usability Experiment
        4.2.1 Experiment Objective
        4.2.2 Experiment Hardware and Software
        4.2.3 Experiment Environment
        4.2.4 Experiment Variables
        4.2.5 Variable Combinations
        4.2.6 Experiment Execution Process and Tasks
        4.2.7 Experiment Participants
    CHAPTER 5 EXPERIMENTAL DATA ANALYSIS RESULTS
      5.1 Completion Time and Error Rate
      5.2 Analysis of Control Interface and Task Difficulty
      5.3 Analysis of Control Modes and Task Difficulty
      5.4 Intuitive Task Results
      5.5 System Usability Scale (SUS) Results
    CHAPTER 6 CONCLUSION AND SUGGESTIONS
      6.1 Comprehensive Analysis
        6.1.1 Comprehensive Performance of Different Interfaces and Control Modes
        6.1.2 Correlation Between Operational Performance and SUS Scores
        6.1.3 Impact of Task Difficulty on Operational Efficiency
      6.2 Advantages and Improvement of Each Scheme
        6.2.1 Comparison of Control Modes
        6.2.2 Comparison of Control Interfaces
        6.2.3 Fuzzy Comprehensive Evaluation
      6.3 Future Research Directions
    REFERENCES
    Appendix A Pre-Test Questionnaire

    Aguilar-Lazcano, C. A., & Rechy-Ramirez, E. J. (2020). Performance analysis of Leap motion controller for finger rehabilitation using serious games in two lighting environments. Measurement, 157, 107677.
    Ameur, S., Khalifa, A. B., & Bouhlel, M. S. (2020, July). Hand-gesture-based touchless exploration of medical images with leap motion controller. In 2020 17th International multi-conference on systems, signals & devices (SSD) (pp. 6-11). IEEE.
    Balakrishnan, R. (2004). “Beating” Fitts’ law: virtual enhancements for pointing facilitation. International Journal of Human-Computer Studies, 61(6), 857-874.
    Baudel, T., & Beaudouin-Lafon, M. (1993). Charade: remote control of objects using free-hand gestures. Communications of the ACM, 36(7), 28-35.
    Bevan, N. (1995). Human-computer interaction standards. In Advances in human factors/ergonomics (Vol. 20, pp. 885-890). Elsevier.
    Beyer, G., & Meier, M. (2011). Music Interfaces for Novice Users: Composing Music on a Public Display with Hand Gestures. In NIME (pp. 507-510).
    Bhiri, N. M., Ameur, S., Alouani, I., Mahjoub, M. A., & Khalifa, A. B. (2023). Hand gesture recognition with focus on leap motion: An overview, real world challenges and future directions. Expert Systems with Applications, 120125.
    Bolt, R. A. (1980, July). "Put-that-there": Voice and gesture at the graphics interface. In Proceedings of the 7th annual conference on Computer graphics and interactive techniques (pp. 262-270).
    Boulabiar, M. I., Burger, T., Poirier, F., & Coppin, G. (2011). A low-cost natural user interaction based on a camera hand-gestures recognizer. In Human-Computer Interaction. Interaction Techniques and Environments: 14th International Conference, HCI International 2011, Orlando, FL, USA, July 9-14, 2011, Proceedings, Part II 14 (pp. 214-221). Springer Berlin Heidelberg.
    Brooke, J. (1996). SUS: A "quick and dirty" usability scale. Usability evaluation in industry, 189(3), 189-194.
    Buchmann, V., Violich, S., Billinghurst, M., & Cockburn, A. (2004, June). FingARtips: gesture based direct manipulation in Augmented Reality. In Proceedings of the 2nd international conference on Computer graphics and interactive techniques in Australasia and South East Asia (pp. 212-221).
    Cassell, J. (1998). A framework for gesture generation and interpretation. Computer vision in human-machine interaction, 191-215.
    Chagas, D. A., & Furtado, E. S. (2013, November). MoveRC: attention-aware remote control. In Proceedings of the 19th Brazilian symposium on Multimedia and the web (pp. 277-280).
    Choi, E., Kwon, S., Lee, D., Lee, H., & Chung, M. K. (2014). Towards successful user interaction with systems: Focusing on user-derived gestures for smart home systems. Applied ergonomics, 45(4), 1196-1207.
    Dinh, D. L., Kim, J. T., & Kim, T. S. (2014). Hand gesture recognition and interface via a depth imaging sensor for smart home appliances. Energy Procedia, 62, 576-582.
    Fikkert, W., Van Der Vet, P., van der Veer, G., & Nijholt, A. (2010). Gestures for large display control. In Gesture in Embodied Communication and Human-Computer Interaction: 8th International Gesture Workshop, GW 2009, Bielefeld, Germany, February 25-27, 2009, Revised Selected Papers 8 (pp. 245-256). Springer Berlin Heidelberg.
    Fitts, P. M. (1954). The information capacity of the human motor system in controlling the amplitude of movement. Journal of experimental psychology, 47(6), 381.
    Goth, G. (2011). Brave NUI world. Communications of the ACM, 54(12), 14-16.
    Hsiao, S.-W., Hsiao, H.-H., & Liang, S.-M. (2016). Improving product based on affordance with fuzzy theory for product development strategy. International Journal of Production Research, 54(18), 5523-5533.
    Hsiao, S. W., Lee, C. H., Yang, M. H., & Chen, R. Q. (2017). User interface based on natural interaction design for seniors. Computers in Human Behavior, 75, 147-159.
    Jota, R., Pereira, J. M., & Jorge, J. A. (2009). A comparative study of interaction metaphors for large-scale displays. In CHI'09 Extended Abstracts on Human Factors in Computing Systems (pp. 4135-4140).
    Kendon, A. (1988). How gestures can become like words. Hogrefe & Huber Publishers.
    Kim, H., Albuquerque, G., Havemann, S., & Fellner, D. W. (2005). Tangible 3D: Hand Gesture Interaction for Immersive 3D Modeling. IPT/EGVE, 2005, 191-9.
    Koenig, A., y Baena, F. R., & Secoli, R. (2021, August). Gesture-based teleoperated grasping for educational robotics. In 2021 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN) (pp. 222-228). IEEE.
    Kopper, R., Bacim, F., & Bowman, D. A. (2011, March). Rapid and accurate 3D selection by progressive refinement. In 2011 IEEE symposium on 3D user interfaces (3DUI) (pp. 67-74). IEEE.
    Kopper, R., Bowman, D. A., Silva, M. G., & McMahan, R. P. (2010). A human motor behavior model for distal pointing tasks. International journal of human-computer studies, 68(10), 603-615.
    Zadeh, L. A. (1965). Fuzzy sets. Information and Control, 8(3), 338-353.
    Löcken, A., Hesselmann, T., Pielot, M., Henze, N., & Boll, S. (2012). User-centred process for the definition of free-hand gestures applied to controlling music playback. Multimedia systems, 18, 15-31.
    Lou, X., Peng, R., Hansen, P., & Li, X. A. (2018). Effects of user’s hand orientation and spatial movements on free hand interactions with large displays. International Journal of Human–Computer Interaction, 34(6), 519-532.
    MacKenzie, I. S. (1992). Fitts' law as a research and design tool in human-computer interaction. Human-computer interaction, 7(1), 91-139.
    McNeill, D. (1985). So you think gestures are nonverbal?. Psychological review, 92(3), 350.
    McNeill, D. (1987). Psycholinguistics: A new approach. Harper & Row Publishers.
    McNeill, D. (1992). Hand and mind: What gestures reveal about thought. University of Chicago press.
    McNeill, D. (2008). Gesture: A psycholinguistic approach. In The Encyclopedia of Language and Linguistics (Psycholinguistics section) (pp. 58-66).
    Morrel-Samuels, P. (1990). Clarifying the distinction between lexical and gestural commands. International Journal of Man-Machine Studies, 32(5), 581-590.
    New, J. R., Hasanbelliu, E., & Aguilar, M. (2003, March). Facilitating user interaction with complex systems via hand gesture recognition. In Proceedings of the 2003 Southeastern ACM Conference, Savannah, GA.
    Ni, T., Bowman, D. A., North, C., & McMahan, R. P. (2011). Design and evaluation of freehand menu selection interfaces using tilt and pinch gestures. International Journal of Human-Computer Studies, 69(9), 551-562.
    Nielsen, J. (1994). Usability engineering. Morgan Kaufmann.
    Nielsen, J. (2005). Ten usability heuristics.
    Norman, D. A., & Nielsen, J. (2010). Gestural interfaces: a step backward in usability. interactions, 17(5), 46-49.
    Norman, D. A. (2013). The design of everyday things (Revised and expanded ed.). New York: Basic Books.
    Popovici, I., Schipor, O. A., & Vatavu, R. D. (2019). Hover: Exploring cognitive maps and mid-air pointing for television control. International Journal of Human-Computer Studies, 129, 95-107.
    Quek, F. (2004). The catchment feature model: A device for multimodal fusion and a bridge between signal and sense. EURASIP Journal on Advances in Signal Processing, 2004, 1-18.
    Quek, F. K. (1995). Eyes in the interface. Image and vision computing, 13(6), 511-525.
    Quek, F. K. (1996). Unencumbered gestural interaction. IEEE multimedia, 3(4), 36-47.
    Quek, F., McNeill, D., Bryll, R., Duncan, S., Ma, X. F., Kirbas, C., ... & Ansari, R. (2002). Multimodal human discourse: gesture and speech. ACM Transactions on Computer-Human Interaction (TOCHI), 9(3), 171-193.
    Rastgoo, R., Kiani, K., Escalera, S., & Sabokrou, M. (2021). Sign language production: A review. In Proceedings of the IEEE/CVF conference on computer vision and pattern recognition (pp. 3451-3461).
    Rautaray, S. S., & Agrawal, A. (2015). Vision based hand gesture recognition for human computer interaction: a survey. Artificial intelligence review, 43, 1-54.
    Ren, G., & O'Neill, E. (2012, March). 3D marking menu selection with freehand gestures. In 2012 IEEE Symposium on 3D User Interfaces (3DUI) (pp. 61-68). IEEE.
    Rimé, B., & Schiaratura, L. (1991). Gesture and speech.
    Santos, B. S., Cardoso, J., Ferreira, B. Q., Ferreira, C., & Dias, P. (2016). Developing 3d freehand gesture-based interaction methods for virtual walkthroughs: Using an iterative approach. In Handbook of Research on Human-Computer Interfaces, Developments, and Applications (pp. 52-72). IGI Global.
    Sauro, J. (2011). SUStisfied? Little-known system usability scale facts. User Experience Magazine, 10(3).
    Shannon, C. E. (1948). A mathematical theory of communication. The Bell system technical journal, 27(3), 379-423.
    Sharma, R. P., & Verma, G. K. (2015). Human computer interaction using hand gesture. Procedia Computer Science, 54, 721-727.
    Singh, P. K., Kundu, S., Adhikary, T., Sarkar, R., & Bhattacharjee, D. (2021). Progress of human action recognition research in the last ten years: a comprehensive survey. Archives of Computational Methods in Engineering, 1-41.
    Van Dam, A. (1997). Post-WIMP user interfaces. Communications of the ACM, 40(2), 63-67.
    Vatavu, R. D., & Zaiti, I. A. (2014, June). Leap gestures for TV: insights from an elicitation study. In Proceedings of the ACM International Conference on Interactive Experiences for TV and Online Video (pp. 131-138).
    Vinayak, Murugappan, S., Liu, H., & Ramani, K. (2013). Shape-It-Up: Hand gesture based creative expression of 3D shapes using intelligent generalized cylinders. Computer-Aided Design, 45(2), 277-287.
    Vogel, D., & Balakrishnan, R. (2005, October). Distant freehand pointing and clicking on very large, high-resolution displays. In Proceedings of the 18th annual ACM symposium on User interface software and technology (pp. 33-42).
    Vuletic, T., Duffy, A., Hay, L., McTeague, C., Campbell, G., & Grealy, M. (2019). Systematic literature review of hand gestures used in human computer interaction interfaces. International Journal of Human-Computer Studies, 129, 74-94.
    Wagner, P., Malisz, Z., & Kopp, S. (2014). Gesture and speech in interaction: An overview. Speech Communication, 57, 209-232.
    Wexelblat, A. (1995). An approach to natural gesture in virtual environments. ACM Transactions on Computer-Human Interaction (TOCHI), 2(3), 179-200.
    Wexelblat, A. (1997, September). Research challenges in gesture: Open issues and unsolved problems. In International Gesture Workshop (pp. 1-11). Berlin, Heidelberg: Springer Berlin Heidelberg.
    Zaiţi, I. A., Pentiuc, Ş. G., & Vatavu, R. D. (2015). On free-hand TV control: experimental results on user-elicited gestures with Leap Motion. Personal and Ubiquitous Computing, 19, 821-838.
    Zimmerman, T. G., Lanier, J., Blanchard, C., Bryson, S., & Harvill, Y. (1986). A hand gesture interface device. ACM Sigchi Bulletin, 18(4), 189-192.
    Zizic, M. C., Mladineo, M., Gjeldum, N., & Celent, L. (2022). From industry 4.0 towards industry 5.0: A review and analysis of paradigm shift for the people, organization and technology. Energies, 15(14), 5221.
    陳建雄 & 王建立 (2021). The development of academic research on gesture-based interaction interface design: A literature review and overview. Journal of Design, 26(1), 59-82.
    闕頌廉 (1992). Applied fuzzy mathematics. Taipei: 科技圖書公司.
    Nielsen, J. (2000, March 18). Why you only need to test with 5 users. Retrieved from https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/
    Solada, J. (2023, February 15). Getting an edge on the competition with the secret weapon of usability research. Retrieved from https://www.questionpro.com/blog/getting-an-edge-on-the-competition-with-the-secret-weapon-of-usability-research/
    Lee, J. (2020, April 14). UX research: Usability test (Part 2). Retrieved from https://reurl.cc/p3QorZ
    JustWatch streaming performance review (2023, March). Retrieved from https://www.spoilertv.com/2023/03/justwatch-streaming-performance-review.html
    Mouseover (2023, June 29). Retrieved from https://en.wikipedia.org/wiki/Mouseover
    OTT service (2023, September 17). Retrieved from https://zh.wikipedia.org/zhtw/OTT%E6%9C%8D%E5%8A%A1

    Full-text availability: On campus: available immediately; Off campus: available immediately.