Author: Zeng, Syuan-Bao (曾宣寶)
Title: Implementation of Hand Gesture Recognition System Based on Leap Motion and Neural Network (基於類神經網路及Leap Motion手勢辨識系統之實現)
Advisor: Liao, Teh-Lu (廖德祿)
Degree: Master
Department: Department of Engineering Science, College of Engineering
Year of Publication: 2015
Graduation Academic Year: 103 (ROC calendar, 2014–2015)
Language: English
Pages: 55
Keywords: Leap Motion, neural network, hand gesture recognition, graphical user interface

Abstract:
    With the development of technology, the way humans and robots interact is gradually changing, with simple and intuitive operation as the primary design goal. In recent years, mobile robots have been widely discussed in robotics research, and remotely controlled cars and Unmanned Aerial Vehicles (UAVs) are currently the most popular topics. One purpose of mobile robots is their application in hazardous environments for tasks such as rescue, terrain survey, and tracking. Traditionally, mobile robots are operated with a remote controller or with commands issued from a computer; for inexperienced users, it is difficult to manage the robot's direction, speed, and attitude this way. This research therefore proposes a gesture-controlled interface that is simple and easy to operate, reducing the training time spent on the remote controller.
    This thesis is divided into three parts: the somatosensory sensor, the software interface, and the recognition algorithm. The somatosensory sensor is the Leap Motion controller, a hand-tracking platform based on infrared sensing and CMOS cameras that captures basic hand information such as finger coordinates, velocity, direction, and angle. For the software interface, a Graphical User Interface (GUI) is developed in C sharp (C#) on the Visual Studio 2012 platform and combined with the Software Development Kit (SDK) provided by Leap Motion to perform gesture recognition and display the extracted finger information to the user. The recognition algorithm is a three-layer Back-Propagation Neural Network (BPNN) that learns and recognizes the hand gesture digits 0, 1, 2, 3, ..., 9 from extracted hand features, namely the finger lengths and the angles between fingers. Finally, by pairing the right-hand and left-hand recognition results, combined commands for controlling the mobile robot's attitude are obtained, yielding up to 99 distinct actions.
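    To make the feature-extraction step concrete, the following is a minimal sketch (not the thesis's actual code) using the Leap Motion v2 C# SDK: it reads one tracking frame and builds, per detected hand, a feature vector of five finger lengths plus four angles between adjacent finger directions. The 9-element layout and the thumb-to-pinky finger ordering are illustrative assumptions, since the abstract does not fix the exact vector format.

// Minimal sketch: per-hand features from one Leap Motion frame
// (Leap SDK v2 C# bindings assumed; feature layout is illustrative).
using System.Collections.Generic;
using Leap;

class FeatureExtractor
{
    public static List<float[]> Extract(Controller controller)
    {
        var features = new List<float[]>();
        Frame frame = controller.Frame();        // most recent tracking frame
        foreach (Hand hand in frame.Hands)       // left and/or right hand
        {
            FingerList fingers = hand.Fingers;   // 5 fingers in the v2 hand model
            var f = new float[9];
            for (int i = 0; i < 5; i++)
                f[i] = fingers[i].Length;        // visible finger length in mm
            for (int i = 0; i < 4; i++)          // angle between neighbouring fingers
                f[5 + i] = fingers[i].Direction.AngleTo(fingers[i + 1].Direction);
            features.Add(f);                     // hand.IsRight distinguishes the hands
        }
        return features;
    }
}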
    In summary, this thesis integrates Leap Motion hand detection, the feature recognition system and GUI developed in C#, and a remotely controlled mobile robot, achieving the design goal of a real-time gesture control system that correctly recognizes the user's hand digit gestures and their combinations.
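    The recognition and command stages can be pictured with the sketch below: a trained three-layer BPNN maps a feature vector to a digit for each hand, and the two digits are paired into a single 0–99 command code. The layer sizes, the sigmoid activation, the weight values, and the tens/units encoding are assumptions for illustration; the thesis's own training procedure and command table are defined in Chapter 3.

// Illustrative sketch of the recognition stage: forward pass through a
// trained three-layer BPNN, then pairing of the per-hand digits.
using System;

class Bpnn
{
    readonly double[,] w1;    // input-to-hidden weights (trained offline)
    readonly double[] b1;     // hidden-layer biases
    readonly double[,] w2;    // hidden-to-output weights
    readonly double[] b2;     // output biases (10 units, one per digit)

    public Bpnn(double[,] w1, double[] b1, double[,] w2, double[] b2)
    {
        this.w1 = w1; this.b1 = b1; this.w2 = w2; this.b2 = b2;
    }

    static double Sigmoid(double x)
    {
        return 1.0 / (1.0 + Math.Exp(-x));
    }

    // One fully connected layer with sigmoid activation.
    static double[] Layer(double[] x, double[,] w, double[] b)
    {
        var y = new double[b.Length];
        for (int j = 0; j < b.Length; j++)
        {
            double s = b[j];
            for (int i = 0; i < x.Length; i++)
                s += w[i, j] * x[i];
            y[j] = Sigmoid(s);
        }
        return y;
    }

    // Forward pass: features -> hidden layer -> 10 outputs;
    // the recognized digit is the index of the strongest output unit.
    public int Classify(double[] features)
    {
        double[] output = Layer(Layer(features, w1, b1), w2, b2);
        int best = 0;
        for (int k = 1; k < output.Length; k++)
            if (output[k] > output[best]) best = k;
        return best;
    }

    // Pair the per-hand digits into one 0-99 code (assumed encoding:
    // right digit as tens, left digit as units).
    public static int CommandCode(int rightDigit, int leftDigit)
    {
        return rightDigit * 10 + leftDigit;
    }
}

    Under this assumed encoding, for example, a right-hand 3 with a left-hand 4 would yield code 34, which the controller would look up in its table of robot actions.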

    Table of Contents:
    Abstract (Chinese) I
    Abstract (English) III
    Acknowledgements V
    Contents VI
    List of Figures VIII
    List of Tables X
    CHAPTER 1 INTRODUCTION 1
      1.1 Background 1
      1.2 Motivation 2
      1.3 Thesis Organization 3
    CHAPTER 2 FUNDAMENTAL KNOWLEDGE 4
      2.1 Gesture Recognition Algorithm 4
        2.1.1 Artificial Neural Network 4
        2.1.2 Neural Network of Architecture 6
      2.2 Introduction of Leap Motion 8
        2.2.1 Hardware Structure 8
        2.2.2 Tracking Data 9
        2.2.3 Sensor Image 12
    CHAPTER 3 SYSTEM DESIGN AND ARCHITECTURE 14
      3.1 System Architecture 14
      3.2 Feature Extraction 16
        3.2.1 Finger Position and Regionalization 16
        3.2.2 Finger Length and Angle 18
      3.3 Gesture Recognition Based on Neural Network Algorithm 20
        3.3.1 Back-Propagation Neural Network Algorithm 20
        3.3.2 Training of Back-Propagation Neural Network 27
        3.3.3 Recognition of Back-Propagation Neural Network 32
    CHAPTER 4 EXPERIMENTAL RESULTS 33
      4.1 Numerical Simulation of Gesture Recognition 33
        4.1.1 Right-Hand Gesture of Training Performance 35
        4.1.2 Left-Hand Gesture of Training Performance 39
      4.2 Software Implementation (C#) 42
      4.3 Hardware Implementation of Robot 44
      4.4 Implementation and Its Verification 46
    CHAPTER 5 CONCLUSION AND FUTURE WORK 52
    REFERENCES 54

    Full text available: on campus from 2020-07-31; off campus from 2020-07-31.