| Author: | Yeh, Chung-Wei (葉中瑋) |
|---|---|
| Thesis Title: | A Sensor-based Gesture Recognition System Using Deep Belief Networks (基於感測器之深度信念網路於手勢辨識的應用) |
| Advisor: | Hu, Min-Chun (胡敏君) |
| Degree: | Master |
| Department: | College of Electrical Engineering and Computer Science, Department of Computer Science and Information Engineering |
| Year of Publication: | 2016 |
| Graduation Academic Year: | 104 |
| Language: | English |
| Number of Pages: | 44 |
| Keywords: | gesture recognition, deep belief networks, support vector machine, electromyography, accelerometer |
This thesis proposes a gesture recognition method based on the Myo armband and applies it to basketball referee training and virtual reality game control. A Myo armband consists of eight electromyography (EMG) sensors and one inertial measurement unit, and therefore produces two heterogeneous signal sources (EMG and accelerometer signals). Previous studies usually took statistical values as signal features, but such simple features are often insufficient to represent complex dynamic gestures. We therefore use deep belief networks to obtain more representative signal features, and combine time-domain features with the features extracted by the deep belief networks to obtain more robust classification results. After comparing several classifiers, we chose the support vector machine, which showed the best gesture classification performance, as our classification model. We also built a large Myo gesture dataset consisting of 42 kinds of gestures and 4,620 gesture signal sequences; to the best of our knowledge, it is the first publicly released and well-organized Myo gesture dataset. In addition, we built a graphical user interface so that a user wearing the Myo armband can operate the system and obtain recognition results directly through graphical elements, without typing commands on a keyboard. Finally, we propose a calibration mechanism to improve recognition accuracy: before using the system, the user only needs to perform each gesture once, and the system then retrains a dedicated classification model for that user to produce more accurate gesture recognition results.
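The abstract notes that the armband yields two heterogeneous streams (multi-channel EMG and three-axis acceleration). Below is a minimal sketch of how such streams might be segmented into time-aligned windows before feature extraction; the sampling rates (roughly 200 Hz for the 8-channel sEMG and 50 Hz for the IMU) are commonly cited figures for the Myo armband rather than values taken from the thesis, and the window and hop lengths are illustrative assumptions.

```python
# Sketch: cut the two Myo streams into time-aligned, fixed-length windows.
# Sampling rates and window/hop lengths are assumptions, not thesis settings.
import numpy as np

EMG_HZ, ACC_HZ = 200, 50          # assumed sampling rates
WIN_S, HOP_S = 0.5, 0.25          # 500 ms windows with 50% overlap (assumption)

def segment(emg, acc):
    """emg: (n_emg, 8) sEMG samples; acc: (n_acc, 3) accelerometer samples.
    Yields time-aligned (emg_window, acc_window) pairs."""
    emg_win, emg_hop = int(WIN_S * EMG_HZ), int(HOP_S * EMG_HZ)
    acc_win, acc_hop = int(WIN_S * ACC_HZ), int(HOP_S * ACC_HZ)
    n = min(len(emg) // emg_hop, len(acc) // acc_hop)
    for i in range(n):
        e = emg[i * emg_hop: i * emg_hop + emg_win]
        a = acc[i * acc_hop: i * acc_hop + acc_win]
        if len(e) == emg_win and len(a) == acc_win:
            yield e, a
```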
In this paper, we propose a gesture recognition system based on the Myo armband, which can be applied to the training of basketball referees and to virtual reality (VR) game control. A Myo armband consists of 8-channel surface electromyography (sEMG) sensors and an inertial measurement unit (IMU); therefore, two kinds of heterogeneous signals, i.e., multi-channel sEMG and three-axis accelerometer (ACC) data, are generated and can be fused to achieve hand gesture recognition. Statistical features are commonly used in previous research because they can be obtained with low computational complexity. However, such simple features are not enough to represent complex, dynamic gestures. In this work, Deep Belief Networks (DBNs) are trained to learn more representative features for gesture recognition, and time-domain features are combined with DBN-based features to achieve more robust recognition results. After investigating and comparing different classification models, we found that the Support Vector Machine (SVM) has the best classification ability in our application.
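The sketch below illustrates the feature-fusion idea described above: per-channel time-domain statistics are concatenated with features learned by stacked restricted Boltzmann machines (used here as a greedy, layer-wise stand-in for the DBN feature learning in the thesis), and the fused vector is classified by an SVM. The function names, feature set, network sizes, and hyper-parameters are illustrative assumptions, not the settings reported in the thesis.

```python
# Sketch: hand-crafted time-domain features + stacked-RBM features -> SVM.
# All hyper-parameters and helper names are illustrative assumptions.
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

def time_domain_features(window):
    """window: (n_samples, n_channels) segment -> simple per-channel statistics."""
    mav = np.mean(np.abs(window), axis=0)                 # mean absolute value
    rms = np.sqrt(np.mean(window ** 2, axis=0))           # root mean square
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)  # waveform length
    return np.concatenate([mav, rms, wl])

def build_classifier(X_windows, y, n_hidden=(128, 64)):
    """X_windows: (n_windows, n_samples, n_channels) array; y: gesture labels."""
    td = np.array([time_domain_features(w) for w in X_windows])

    # Greedy layer-wise RBM training on the flattened, rescaled raw windows,
    # used as a rough stand-in for DBN feature learning.
    scaler = MinMaxScaler()
    flat = scaler.fit_transform(X_windows.reshape(len(X_windows), -1))
    rbm1 = BernoulliRBM(n_components=n_hidden[0], n_iter=20, random_state=0).fit(flat)
    rbm2 = BernoulliRBM(n_components=n_hidden[1], n_iter=20,
                        random_state=0).fit(rbm1.transform(flat))
    deep = rbm2.transform(rbm1.transform(flat))

    # Feature-level fusion of hand-crafted and learned features, then SVM.
    clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(np.hstack([td, deep]), y)
    return scaler, (rbm1, rbm2), clf

def classify(window, scaler, rbms, clf):
    """Fuse features for one new window and predict its gesture label."""
    td = time_domain_features(window)[None, :]
    flat = scaler.transform(window.reshape(1, -1))
    deep = rbms[1].transform(rbms[0].transform(flat))
    return clf.predict(np.hstack([td, deep]))[0]
```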
To evaluate the proposed method, we collected a large Myo-based hand gesture dataset containing 4,620 sequential signals of 42 kinds of gestures. To the best of our knowledge, it is the first organized Myo dataset, and we release it to advance the progress of HCI technology. We also construct a graphical user interface (GUI) that enables users wearing the Myo armband to directly manipulate the graphical elements on the panel; the hand gesture recognition process is then triggered to interact with the computer without typing commands on the keyboard. A calibration mechanism is introduced to enhance the recognition accuracy by letting the user perform each gesture once before using the system. The gesture models are then updated according to the user's gesture signals to obtain more accurate results.
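As an illustration of the calibration step described above, the sketch below (reusing the hypothetical helpers from the previous sketch) refits the SVM after the wearer performs each gesture once, giving the few user-specific samples extra weight; the weighting scheme is an assumption, not the procedure reported in the thesis.

```python
# Sketch: user-specific calibration by refitting the SVM with one sample per
# gesture from the current wearer. The up-weighting factor is an assumption.
import numpy as np
from sklearn.svm import SVC

def calibrate(base_X, base_y, calib_X, calib_y, calib_weight=20.0):
    """base_X/calib_X are fused feature vectors (see the previous sketch);
    calib_X holds one calibration sample per gesture from the current user."""
    X = np.vstack([base_X, calib_X])
    y = np.concatenate([base_y, calib_y])
    # Up-weight the handful of calibration samples so they can shift the
    # decision boundaries toward this particular user.
    w = np.concatenate([np.ones(len(base_y)),
                        np.full(len(calib_y), calib_weight)])
    clf = SVC(kernel="rbf", C=10.0, gamma="scale")
    clf.fit(X, y, sample_weight=w)
    return clf
```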