
Graduate Student: Li, Ren-Yun
Thesis Title: Learning Tai-Chi Chun through The Body Sensor Network
Advisor: Lan, Kun-Chan
Degree: Master
Department: Department of Computer Science and Information Engineering, College of Electrical Engineering and Computer Science
Year of Publication: 2019
Graduation Academic Year: 108
Language: English
Pages: 278
Keywords: Motion recognition, deep learning, wearable devices, Tai-Chi

    Tai-Chi Chun is a sport with a long history in East Asia. It consists of a series of soft movements that strengthen and relax the body and mind. Older people can improve muscle function and heart rate variability by practicing Tai-Chi Chun.
    Beginners generally have difficulty during training because they do not know the key points of the different Tai-Chi Chun forms, and balancing and coordinating the whole body is not easy. Learners need to master the key points of each body part, such as shifts in the body's center of gravity. All changes of form originate from the center: in Tai-Chi Chun, the waist is the most important body part, and changes in the movements of the hands and feet come from the control of the waist. In addition to focusing on the waist, learners need to relax and loosen all the joints, particularly the shoulders and elbows, and sink them so that they are flexible, connected, and able to integrate into a proper structure. The discharging force of Tai-Chi Chun originates from the movement of the feet and legs and, through control of the waist, influences the motion of the back and arms.
    This thesis contributes a method to help beginners learn Tai-Chi Chun through a body sensor network. The user simply wears the wearable devices, and the movement data they collect is recorded over Bluetooth by a mobile phone. To identify a movement, the recorded data is fed into a trained model, and the user can use this feedback to improve their Tai-Chi Chun movements. Our motion recognition accuracy is 96.5%.
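    The pipeline above segments the Bluetooth-recorded sensor stream into fixed-length windows before classification, using the sliding-window approach mentioned in the related work. The following is a minimal sketch of that segmentation step; the six-channel IMU layout, window length, and step size here are illustrative assumptions, not values taken from the thesis.

    ```python
    import numpy as np

    def sliding_windows(samples, window_size, step):
        """Split a (T, C) array of sensor samples into overlapping windows.

        Returns an array of shape (num_windows, window_size, C); each
        window is later fed to the trained classifier as one input.
        """
        windows = []
        for start in range(0, len(samples) - window_size + 1, step):
            windows.append(samples[start:start + window_size])
        return np.stack(windows)

    # Example: a stream of 100 six-axis IMU samples (accel x/y/z, gyro x/y/z)
    stream = np.zeros((100, 6))
    w = sliding_windows(stream, window_size=20, step=10)
    print(w.shape)  # (9, 20, 6)
    ```

    With a 50% overlap (step = window_size / 2), each movement boundary appears near the center of at least one window, which is the usual reason this scheme is preferred over disjoint windows.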

    Abstract (Chinese) i
    Abstract ii
    Acknowledgements iii
    Contents iv
    List of Tables vi
    List of Figures vii
    Chapter 1 Introduction 1
    Chapter 2 Related Work 4
      2.1 Data segmentation 4
        2.1.1 Sliding window based approach 4
      2.2 Motion recognition 5
        2.2.1 Sensor based approach 5
        2.2.2 Motion recognition 6
      2.3 Sport training 10
    Chapter 3 Method 15
      3.1 System Architecture 15
      3.2 Body sensor network 16
        3.2.1 Placement of Device 16
        3.2.2 Sensor & MCU 20
      3.3 Smartphone 36
        3.3.1 Environment 36
        3.3.2 Data Records 37
        3.3.3 Data collection 38
        3.3.4 Recording video 44
      3.4 Desktop 45
        3.4.1 Environment 45
        3.4.2 Data labelling 45
        3.4.3 Data segmentation 46
        3.4.4 DNN model for motion classification 47
        3.4.5 From dense labeling to Tai-Chi Chun form recognition 56
    Chapter 4 Experimental Results 59
      4.1 Experiment design 59
        4.1.1 Data collection 59
        4.1.2 Experimental design 60
      4.2 Deep learning parameter comparison 61
        4.2.1 Different model comparison 61
        4.2.2 Different subsequence lengths of U-Net model comparison 64
        4.2.3 Different subset of U-Net model comparison 65
        4.2.4 Short prediction of U-Net model comparison 75
        4.2.5 Different feature of U-Net model comparison 78
      4.3 Best parameter in comparison result 79
    Chapter 5 Application System 81
      5.1 Application System workflow 81
        5.1.1 Body sensor network 81
        5.1.2 Smartphone 82
        5.1.3 Desktop 85
    Chapter 6 Conclusion & Future Work 88
    References 92
    Appendix 100
      1. Arduino code 100
      2. Android Studio code 102
      3. Data labeling code 195
      4. Deep learning code 240
      5. Application System code 263


    Full text available on campus: 2025-07-31
    Full text available off campus: 2025-07-31