
Author: 蔡坤佑 (Tsai, Kun-Yu)
Title: 行動裝置上之手語翻譯系統 (A Sign Language Translation System on Mobile Devices)
Advisor: 王宗一 (Wang, Tzone-I)
Degree: Master
Department: Department of Engineering Science, College of Engineering
Year of publication: 2016
Graduation academic year: 104
Language: Chinese
Number of pages: 51
Keywords (Chinese): 影像辨識 (image recognition), 手語翻譯 (sign language translation), 行動裝置 (mobile devices)
Keywords (English): Image Recognition, Sign Language Translation, Mobile Devices
    Sign language is the language commonly used by hearing-impaired people to communicate; meanings and sentences are built from information such as the shapes, movements, and positions of the two hands. The general public, however, rarely learns sign language, which makes direct communication with hearing-impaired people difficult. With the progress of information technology, especially in image processing and mobile communication devices, it should be feasible to design and build a portable sign language translation system that lets hearing-impaired people and the general public communicate easily.
    This study designs a sign language translation system that allows the general public to communicate with hearing-impaired people through sign language. A mobile device serves as the computing platform, and its camera captures the sign language images. Face detection is first used to obtain a skin-color sample for skin-color detection; image processing methods then extract the two same-colored hand regions, and hand-motion information is computed to compose the meaning of the sign. The motion information consists of hand shape, movement direction, and position. Hand shape is recognized by detecting interest points with Features from Accelerated Segment Test (FAST), describing them with Binary Robust Independent Elementary Features (BRIEF), and computing Hamming distances to find the most similar hand shape in the image database. Movement direction is obtained from the angle between two adjacent points on the trajectory of the hand's center of gravity. Hand position is determined by comparing the hands' positions with a preset threshold to decide whether each hand is high or low. The system currently achieves a hand-shape recognition rate of 91.1%, with a computation time of 90.59 ms per image. Field tests show that the general public and hearing-impaired people can already communicate smoothly with each other, and the semantic database can be extended at any time with new vocabulary, including single words and compound words.
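    The skin-color detection step described above can be illustrated with a minimal Python/OpenCV sketch: a face is found with a Haar cascade, a patch of the face is taken as the skin-color sample, its HSV histogram is back-projected onto the frame, and morphological closing and opening clean the resulting mask. The cascade file, patch location, histogram bin counts, and kernel size here are illustrative assumptions, not the exact settings used in the thesis.

    import cv2

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def skin_mask(frame_bgr):
        """Return a binary mask of skin-colored pixels, or None if no face is found."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) == 0:
            return None
        x, y, w, h = faces[0]
        # The central patch of the detected face serves as the skin-color sample.
        sample = frame_bgr[y + h // 4 : y + 3 * h // 4, x + w // 4 : x + 3 * w // 4]
        hsv_sample = cv2.cvtColor(sample, cv2.COLOR_BGR2HSV)
        hsv_frame = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        # Hue/Saturation histogram of the sample, back-projected onto the whole frame.
        hist = cv2.calcHist([hsv_sample], [0, 1], None, [30, 32], [0, 180, 0, 256])
        cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
        backproj = cv2.calcBackProject([hsv_frame], [0, 1], hist, [0, 180, 0, 256], scale=1)
        # Closing then opening (as in the thesis outline) removes small holes and noise.
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        mask = cv2.morphologyEx(backproj, cv2.MORPH_CLOSE, kernel)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        _, mask = cv2.threshold(mask, 50, 255, cv2.THRESH_BINARY)
        return mask

    In the system described above, the skin-colored regions other than the face are then searched for the two hand regions whose motion is tracked.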

    Sign language is used by hearing-impaired people to communicate; hand gestures and body motions convey information to others. Most people, however, never learn sign language and therefore cannot understand what hearing-impaired people are saying. With the advance of information technology, especially image recognition and processing together with mobile communication hardware and software, it should be possible to build a sign language translation system that enables ordinary people to communicate with hearing-impaired people who use sign language. The main purpose of this study is to design and implement a mobile-device-based translation system that uses a mobile device, such as a smartphone, to capture video of sign language from hearing-impaired users, translates it into text for ordinary people, and shows text typed by ordinary people to hearing-impaired users, repeating this exchange to complete the communication. The system targets daily-life situations, such as use in a hospital or a bank.
    The techniques involve capturing sign language images with a mobile device, applying a sequence of image processing operations to detect hand regions, and recognizing hand-gesture information in three parts. The first part obtains hand-shape information by using the Features from Accelerated Segment Test (FAST) and Binary Robust Independent Elementary Features (BRIEF) algorithms to detect interest points and generate descriptors; Hamming distances to the entries of an image database are then computed to find the most similar hand shape. The second part determines the movement directions of both hands by calculating the angle between the hands' center points in two successive captures. The third part determines each hand's position by comparing its center point with a threshold to decide whether the hand is in the low or the high position. In experiments, the hand-shape recognition precision of the sign language translation system reaches 91.1%, and the average time to recognize a hand shape is 90.59 ms.
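    The hand-shape matching and movement-direction steps can likewise be sketched in Python/OpenCV. The sketch below assumes a database of labelled BRIEF descriptor sets and scores each candidate hand shape by the mean Hamming distance over its cross-checked matches; the scoring rule, database layout, and angle convention are illustrative assumptions rather than the thesis' exact implementation, and the BRIEF extractor requires the opencv-contrib package.

    import cv2
    import numpy as np

    fast = cv2.FastFeatureDetector_create()
    # BRIEF lives in cv2.xfeatures2d (opencv-contrib-python).
    brief = cv2.xfeatures2d.BriefDescriptorExtractor_create()
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    def describe(gray_hand):
        """FAST keypoints + BRIEF binary descriptors for a cropped hand image."""
        keypoints = fast.detect(gray_hand, None)
        keypoints, descriptors = brief.compute(gray_hand, keypoints)
        return descriptors

    def best_hand_shape(query_desc, database):
        """database: list of (label, descriptors); return the label whose
        descriptors give the lowest mean Hamming distance to the query."""
        if query_desc is None:
            return None
        best_label, best_score = None, float("inf")
        for label, ref_desc in database:
            matches = matcher.match(query_desc, ref_desc)
            if not matches:
                continue
            score = np.mean([m.distance for m in matches])
            if score < best_score:
                best_label, best_score = label, score
        return best_label

    def move_direction(prev_center, cur_center):
        """Angle (degrees, 0-360) of the hand-centroid displacement between
        two successive frames; image y grows downward, hence the sign flip."""
        dx = cur_center[0] - prev_center[0]
        dy = prev_center[1] - cur_center[1]
        return float(np.degrees(np.arctan2(dy, dx))) % 360

    Binary descriptors keep the comparison inexpensive: a Hamming distance is an XOR followed by a popcount, which fits the roughly 90 ms per-image processing time reported above.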

    Table of Contents
    Abstract (Chinese)
    Extended Abstract
    Acknowledgements
    Table of Contents
    List of Tables
    List of Figures
    Chapter 1  Introduction
      1.1  Research Background and Motivation
      1.2  Research Objectives
      1.3  Thesis Organization
    Chapter 2  Literature Review
      2.1  Introduction to Taiwanese Sign Language
      2.2  Related Work on Sign Language Recognition
        2.2.1  Data-Glove Methods
        2.2.2  Image-Recognition Methods
    Chapter 3  Research Methods and System Flow
      3.1  System Architecture
      3.2  Hand Detection
        3.2.1  RGB and HSV Color Spaces
        3.2.2  Skin-Color Detection
          3.2.2.1  Face Detection
          3.2.2.2  Skin-Color Sample Extraction
          3.2.2.3  Skin-Color Histogram
          3.2.2.4  Skin-Color Back-Projection
        3.2.3  Morphology
          3.2.3.1  Erosion
          3.2.3.2  Dilation
          3.2.3.3  Closing
          3.2.3.4  Opening
        3.2.4  Region Search
        3.2.5  Hand-Detection Results and Flow
      3.3  Hand Recognition
        3.3.1  Hand-Shape Recognition
          3.3.1.1  Building the Gesture Database
          3.3.1.2  FAST Interest-Point Detection
          3.3.1.3  BRIEF Descriptors
          3.3.1.4  Hamming Distance
          3.3.1.5  Hand-Shape Recognition Flow
        3.3.2  Hand Movement-Direction Recognition
          3.3.2.1  Trajectory Feature Extraction
          3.3.2.2  Movement-Direction Recognition Flow
        3.3.3  Hand-Position Determination
      3.4  Semantic Recognition
        3.4.1  Semantic Decision Flow
        3.4.2  Sign-Language Word Information
    Chapter 4  Experimental Results and Analysis
      4.1  Experimental Environment
      4.2  Skin-Color Detection Experiments
      4.3  Hand-Shape Recognition Experiments
      4.4  Semantic Recognition Experiments
      4.5  Bank-Service Application System
    Chapter 5  Conclusions and Suggestions
      5.1  Conclusions
      5.2  Suggestions
    References


    Full-text availability: on campus, open from 2021-09-01; off campus, not available.
    The electronic thesis has not been authorized for public release; please consult the library catalog for the printed copy.