
Graduate Student: Chang, Ming-Shaung (張銘祥)
Thesis Title: Intelligent Outdoor Clean Robot with Friendly Human-Robot Interface System (具友善人機介面之智慧型戶外清潔機器人)
Advisor: Chou, Jung-Hua (周榮華)
Degree: Doctor
Department: Department of Engineering Science, College of Engineering
Year of Publication: 2010
Graduation Academic Year: 98 (ROC calendar, i.e., 2009-2010)
Language: English
Number of Pages: 110
Keywords (Chinese): 戶外清潔機器人, 單影像三維測距, 全域最短路徑, 人機介面系統, 動態手勢辨識
Keywords (English): Outdoor clean robot, Single image 3D measurement model, Global-path planning, Human-robot interface system, Hand gesture recognition
Access Counts: Views: 214; Downloads: 9
This dissertation develops an intelligent outdoor clean robot system with a friendly human-robot interface. It comprises the design and implementation of a clean robot for outdoor environments; an autonomous navigation system for unknown environments based on the Floyd-A* algorithm, smoothed global shortest-path planning, and reconstruction of a 3D map of the working environment; and a remote, friendly human-robot interface driven by the user's dynamic, successive hand-gesture commands.
To overcome disturbances in various unknown outdoor cleaning environments, such as rough terrain, fallen-leaf detection, and changing weather, we first adopt a track-driven design instead of conventional wheels, giving the clean robot the ability to traverse uneven surfaces. Cleaning-region detection in the working environment is accomplished by an image-ultrasonic sensing module, whose processing pipeline includes external light-source compensation, Gaussian color filtering of the cleaning region with K-means clustering for background removal, fuzzy fusion of the sensor data, and 2D cleaning-environment map building. Finally, a custom cleaning mechanism sweeps fallen leaves along a boustrophedon complete-coverage closed path, and the robot performs self-localization and trash dumping according to road edges or artificial marks at leaf-collection points.
To obtain the global shortest path from a start to a goal in an unknown environment, the system first uses a single-image 3D measurement model to compute the actual positions of obstacles in the scene and builds a 3D working-environment map accordingly. Within the sensing range, the Floyd-Warshall algorithm finds the shortest sub-path from the start to a local sub-goal; this path is smoothed by an excircle algorithm over its path nodes, and the A* algorithm then determines appropriate local sub-goals and the global shortest smoothed path.
In addition, to provide remote users with a friendly, intelligent human-robot interface, the system controls the clean robot through the user's head position and dynamic hand-gesture commands. We first detect faces by three methods, namely an ellipse template, facial features, and PCA similarity, and track them with a fuzzy-Kalman method; hands are then detected and recognized with a neural-network-PCA classifier, and the whole clean robot system is controlled by finger-pointing direction information together with successive commands obtained via dynamic programming.
Both simulations and experiments in real environments verify the generality of the proposed methods; the results confirm that the developed system achieves its goals quickly and accurately.

In this dissertation, an intelligent outdoor clean robot system with a friendly human-robot interface (HRI) is developed. The system consists of three main parts: the design of unmanned clean robots; the search for an optimal global path based on the Floyd-A* algorithm in unknown environments, together with 3D map building from 2D scene images; and the implementation of the friendly HRI.
To cope with internal and external disturbances in various unknown outdoor environments, e.g., rough terrain, fallen leaves, or varying weather, several novel designs are implemented in this dissertation. Firstly, a track motion mechanism is adopted instead of wheels. Secondly, both image and ultrasonic sensors are used to detect cleaning regions in the working environment; this procedure uses white balance for varying light sources, a Gaussian color filter and K-means clustering for background segmentation, fuzzy data fusion, and 2D cleaning-map generation. Thirdly, the robot sweeps fallen leaves in a closed cleaning region with a special cleaning-brush mechanism following a boustrophedon coverage path. Finally, the robot localizes itself and dumps trash autonomously via road-edge detection and garbage-collection mark recognition, respectively.
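To make the boustrophedon coverage step concrete, the following minimal Python sketch sweeps an occupancy grid row by row in alternating directions. The grid encoding (0 = free cell, 1 = obstacle) and the `boustrophedon_path` helper are illustrative assumptions, not the dissertation's actual planner, which operates on cleaning-region polygons detected by the sensor module.

```python
import numpy as np

def boustrophedon_path(grid):
    """Sweep a 2D occupancy grid row by row, alternating the sweep
    direction on each row (a boustrophedon, 'ox-plough' pattern).
    Obstacle cells (1) are skipped; free cells (0) are visited in
    sweep order."""
    rows, cols = grid.shape
    path = []
    for r in range(rows):
        # even rows sweep left-to-right, odd rows right-to-left
        cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in cs:
            if grid[r, c] == 0:
                path.append((r, c))
    return path

# toy cleaning map: 0 = free cell to sweep, 1 = obstacle
grid = np.array([[0, 0, 0, 0],
                 [0, 1, 1, 0],
                 [0, 0, 0, 0]])
print(boustrophedon_path(grid))
```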
In addition, we investigate fast and safe global-path planning from the starting point to a given goal in unknown environments for robot navigation, using only a single-image 3D measurement model to locate the 3D obstacles seen in 2D images. A 3D working map of the unknown environment is built accordingly. The shortest sub-path candidates within the sensing range are computed by the Floyd-Warshall algorithm and smoothed by an excircle calculation over the path nodes; the optimal global path to the goal is then determined by the A* algorithm.
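A minimal sketch of the Floyd-Warshall stage is given below: a generic all-pairs shortest-path routine over a small weighted graph of path nodes, with a helper to reconstruct a shortest sub-path. The adjacency matrix and node indices are toy values for illustration; the dissertation's Floyd-A* combination, excircle smoothing, and sensing-range handling are not reproduced here.

```python
import numpy as np

def floyd_warshall(w):
    """All-pairs shortest paths on a weighted adjacency matrix.
    w[i, j] is the edge cost from node i to node j (np.inf if no
    edge, 0 on the diagonal). Returns the distance matrix and a
    'next hop' matrix for reconstructing shortest sub-paths."""
    n = w.shape[0]
    dist = w.astype(float)
    nxt = np.where(np.isfinite(w), np.tile(np.arange(n), (n, 1)), -1)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i, k] + dist[k, j] < dist[i, j]:
                    dist[i, j] = dist[i, k] + dist[k, j]
                    nxt[i, j] = nxt[i, k]
    return dist, nxt

def reconstruct(nxt, i, j):
    """Rebuild the node sequence of the shortest path i -> j."""
    if nxt[i, j] == -1:
        return []
    path = [i]
    while i != j:
        i = int(nxt[i, j])
        path.append(i)
    return path

inf = np.inf
w = np.array([[0,   2,   inf, 7],
              [2,   0,   3,   inf],
              [inf, 3,   0,   1],
              [7,   inf, 1,   0]])
dist, nxt = floyd_warshall(w)
print(dist[0, 3], reconstruct(nxt, 0, 3))  # 6.0 [0, 1, 2, 3]
```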
Furthermore, we establish a friendly and intelligent HRI system that lets distant users control the robot using only their facial positions and natural dynamic gestures. Face detection, combining ellipse-shape, facial-feature, and PCA-similarity cues, and fuzzy-Kalman face tracking are implemented first. Real-time hand-gesture recognition via a PCA-BPANN classifier, finger-pointing direction detection, and dynamic programming over successive commands then allow users to instruct the robot to perform its tasks.
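The PCA feature-extraction stage that feeds such a gesture classifier can be sketched as follows. The training matrix of flattened gesture images, the component count `k`, and the nearest-neighbor matcher standing in for the BPANN are all illustrative assumptions, not the dissertation's trained network.

```python
import numpy as np

def fit_pca(X, k):
    """PCA via SVD. X is (n_samples, n_pixels): one flattened,
    grayscale hand-gesture image per row. Returns the sample mean
    and the top-k principal axes."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]

def project(x, mean, axes):
    """Project one flattened image into the k-D eigenspace."""
    return axes @ (x - mean)

# toy 'gesture templates': 20 random 16x16 images in 2 classes
rng = np.random.default_rng(0)
X = rng.random((20, 256))
labels = np.repeat([0, 1], 10)
mean, axes = fit_pca(X, k=5)
feats = (X - mean) @ axes.T

# nearest neighbor in eigenspace stands in for the BPANN here
query = project(X[3], mean, axes)
pred = labels[np.argmin(np.linalg.norm(feats - query, axis=1))]
print(pred)  # class of the nearest training template
```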
Both simulations and experiments in real environments are conducted to verify the feasibility of the proposed methods. The results show that the clean robot with the proposed HRI performs its tasks quickly and accurately.

Contents

Abstract (English)
Abstract (Chinese)
Acknowledgement (Chinese)
Contents
List of Tables
List of Figures
Chapter 1 Introduction
  1.1 Motivation
  1.2 Literature Review
  1.3 Key Contributions of this Study
  1.4 Dissertation Organization
Chapter 2 Outdoor Clean Robot Design I
  2.1 System Skeleton
    2.1.1 Hardware Framework
    2.1.2 Cleaning Flowchart
  2.2 Sensor Modules and Data Fusion
    2.2.1 Ultrasonic Measurement Module
    2.2.2 Single Image Measurement Model
    2.2.3 Power Monitor Module
  2.3 Clean-region Detection
  2.4 Boustrophedon Complete Coverage Path-planning and Self-localization
    2.4.1 2D Image to Grid-cell Map
    2.4.2 Boustrophedon Complete Coverage Path Planning
    2.4.3 Self-localization
  2.5 Results and Discussions
Chapter 3 Outdoor Clean Robot Design II
  3.1 System Skeleton
    3.1.1 Hardware Framework
    3.1.2 Cleaning Flowchart
  3.2 Dual Sensor Module and Cleaning Region Detection
    3.2.1 Dual Sensor Module
    3.2.2 Cleaning-region Detection
  3.3 Data Fusion Based on Fuzzy Reliability
    3.3.1 Multi-measuring Mode
    3.3.2 Fuzzy Reliability
  3.4 Autonomous Trash-Dumping and Self-localization
    3.4.1 Mark Detection
    3.4.2 Mark Search and Robot Self-localization
  3.5 Results and Discussions
Chapter 4 Machine Vision-based Mobile Robot Navigation System in an Unknown Environment
  4.1 System Skeleton
    4.1.1 System Equipment
    4.1.2 System Flowchart
  4.2 Machine-vision Measurement, Object Detection and 3D Map-building
    4.2.1 Machine Vision-based 3D Measurement
    4.2.2 Object Detection
    4.2.3 3D Map-building
  4.3 Global Path Planning for Smoothing and Robot Localization
    4.3.1 A* Path Planning Based on Floyd Path in an Unknown Environment
    4.3.2 Path Smoothing
    4.3.3 Robot Localization
  4.4 Results and Discussions
Chapter 5 Human-Robot Interface System Based on Natural Human Gestures
  5.1 System Skeleton
    5.1.1 Image Pre-processing
    5.1.2 Human Detection
    5.1.3 Human Tracking
  5.2 Human Gesture Detection
    5.2.1 Hand Detection
    5.2.2 Finger-pointing Direction Measurement
  5.3 Dynamic Command Control of the HRI System
    5.3.1 Combination of PCA and BPANN
    5.3.2 Single Command
    5.3.3 Successive Command
  5.4 Results and Discussions
Chapter 6 Conclusions and Future Suggestions
  6.1 Conclusions
  6.2 Future Suggestions
References
Tables
Figures

List of Tables

Table 2.1 Fuzzy rule table of dual sensor module
Table 3.1 Symbol definition
Table 4.1 Path planning simulation results
Table 5.1 Facial geometric relationship rules
Table 5.2 Fuzzy rule table of face tracking
Table 5.3 Command database

List of Figures

Figure 1.1 An intelligent outdoor clean robot system with friendly HRI
Figure 2.1 The outdoor clean robot, SI-lab AW Robot I
Figure 2.2 Cleaning mechanism of SI-lab AW Robot I
Figure 2.3 Cleaning task flowchart of SI-lab AW Robot I
Figure 2.4 Polaroid 6500 sonar ranging module
Figure 2.5 Experimental result of ultrasonic module within 110 cm
Figure 2.6 Single image measurement model
Figure 2.7 Experimental result of single image measurement model
Figure 2.8 Characteristic function of supply batteries
Figure 2.9 Clean-region detection for cleaning task of SI-lab AW Robot I
Figure 2.10 4-direction search rule
Figure 2.11 Results of boustrophedon path planning
Figure 2.12 Line detection procedure and self-localization
Figure 2.13 Boundary line for self-localization of a scene with a curb
Figure 2.14 Boundary line for self-localization of a scene without any wall
Figure 2.15 Experimental results of clean-region detection
Figure 2.16 Results of boustrophedon path planning
Figure 2.17 Experimental results of practical outdoor cleaning work
Figure 3.1 Hardware framework of SI-lab AW Robot II
Figure 3.2 Function architecture of SI-lab AW Robot II
Figure 3.3 Track-driven module of SI-lab AW Robot II
Figure 3.4 PID controller with track-driven module
Figure 3.5 Cleaning mechanism of SI-lab AW Robot II
Figure 3.6 Switch of the gate of the trash box
Figure 3.7 System flowchart of SI-lab AW Robot II
Figure 3.8 Performance of dual sensor module
Figure 3.9 Image measuring results from 100 cm to 500 cm
Figure 3.10 Sonar measuring results from 40 cm to 350 cm
Figure 3.11 2D map-building for cleaning-region polygon
Figure 3.12 Cleaning-region detection of SI-lab AW Robot II
Figure 3.13 Multi-measuring mode
Figure 3.14 Fuzzy membership function of dual sensor module
Figure 3.15 Example of multi-measuring mode and fuzzy reliability
Figure 3.16 Autonomous trash-dumping flowchart
Figure 3.17 Trash-dumping mark
Figure 3.18 Mark detection
Figure 3.19 Cleaning-region detection of SI-lab AW Robot II
Figure 3.20 Cleaning region detection in the outdoor sandlot environment
Figure 3.21 Boustrophedon complete coverage path planning for Fig. 3.20
Figure 3.22 Outdoor cleaning in the sandlot environment
Figure 3.23 Outdoor cleaning in the lawn environment
Figure 4.1 System flowchart of robot auto-navigation
Figure 4.2 3D measurement model from a single image
Figure 4.3 Reference white
Figure 4.4 Edge enhancement and area filtering
Figure 4.5 3D measurement and 3D map-building
Figure 4.6 A* path planning based on Floyd algorithm
Figure 4.7 Path smoothing
Figure 4.8 Tangent line restriction
Figure 4.9 Path planning simulation
Figure 4.10 PID motor controller
Figure 4.11 Path planning simulations compared with other methods
Figure 4.12 Practical navigation result of case 1
Figure 4.13 Practical navigation result of case 2
Figure 4.14 Simulation navigation result of case 1
Figure 4.15 Simulation navigation result of case 2
Figure 4.16 3D map-building for practical navigation result 1
Figure 4.17 3D map-building for practical navigation result 2
Figure 5.1 Function framework of HRI system
Figure 5.2 System flowchart of HRI system
Figure 5.3 Skin-color filter
Figure 5.4 Simply varying ellipse template
Figure 5.5 Fuzzy membership function of face tracking
Figure 5.6 Hand segmentation
Figure 5.7 Templates of hand gestures
Figure 5.8 Finger-pointing direction definition
Figure 5.9 Shoulder detection
Figure 5.10 Finger-tip detection
Figure 5.11 Finger-pointing direction measurement
Figure 5.12 Dynamic successive command recognition
Figure 5.13 Gesture command recognition on remote PC
Figure 5.14 Image process on the clean robot
Figure 5.15 Practical actions on the clean robot


Full-text availability: on campus from 2015-06-07; off campus from 2015-06-07.