
Graduate Student: Lu, Ming-Feng (呂明峰)
Thesis Title: Design and Implementation of SOPC Based Image and Control System for Small Size Humanoid Robot (以SOPC為基礎的小型人形機器人視覺與控制系統之設計與實現)
Advisor: Li, Tzuu-Hseng S. (李祖聖)
Degree: Master
Department: College of Electrical Engineering and Computer Science - Department of Electrical Engineering
Year of Publication: 2007
Graduation Academic Year: 95 (ROC calendar, 2006-07)
Language: English
Number of Pages: 90
Chinese Keywords: image (影像), humanoid robot (人形機器人)
English Keywords: image, humanoid robot
    This thesis investigates the development of an SOPC-based vision system and control strategy for a small-size humanoid robot. The main function of the vision system is pattern recognition. Image frames are first captured by a CMOS sensor; filters then remove color information and suppress noise, and edge processing extracts the required image features for analysis. Finally, geometric figures and arrows are recognized through slope and line-equation calculations. The control strategy centers on the Penalty Kick and other events of the FIRA competition, describing how the robot carries out offense, defense, and other strategies. By driving two small servomotors that steer the CMOS sensor, the robot searches for and tracks a specific target through a fuzzy controller; it then captures frames and analyzes the situation on the field to decide its actions. The thesis first discusses the robot's mechanism, including the motors, processing unit, power system, and vision system. It then explains the vision side in detail, covering the algorithms for recognizing geometric figures and arrow directions, the method for tracking targets, and the control strategies for the Penalty Kick and other events. Finally, the experiments and the results of the FIRA HuroCup competition demonstrate the small humanoid robot's pattern recognition capability as well as its performance and robustness.

    This thesis is mainly concerned with the development of an SOPC-based image and control strategy system for a small-size humanoid robot. The main function of the image system is pattern recognition: several geometric figures and the directions of arrows can be recognized simultaneously in a complex environment. The image is captured by a CMOS sensor; color information and noise are filtered out first, and an edge detection approach is then utilized to find the feature points for analyzing the image. The control strategy conforms to the rules of the Penalty Kick and other events of the FIRA competition. We describe the methods of the offense and defense modes as well as other strategies. The robot can track specific objects via the CMOS sensor, which is steered by two small servomotors. The motions of the robot are decided by analyzing the condition of the environment from the captured image. The thesis begins by discussing the mechanism of the robot, including the actuators, central processing unit, power system, and vision system. It then describes in detail how the complex background is handled and presents the pattern recognition algorithms, the method for tracking objects, and the control strategies for the Penalty Kick and other events. Finally, the experimental results demonstrate the robot's pattern recognition capability and its efficiency and robustness in the FIRA HuroCup.
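    The recognition pipeline sketched in the abstract (edge extraction followed by slope analysis) can be illustrated minimally: given an ordered list of contour samples produced by an edge detector, corners are points where the local slope changes abruptly, and the corner count distinguishes the basic geometric figures. The function names, turn-angle threshold, and figure labels below are illustrative assumptions, not the thesis's actual implementation.

```python
import math

def count_corners(contour, angle_thresh=20.0):
    """Count corners on a closed contour by detecting abrupt slope changes.
    `contour` is an ordered list of (x, y) samples along the figure's edge."""
    n = len(contour)
    corners = 0
    for i in range(n):
        (x0, y0) = contour[i - 1]
        (x1, y1) = contour[i]
        (x2, y2) = contour[(i + 1) % n]
        # Orientation of the incoming and outgoing edge segments, in degrees.
        a_in = math.degrees(math.atan2(y1 - y0, x1 - x0))
        a_out = math.degrees(math.atan2(y2 - y1, x2 - x1))
        # Signed turn angle folded into [-180, 180), taken in magnitude.
        turn = abs((a_out - a_in + 180.0) % 360.0 - 180.0)
        if turn > angle_thresh:
            corners += 1
    return corners

def classify(contour):
    """Map the corner count to a figure label (labels are illustrative)."""
    names = {3: "triangle", 4: "rectangle/square", 5: "pentagon"}
    return names.get(count_corners(contour), "circle or unknown")
```

    A circle yields no abrupt slope changes and falls through to the default label; against real CMOS sensor noise the turn-angle threshold would need tuning.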

    Abstract
    Acknowledgment
    Contents
    List of Figures
    List of Tables
    Chapter 1. Introduction
      1.1 Motivation
      1.2 Thesis Organization
    Chapter 2. Mechanism and Hardware of the Small Size Humanoid Robot
      2.1 Introduction
      2.2 Design of Mechanism
      2.3 The Hardware and Software of the Robot
        2.3.1 Actuator
        2.3.2 Central Processing Unit
        2.3.3 Power System
        2.3.4 Vision System
      2.4 Summary
    Chapter 3. Vision System
      3.1 Introduction
      3.2 Image Format
      3.3 Pattern Recognition
        3.3.1 Searching the Targets in a Complex Background
        3.3.2 Edge Detection Approach
        3.3.3 The Algorithm of Pattern Recognition
      3.4 Summary
    Chapter 4. Control Strategy System
      4.1 Introduction
      4.2 The Fuzzy Based 2D Tracking Method
        4.2.1 The Principle of Tracking Command
        4.2.2 The Fuzzy Logic Controller for Tracking Motors
      4.3 The Control Strategy for Penalty Kick
        4.3.1 Overview of the Penalty Kick Event
        4.3.2 The Strategy for Offense Mode
        4.3.3 The Strategy for Defense Mode
      4.4 The Control Strategy for Other Events
        4.4.1 Overview of Other Events
        4.4.2 The Strategy for Basketball Event
        4.4.3 The Strategy for Marathon Event
      4.5 Summary
    Chapter 5. Experimental Results
      5.1 Introduction
      5.2 Experimental Results of Pattern Recognition
      5.3 Experimental Results of Strategies
    Chapter 6. Conclusions and Future Works
      6.1 Conclusions
      6.2 Future Works
    References
    Biography
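    Section 4.2 lists a fuzzy logic controller that drives the two tracking servomotors. A minimal single-axis sketch, assuming triangular membership functions, singleton consequents, and centroid defuzzification — the breakpoints and step sizes below are invented for illustration, not taken from the thesis:

```python
def tri(x, a, b, c):
    """Triangular membership function rising on [a, b], falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_pan_step(err):
    """Map the target's horizontal pixel error (target_x - center_x,
    normalized to [-1, 1]) to a pan-servo step in degrees, via a tiny
    Mamdani-style rule base with centroid defuzzification."""
    # Antecedents: error is Negative-Large, Zero, or Positive-Large.
    mu_nl = tri(err, -2.0, -1.0, 0.0)
    mu_ze = tri(err, -0.5, 0.0, 0.5)
    mu_pl = tri(err, 0.0, 1.0, 2.0)
    # Consequent singletons: pan left fast, hold, pan right fast.
    steps = (-5.0, 0.0, 5.0)
    weights = (mu_nl, mu_ze, mu_pl)
    total = sum(weights)
    return sum(w * s for w, s in zip(weights, steps)) / total if total else 0.0
```

    The same controller would be instantiated twice, once for pan and once for tilt, each fed the normalized pixel error on its axis.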


    Full text released on campus: 2009-07-17
    Full text released off campus: 2009-07-17