| Graduate student: | 張峻銘 Chang, Chun-Ming |
|---|---|
| Thesis title: | 人形機器人足球賽視覺與策略系統之設計與實現 Design and Implementation of Vision and Strategy System for Humanoid Robot Soccer Competition |
| Advisor: | 李祖聖 Li, Tzuu-Hseng S. |
| Degree: | Master |
| Department: | College of Electrical Engineering and Computer Science – Department of Electrical Engineering |
| Year of publication: | 2009 |
| Graduation academic year: | 97 |
| Language: | English |
| Pages: | 95 |
| Chinese keywords: | collaboration strategy, self-localization, control strategy, robot soccer game, image processing system |
| English keywords: | collaboration strategy, control strategy, image processing system, robot soccer game, localization |
This thesis investigates the vision and strategy system used in humanoid robot soccer competitions. Because the humanoid robot soccer game provides a dynamic and varied environment, it serves as a test platform for research in many areas, including motion control, image processing, self-localization, and collaboration strategy. Images are captured by a CMOS sensor and processed on a PDA that serves as the embedded image processing system. This thesis proposes a fast image-block segmentation algorithm and improves the conventional randomized circle detection method to recognize the ball more quickly. Line features on the field are detected with an iterative end-point fitting method, and color information is used to identify the goals and landmark poles. All control strategies are computed on a NIOS FPGA control board. The robot localizes itself with a uniform-distribution Monte Carlo method, and a probability grid map records the localization information. Ball tracking is achieved by a fuzzy controller driving the two DC motors in the head. The robots cooperate through a wireless network, and dynamic role assignment yields the best offensive and defensive performance. Finally, experimental and competition results fully demonstrate the superior performance and robustness of the humanoid robot's vision and strategy system.
The robot soccer game presents a dynamic and complex environment. It is a challenging platform for multi-agent research, involving topics such as motion control, image processing algorithms, localization, and collaboration strategy. This thesis mainly concerns the development of the vision and strategy system for humanoid robot soccer competition. The image information is captured by a CMOS sensor and computed by an embedded image processing system implemented on a PDA. The main function of the image system is feature recognition; most salient features on the field can be recognized efficiently. A fast object segmentation method that requires only one execution loop is proposed. An improved Randomized Circle Detection (RCD) method is adopted to realize ball recognition. The field line features are recognized using the Iterative End-Points Fitting Algorithm (IEPFA), and the border points of the goals and landmark poles can also be calculated correctly. The control strategy system is implemented on a NIOS FPGA board. The robot is localized via Uniform Monte Carlo Localization (Uniform MCL), and the robot's belief about its position is represented by a position probability grid map. We develop a fuzzy controller for ball tracking, and the collaboration strategy in the competition is also examined. Finally, the experimental results demonstrate the capability of the vision and strategy system, as well as its efficiency and validity in the RoboCup humanoid soccer competition.
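The randomized circle detection idea mentioned in the abstract can be illustrated with a minimal sketch: repeatedly sample three edge points, solve for the circle they determine, and keep the candidate supported by the most edge points. This is a generic RCD outline, not the thesis's exact improved algorithm; the function names, thresholds, and trial count are illustrative assumptions.

```python
import math
import random

def circle_from_points(p1, p2, p3):
    """Solve for the unique circle through three non-collinear points."""
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-9:
        return None  # points are (nearly) collinear; no circle
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy, math.hypot(ax - ux, ay - uy))

def rcd(edge_points, trials=200, tol=1.5, min_inliers=20):
    """Randomized circle detection: sample point triples and keep the
    candidate circle supported by the most edge points within `tol`."""
    best, best_count = None, 0
    for _ in range(trials):
        p1, p2, p3 = random.sample(edge_points, 3)
        circle = circle_from_points(p1, p2, p3)
        if circle is None:
            continue
        cx, cy, r = circle
        count = sum(1 for (x, y) in edge_points
                    if abs(math.hypot(x - cx, y - cy) - r) <= tol)
        if count > best_count:
            best, best_count = circle, count
    return best if best_count >= min_inliers else None
```

In practice the triples would be drawn from color-segmented ball-edge pixels, and an early-exit test on the fourth sampled point (as in the original RCD formulation) avoids scoring obviously bad candidates.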