| Author: | 徐嘉明 Shyu, Jia-Ming |
|---|---|
| Thesis Title: | 以視覺伺服為基礎之全向移動機器人追蹤控制 (Visual Servo Based Tracking Control of an Omnidirectional Mobile Robot) |
| Advisor: | 何明字 Ho, Ming-Tzu |
| Degree: | Master |
| Department: | College of Engineering - Department of Engineering Science |
| Publication Year: | 2007 |
| Graduation Academic Year: | 95 |
| Language: | Chinese |
| Pages: | 163 |
| Keywords (Chinese): | visual servo, digital signal processor |
| Keywords (English): | DSP, visual servo |
This thesis uses a digital signal processor (DSP) and image processing methods to construct a real-time visual tracking system, which is combined with an omnidirectional mobile robot to achieve trajectory tracking control. The system consists of an omnidirectional mobile robot driven by three motor actuators, an image processing module, two image sensors, wireless transmission modules, and a tracking controller. The image processing module is implemented on an Altera field programmable gate array (FPGA, EP1C12Q240C6); the image sensors are PAS106BBB monochrome CMOS image sensors from PixArt Technologies, used to detect the position of the robot; and the tracking controller is built around a Texas Instruments (TI) TMS320F2812 digital signal processor. In operation, the CMOS image sensors capture the position of the omnidirectional mobile robot, the FPGA executes the associated image processing algorithms, and the resulting position information is sent to the robot over a wireless link. Finally, trajectory tracking control is accomplished with a computed-torque and PID tracking controller. The theoretical development is validated through simulation and experiment, confirming the feasibility of the algorithms and completing vision-based tracking control of the omnidirectional mobile robot.
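The computed-torque scheme mentioned in the abstract can be illustrated with a minimal simulation: an inner loop cancels the (assumed known) inertia of the planar robot, while an outer PID loop shapes the tracking-error dynamics. This is only a sketch of the general technique, not the thesis's controller; the mass matrix, gains, and circular reference trajectory below are made-up values for illustration.

```python
import numpy as np

# Assumed simplified dynamics: M * qdd = u, with generalized
# coordinates q = (x, y, yaw) and generalized force u.
M = np.diag([10.0, 10.0, 0.5])    # mass, mass, yaw inertia (assumed)
Kp = np.diag([25.0, 25.0, 25.0])  # proportional gains (assumed)
Kd = np.diag([10.0, 10.0, 10.0])  # derivative gains (assumed)
Ki = np.diag([1.0, 1.0, 1.0])     # integral gains (assumed)

def computed_torque(q, qd, q_ref, qd_ref, qdd_ref, e_int):
    """Inner-loop inverse dynamics with an outer PID loop:
    u = M * (qdd_ref + Kp*e + Kd*de + Ki*∫e)."""
    e = q_ref - q
    de = qd_ref - qd
    v = qdd_ref + Kp @ e + Kd @ de + Ki @ e_int  # outer PID loop
    return M @ v                                  # model-based cancellation

# Simulate tracking of a circular reference trajectory.
dt, T = 0.01, 10.0
q = np.zeros(3); qd = np.zeros(3); e_int = np.zeros(3)
for k in range(int(T / dt)):
    t = k * dt
    q_ref = np.array([np.cos(t), np.sin(t), 0.0])
    qd_ref = np.array([-np.sin(t), np.cos(t), 0.0])
    qdd_ref = np.array([-np.cos(t), -np.sin(t), 0.0])
    u = computed_torque(q, qd, q_ref, qd_ref, qdd_ref, e_int)
    qdd = np.linalg.solve(M, u)   # plant: M * qdd = u
    qd += qdd * dt                # Euler integration of the plant
    q += qd * dt
    e_int += (q_ref - q) * dt

print(np.linalg.norm(q - q_ref))  # tracking error after 10 s
```

With exact model cancellation, each axis obeys the linear error dynamics ë + Kd·ė + Kp·e + Ki·∫e = 0, so the printed tracking error decays toward zero.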
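The position-detection step can likewise be sketched in a few lines: a bright marker on the robot is isolated by fixed thresholding and its image coordinates are taken as the centroid of the above-threshold pixels, the kind of lightweight per-pixel operation well suited to an FPGA pipeline. The synthetic frame, marker, and threshold value are assumptions for illustration, not the thesis's actual algorithm or parameters.

```python
import numpy as np

def marker_centroid(frame, threshold=128):
    """Return the (row, col) centroid of pixels above the threshold,
    or None if no pixel exceeds it."""
    ys, xs = np.nonzero(frame > threshold)
    if ys.size == 0:
        return None
    return ys.mean(), xs.mean()

# Synthetic 120x160 monochrome frame with a bright 5x5 marker
# centered at row 40, column 70 (hypothetical test image).
frame = np.zeros((120, 160), dtype=np.uint8)
frame[38:43, 68:73] = 255
print(marker_centroid(frame))  # → (40.0, 70.0)
```

In a real setup, the detected image coordinates would then be mapped to workspace coordinates and transmitted to the robot over the wireless link.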