
Student: 王翊 (Wang, Yi)
Title: 以影像為基礎之拋體軌跡預測及攔接 (Vision-based Projectile Trajectory Prediction and Projectile Catching)
Advisor: 何明字 (Ho, Ming-Tzu)
Degree: Master
Department: Department of Engineering Science, College of Engineering
Year of publication: 2015
Graduation academic year: 103 (ROC calendar, 2014-2015)
Language: Chinese
Pages: 167
Chinese keywords: 視覺追蹤系統 (visual tracking system), 全向移動機器人 (omnidirectional mobile robot), 卡門濾波器 (Kalman filter), 落點估測 (landing-point estimation)
English keywords: active stereo vision system, omnidirectional mobile robot, Kalman filter
    This thesis develops a multi-camera visual tracking system and uses it to guide an omnidirectional mobile robot to catch a projectile. In the visual tracking system, two image sensors emulate human binocular vision, and the projectile is tracked visually by its color information. The projectile coordinates are passed through a Kalman filter to estimate the landing point, and the omnidirectional mobile robot is guided to that position via a wireless communication module, while a ceiling-mounted camera feeds the vehicle position back to the robot for control. The overall system consists of the omnidirectional mobile robot, a panning camera platform, an image processing module, a digital signal processing module, and motor controllers. Because the target is identified by its color information, the system can perform catching control against complex backgrounds, and estimating the trajectory with the Kalman filter makes the visual tracking process smoother. Simulations and experiments analyze and verify the feasibility of the overall system: the visual tracking system can indeed estimate the landing point, and the omnidirectional mobile robot can catch the projectile.

    The main purpose of this thesis is to study a multi-camera visual tracking system, which guides an omnidirectional mobile robot to catch a projectile. In the visual tracking system, two image sensors are used to provide stereo vision, and the system tracks the thrown projectile according to its color information. After the landing point of the projectile has been estimated by a Kalman filter, it is sent to the omnidirectional mobile robot via a wireless communication module, and the robot moves to that point. Through simulation and experiments, this thesis demonstrates the feasibility of the designed system: the projectile is reliably tracked by the active stereo vision system, and the omnidirectional mobile robot successfully catches the projectile.
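The abstracts describe the core pipeline: triangulated projectile positions are smoothed by a Kalman filter and extrapolated ballistically to a landing point. The thesis's own formulation (Chapters 4-5) is not reproduced here; the following is a minimal illustrative sketch of that idea, in which the state layout `[x y z vx vy vz]`, the noise levels `q` and `r`, and the flat-floor assumption are all choices made for the example, not taken from the thesis.

```python
import numpy as np

GRAVITY = 9.81  # m/s^2, acting along -z

def make_kalman(dt, q=1e-3, r=5e-3):
    """Discrete constant-velocity model with gravity as a known input."""
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)                      # position integrates velocity
    u = np.array([0, 0, -0.5 * GRAVITY * dt**2,     # gravity's effect on position
                  0, 0, -GRAVITY * dt])             # and on vertical velocity
    H = np.hstack([np.eye(3), np.zeros((3, 3))])    # only position is measured
    Q = q * np.eye(6)                               # process noise covariance
    R = r * np.eye(3)                               # measurement noise covariance
    return F, u, H, Q, R

def kf_step(x, P, z, F, u, H, Q, R):
    """One predict/correct cycle given a triangulated 3-D measurement z."""
    x = F @ x + u                                   # predict with ballistic model
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R                             # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    x = x + K @ (z - H @ x)                         # correct with measurement
    P = (np.eye(6) - K @ H) @ P
    return x, P

def landing_point(x, floor_z=0.0):
    """Extrapolate the filtered state ballistically down to the floor plane."""
    z0, vz = x[2], x[5]
    disc = vz**2 + 2 * GRAVITY * (z0 - floor_z)
    if disc < 0:                                    # never reaches the floor
        return None
    t = (vz + np.sqrt(disc)) / GRAVITY              # positive root: time to impact
    return np.array([x[0] + x[3] * t, x[1] + x[4] * t])
```

Because the landing point can be re-estimated after every frame, the robot can start moving early and refine its goal as the filter converges, which is the behavior the abstract's "smoother tracking" claim refers to.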

    Abstract
    Extended Abstract
    Acknowledgments
    Table of Contents
    List of Figures
    List of Tables
    Chapter 1  Introduction
        1-1 Research Background
        1-2 Research Motivation and Objectives
        1-3 Research Procedure
        1-4 Literature Review
        1-5 Thesis Organization
    Chapter 2  Camera Model and Parameter Calibration
        2-1 Overview
        2-2 Pinhole Camera Model
        2-3 Coordinate Rotation and Translation
        2-4 Camera Parameter Calibration
        2-5 Multi-Camera Spatial Geometry and Object Coordinate Computation
    Chapter 3  Digital Image Processing
        3-1 Overview
        3-2 Color Space Models
            3-2-1 RGB Color Space Model
            3-2-2 YUV Color Space Model
            3-2-3 Color Space Conversion
        3-3 Moving-Target Detection
            3-3-1 Image Morphology
            3-3-2 Target Centroid Computation
    Chapter 4  Point-Mass Projectile Trajectory Prediction
        4-1 Overview
        4-2 Kalman Filter
        4-3 Point-Mass Projectile Kinematics
    Chapter 5  System Mathematical Models and Controller Design
        5-1 Overview
        5-2 Mathematical Model of the Visual Tracking System
        5-3 Dynamic Model of the Omnidirectional Mobile Robot Mechanism
        5-4 Mathematical Model and Parameter Identification of the Permanent-Magnet DC Motor
        5-5 Overall Mathematical Model of the Omnidirectional Mobile Robot
        5-6 Trajectory-Tracking Controller Design
            5-6-1 State-Feedback Linearization Controller Design
            5-6-2 PID Controller Design
    Chapter 6  System Hardware and Software Architecture
        6-1 Overview
        6-2 Overall System Architecture
        6-3 Omnidirectional Mobile Robot with Wireless Communication
            6-3-1 RF Wireless Communication Module
            6-3-2 Power Module
        6-4 Panning Camera Platform
        6-5 DM6437 Image Processing Module
        6-6 FPGA Digital Logic Module
        6-7 DSP Digital Signal Processing and Control Module
        6-8 PWM Motor Driver Module
    Chapter 7  Experimental Results
        7-1 Overview
        7-2 Simulation Results
        7-3 Visual Tracking Experiments
            7-3-1 Free-Fall Visual Tracking Experiment
            7-3-2 Point-Mass Projectile Visual Tracking Experiment
            7-3-3 Landing-Point Prediction with Kalman Filter Experiment
        7-4 Landing-Point Guidance of the Omnidirectional Mobile Robot
        7-5 Projectile-Catching Experiments
    Chapter 8  Conclusions and Future Work
        8-1 Conclusions
        8-2 Future Work
    References
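Chapter 3 of the outline covers color-space models and RGB-to-YUV conversion for color-based target detection. The thesis's exact thresholds are not reproduced here; as an illustrative sketch, the standard BT.601 analog YUV transform separates brightness from chroma, so a target color can be matched on the (U, V) plane alone. The reference chroma `u_ref`, `v_ref` and the tolerance `tol` are assumed parameters for the example.

```python
def rgb_to_yuv(r, g, b):
    """Convert normalized RGB in [0, 1] to analog YUV (BT.601 luma weights)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma
    u = 0.492 * (b - y)                    # blue-difference chroma
    v = 0.877 * (r - y)                    # red-difference chroma
    return y, u, v

def is_target_color(r, g, b, u_ref, v_ref, tol=0.1):
    """Classify a pixel by chroma distance only, so brightness changes
    (shadows, lighting) disturb the decision less than thresholds in RGB."""
    _, u, v = rgb_to_yuv(r, g, b)
    return (u - u_ref) ** 2 + (v - v_ref) ** 2 <= tol ** 2
```

Discarding Y before thresholding is one common way to get the robustness to complex backgrounds that the abstract claims for color-based detection.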

    [1] L. G. Shapiro and G. C. Stockman, Computer Vision, Prentice Hall, Upper Saddle River, NJ, 2001.
    [2] 簡彰億, "DSP-Based Development of a Vision-Guided Omnidirectional Mobile Robot in Complex Backgrounds," Master's thesis, Department of Engineering Science, National Cheng Kung University, July 2010.
    [3] S. Tsugawa, “Vision-Based Vehicles in Japan: Machine Vision Systems and Driving Control Systems,” IEEE Transactions on Industrial Electronics, vol. 41, no. 4, pp. 398-405, 1994.
    [4] M. Bertozzi and A. Broggi, "Vision-Based Vehicle Guidance," Computer, vol. 30, no. 7, pp. 49-55, 1997.
    [5] J. L. Barron, D. J. Fleet, and S. S. Beauchemin, "Performance of Optical Flow Techniques," International Journal of Computer Vision, vol. 12, no. 1, pp. 43-77, 1994.
    [6] B. K. P. Horn and B. G. Schunck, "Determining Optical Flow," Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, MA.
    [7] B. D. Lucas and T. Kanade, "An Iterative Image Registration Technique with an Application to Stereo Vision," Proceedings of the 7th International Joint Conference on Artificial Intelligence, pp. 674-679, 1981.
    [8] K. Watanabe, Y. Shiraishi, S. G. Tzafestas, J. Tang, and T. Fukuda, “Feedback Control of an Omnidirectional Autonomous Platform for Mobile Service Robots,” Journal of Intelligent and Robotic Systems, vol. 22, no. 3, pp. 315-330, 1998.
    [9] R. Balakrishna and A. Ghosal, “Modeling Slip for Wheeled Mobile Robots,” IEEE Transactions on Robotics and Automation, vol. 11, no. 1, pp. 126-132, 1995.
    [10] T. Kalmar-Nagy, R. D’Andrea, and P. Ganguly, “Near-optimal Dynamic Trajectory Generation and Control of an Omnidirectional Vehicle,” Robotics and Autonomous Systems, vol. 46, pp. 47-64, 2004.
    [11] J. Wu, R. L. Williams, and J. Lew, "Velocity and Acceleration Cones for Kinematic and Dynamic Constraints on Omnidirectional Mobile Robots," ASME Journal of Dynamic Systems, Measurement and Control, vol. 128, no. 4, pp. 788-799, 2006.
    [12] B. Hove and J. Slotine, “Experiments in Robotic Catching,” Proceedings of the American Control Conference, pp. 380-385, 1991.
    [13] A. Namiki and M. Ishikawa, "Robotic Catching Using a Direct Mapping from Visual Information to Motor Command," Proceedings of the International Conference on Robotics and Automation, vol. 2, pp. 2400-2405, 2003.
    [14] K. Deguchi, H. Sakurai, and S. Ushida, “A Goal Oriented Just-In-Time Visual Servoing for Ball Catching Robot Arm,” Proceedings of the International Conference on Intelligent Robots and Systems, pp. 3034-3039, 2008.
    [15] B. Bauml, T. Wimbock, and G. Hirzinger, “Kinematically Optimal Catching a Flying Ball with a Hand-Arm-System,” Proceedings of the International Conference on Intelligent Robots and Systems, pp. 2592-2599, 2010.
    [16] B. Bauml, O. Birbach, T. Wimbock, and U. Frese, "Catching Flying Balls with a Mobile Humanoid: System Overview and Design Considerations," Proceedings of the International Conference on Humanoid Robots, pp. 513-520, 2011.
    [17] O. Birbach, U. Frese, and B. Bauml, “Realtime Perception for Catching a Flying Ball with a Mobile Humanoid,” Proceedings of the International Conference on Robotics and Automation, pp. 5955-5962, 2011.
    [18] U. Frese, B. Bauml, S. Haidacher, G. Schreiber, I. Schaefer, M. Hahnle, and G. Hirzinger, "Off-the-Shelf Vision for a Robotic Ball Catcher," Proceedings of the International Conference on Intelligent Robots and Systems, vol. 3, pp. 1623-1629, 2001.
    [19] V. Lippiello and F. Ruggiero, "Monocular Eye-in-Hand Robotic Ball Catching with Parabolic Motion Estimation," Proceedings of the 10th IFAC Symposium on Robot Control, pp. 229-235, 2012.
    [20] J.A. Borgstadt and N. J. Ferrier, “Interception of a Projectile Using a Human Vision-Based Strategy,” Proceedings of the International Conference on Robotics and Automation, vol. 4, pp. 3189-3196, 2000.
    [21] F. Miyazaki and R. Mori, “Realization of Ball Catching Task using a Mobile Robot,” Proceedings of the International Conference on Networking, Sensing & Control, pp. 58-63, 2004.
    [22] T. G. Sugar, M. K. McBeath, A. Suluh, and K. Mundhra, “Mobile robot interception using human navigational principles: Comparison of active versus passive tracking algorithms,” Autonomous Robots, vol. 21, no. 1, pp. 43-54, 2006.
    [23] M. Bratt, C. Smith, and H. I. Christensen, “Minimum Jerk Based Prediction of User Actions for a Ball Catching Task,” Proceedings of the International Conference on Intelligent Robots and Systems, pp. 2710-2716, 2007.
    [24] 曾鏡全, "A Kinect-based Interactive Robot System with 3D Vision," Master's thesis, Department of Electrical Engineering, National Taiwan Ocean University, July 2012.
    [25] C. Y. Lin, Y. P. Chiu, C. Y. Lin, and C. R. Tsai, "Development of a Binocular Vision-based Catcher Robot System using DSP Platform," Journal of the Chinese Institute of Engineers, vol. 37, no. 2, pp. 210-223, 2014.
    [26] J. S. Hu, M. C. Chien, Y. J. Chang, Y. C. Chang, S. H. Su, J. J. Yang, and C. Y. Kai, "A Robotic Ball Catcher with Embedded Visual Servo Processor," Proceedings of the International Conference on Intelligent Robots and Systems, pp. 2513-2514, 2010.
    [27] H. C. Huang and C. C. Tsai, “Simultaneous Tracking and Stabilization of an Omnidirectional Mobile Robot in Polar Coordinates: a Unified Control Approach,” Robotica, vol. 27, no.3, pp. 447-458, 2009.
    [28] H. C. Huang and C. C. Tsai, “FPGA Implementation of an Embedded Robust Adaptive Controller for Autonomous Omnidirectional Mobile Platform,” IEEE Transactions on Industrial Electronics, vol. 56, no. 5, pp. 1604-1616, 2009.
    [29] 楊宗諭, "Design and Implementation of a Color-Based Multi-Camera Tracking Control System," Master's thesis, Department of Engineering Science, National Cheng Kung University, July 2012.
    [30] 林柏瑋, "Design and Implementation of a Vision-Based Ball-Hitting Control System," Master's thesis, Department of Engineering Science, National Cheng Kung University, July 2010.
    [31] Color space, Wikipedia,
    http://en.wikipedia.org/wiki/Color_space
    [32] R. E. Kalman, "A New Approach to Linear Filtering and Prediction Problems," Journal of Basic Engineering, vol. 82, pp. 35-45, 1960.
    [33] Stokes' law, Wikipedia,
    https://en.wikipedia.org/wiki/Stokes%27_law
    [34] 林家民, "Design and Implementation of a Visual-Servoing-Based Object Tracking System," Master's thesis, Department of Engineering Science, National Cheng Kung University, July 2006.
    [35] 1/2.5-Inch 5Mp CMOS Digital Image Sensor MT9P001 datasheet.
    [36] Texas Instruments TMS320DM6437 Digital Media Processor datasheet.
    [37] E. Bogatin, Signal Integrity: Simplified, Prentice Hall, Upper Saddle River, NJ, 2004.
    [38] Texas Instruments TMS320F2812 Digital Signal Processor datasheet.
    [39] A3941 Automotive Full Bridge MOSFET Driver, Allegro Inc., 2008.

    Full text available on campus: 2020-09-02
    Full text available off campus: 2022-01-01