| Graduate Student | 王翊 Wang, Yi |
|---|---|
| Thesis Title | 以影像為基礎之拋體軌跡預測及攔接 (Vision-based Projectile Trajectory Prediction and Projectile Catching) |
| Advisor | 何明字 Ho, Ming-Tzu |
| Degree | Master (碩士) |
| Department | Department of Engineering Science, College of Engineering (工學院 工程科學系) |
| Year of Publication | 2015 |
| Graduation Academic Year | 103 |
| Language | Chinese (中文) |
| Number of Pages | 167 |
| Chinese Keywords | 視覺追蹤系統 (visual tracking system), 全向移動機器人 (omnidirectional mobile robot), 卡門濾波器 (Kalman filter), 落點估測 (landing-point estimation) |
| English Keywords | active stereo vision system, omnidirectional mobile robot, Kalman filter |
This thesis constructs a multi-camera visual tracking system and uses it to guide an omnidirectional mobile robot to catch a projectile. In the visual tracking system, two image sensors emulate human binocular vision and track the projectile based on its color information. The projectile's coordinates are filtered by a Kalman filter to estimate the landing point, and the omnidirectional mobile robot is guided to that position through a wireless transmission module, while a ceiling-mounted camera feeds the vehicle's position back to the robot for control. The overall system comprises the omnidirectional mobile robot, a panoramic pan mechanism, an image-processing module, a digital-signal-processing module, and motor controllers. Because the projectile's color information serves as the basis for target identification, the system can perform catching control against complex backgrounds, and estimating the trajectory with the Kalman filter makes the visual tracking process smoother. Through simulation and experiments, this thesis analyzes and verifies the feasibility of the overall system: the visual tracking system can indeed estimate the landing point, and the omnidirectional mobile robot can catch the projectile.
The main purpose of this thesis is to study a multi-camera visual tracking system that guides an omnidirectional mobile robot to catch a projectile. In the visual tracking system, two image sensors are used to provide stereo vision, and the thrown projectile is tracked according to its color information. After the landing point of the projectile has been estimated by a Kalman filter, it is sent to the omnidirectional mobile robot via a wireless communication module, and the robot moves to that point. Through simulation and experiments, this thesis demonstrates the feasibility of the designed system: the projectile is reliably tracked by the active stereo vision system, and the omnidirectional mobile robot catches the projectile.
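The abstract's pipeline — filter noisy 3-D projectile positions with a Kalman filter, then extrapolate the estimated state to the ground plane to obtain the landing point — can be sketched as below. This is a minimal illustration, not the thesis's actual implementation: it assumes a discrete constant-gravity ballistic model, a flat ground plane at z = 0, and made-up noise covariances; the real system would add air drag (the thesis cites Stokes' law) and camera-specific measurement models.

```python
import numpy as np

DT = 0.01   # assumed camera sample period [s]
G = 9.81    # gravitational acceleration [m/s^2]

# State: [x, y, z, vx, vy, vz]; constant velocity in x, y; gravity acts on z.
F = np.eye(6)
for i in range(3):
    F[i, i + 3] = DT
u = np.array([0, 0, -0.5 * G * DT**2, 0, 0, -G * DT])  # gravity input term
H = np.hstack([np.eye(3), np.zeros((3, 3))])           # positions are observed
Q = 1e-4 * np.eye(6)   # process-noise covariance (illustrative value)
R = 1e-3 * np.eye(3)   # measurement-noise covariance (illustrative value)

def kalman_track(measurements):
    """Filter a sequence of noisy 3-D positions; return the final state estimate."""
    x = np.hstack([measurements[0], np.zeros(3)])       # crude initial state
    P = np.diag([0.01, 0.01, 0.01, 10.0, 10.0, 10.0])   # large velocity uncertainty
    for z in measurements[1:]:
        # Predict with the ballistic model.
        x = F @ x + u
        P = F @ P @ F.T + Q
        # Update with the measured position.
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(6) - K @ H) @ P
    return x

def landing_point(state, ground_z=0.0):
    """Extrapolate the estimated parabola to the ground plane z = ground_z."""
    x0, y0, z0, vx, vy, vz = state
    # Positive root of z0 + vz*t - (G/2)*t^2 = ground_z.
    t = (vz + np.sqrt(vz**2 + 2 * G * (z0 - ground_z))) / G
    return np.array([x0 + vx * t, y0 + vy * t])
```

Because the landing point of a parabola is fixed, extrapolating from any filtered state along the flight yields the same (x, y) target for the robot, so the prediction can be refreshed every frame as the estimate sharpens.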