
Author: Chen, Kung-Yi (陳寬益)
Thesis Title: Design and Implementation of a Real-Time Visual Servo Tracking System (即時視覺伺服追蹤系統之設計與實現)
Advisor: Tsai, Mi-Ching (蔡明祺)
Degree: Master
Department: Department of Mechanical Engineering, College of Engineering
Year of Publication: 2002
Graduation Academic Year: 90 (ROC calendar; 2001-2002)
Language: Chinese
Number of Pages: 67
Keywords (Chinese): 即時控制, 背景補償, 視覺伺服, 影像追蹤
Keywords (English): background compensation, real-time control, image tracking, visual servo
    Among optical-flow-based motion estimation techniques, the motion energy method is a widely used tracking approach; however, it is not directly applicable to an active camera. Although this problem can be handled with background compensation, that remedy has a drawback: when the target moves too slowly it cannot be detected, so the tracking task cannot be completed effectively. This thesis therefore incorporates the three-step search method used in video compression: the motion energy method first performs a coarse search in the compressed image, and the three-step search then performs a fine search near the corresponding location in the original image to obtain the precise target position, as sketched below. For the servo part, the thesis examines how the selected motor servo drive mode affects the visual feedback controller, tests the steady-state error of the visual servo system with the standard test signals for steady-state response, namely step and ramp command inputs, and adopts a feed-forward compensation structure to eliminate the steady-state error for the second-order (ramp) command input, thereby achieving better tracking performance.
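    To make the coarse-to-fine idea concrete, the following is a minimal Python/NumPy sketch of a three-step block-matching search such as the one used for the fine stage. It is an illustration under assumed conditions only (grayscale frames, a 16x16 block, SAD as the matching cost, an initial step of 4); the names three_step_search and sad and the placeholder frames are hypothetical, not taken from the thesis. The coarse location returned by the motion energy method on the compressed image would be mapped back to original-image coordinates and passed in as center.

        import numpy as np

        def sad(block_a, block_b):
            # Sum of absolute differences; cast to a signed type to avoid uint8 wrap-around.
            return np.abs(block_a.astype(np.int32) - block_b.astype(np.int32)).sum()

        def three_step_search(ref_frame, target_block, center, block_size=16, step=4):
            # Classic three-step search: test the 3x3 neighbourhood of the current centre
            # at the current step size, move the centre to the best match, halve the step,
            # and stop once the step of 1 has been used.  'center' is the (row, col) of the
            # coarse estimate (top-left corner of the candidate block) in the reference frame.
            h, w = ref_frame.shape
            best = tuple(center)
            while step >= 1:
                candidates = [(best[0] + dy * step, best[1] + dx * step)
                              for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
                scored = []
                for (r, c) in candidates:
                    if 0 <= r <= h - block_size and 0 <= c <= w - block_size:
                        ref_block = ref_frame[r:r + block_size, c:c + block_size]
                        scored.append((sad(ref_block, target_block), (r, c)))
                best = min(scored)[1]      # keep the lowest-cost candidate
                step //= 2
            return best

        # Hypothetical usage: refine a coarse location scaled back from the reduced image.
        frame_prev = np.random.randint(0, 256, (240, 320), dtype=np.uint8)  # placeholder frames
        frame_curr = np.random.randint(0, 256, (240, 320), dtype=np.uint8)
        coarse = (96, 128)                    # coarse estimate in original-image coordinates
        block = frame_curr[100:116, 130:146]  # 16x16 block containing the detected target
        print(three_step_search(frame_prev, block, coarse))

    With steps of 4, 2, and 1 the search covers a window of roughly plus/minus 7 pixels around the coarse estimate while evaluating at most 25 candidate blocks, which is what makes this refinement cheap enough for real-time use.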

    In this study, a prototype real-time visual servo tracking system is developed. It consists of two parts: an image-processing unit and a servo control unit. For the image-processing unit, the background compensation technique is used to align images taken at different positions so that the static motion energy method can be applied. However, inaccurate image mapping due to noise is inevitable. Thus, to desensitize the noise effect, a morphological filter is employed in this work. Nevertheless, slowly moving targets may not be detected, because the morphological filter removes any insignificant moving portion of the image. To overcome this difficulty, an approach combining the motion energy method with the 3-step hierarchical search method, a fast motion estimation algorithm used in video coding, is proposed in this study. Experimental results show that the proposed approach yields good motion tracking. For the servo control unit, it is found that the servo drive mode affects the type of the visual servo system; hence the visual controller should be designed properly so that the steady-state tracking error satisfies the performance criterion. In addition, to eliminate the tracking error corresponding to a ramp input, feed-forward compensation is incorporated into the control structure. Experimental results indicate that the feed-forward compensation can indeed reduce the tracking error.
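    The role of the feed-forward term can be illustrated with a hedged, textbook-style calculation rather than the thesis's specific block diagram: C(s), G(s), F(s), and K_v below are generic symbols for the visual controller, the camera-plus-drive plant, the reference feed-forward injected at the plant input, and the velocity constant of a type-1 loop.

        % Hedged sketch; symbols are generic and assumed, not the thesis's notation.
        \documentclass{article}
        \usepackage{amsmath}
        \begin{document}
        With $U(s)=C(s)E(s)+F(s)R(s)$, $Y(s)=G(s)U(s)$, and $E(s)=R(s)-Y(s)$,
        \[
          E(s)=\frac{1-G(s)F(s)}{1+G(s)C(s)}\,R(s).
        \]
        For a ramp reference $R(s)=v/s^{2}$ and a stable type-1 loop with velocity constant
        $K_v=\lim_{s\to 0}sG(s)C(s)$, the final-value theorem gives
        \[
          e_{ss}=\lim_{s\to 0}sE(s)
                =\lim_{s\to 0}\frac{\bigl(1-G(s)F(s)\bigr)v}{s\bigl(1+G(s)C(s)\bigr)}
                =\frac{\bigl(1-\lim_{s\to 0}G(s)F(s)\bigr)v}{K_v},
        \]
        so $e_{ss}=v/K_v$ without feed-forward ($F=0$), and $e_{ss}=0$ whenever
        $G(s)F(s)\to 1$ as $s\to 0$, e.g.\ velocity feed-forward for a plant containing a free integrator.
        \end{document}

    This is only a sketch of the standard mechanism: the feed-forward path cancels the reference-following burden at low frequency, so the feedback loop need only reject the residual error, which is consistent with the reduced ramp-tracking error reported above.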

    Chinese Abstract; English Abstract; Acknowledgments; Table of Contents; List of Tables; List of Figures
    Chapter 1  Introduction: 1.1 Motivation and Objectives; 1.2 Literature Review; 1.3 Thesis Organization
    Chapter 2  Motion Energy Method with Background Compensation: 2.1 Camera Model; 2.2 Motion Energy Method; 2.3 Background Compensation; 2.4 Morphological Filter
    Chapter 3  Improved Motion Energy Method: 3.1 Error Analysis of Background Compensation; 3.2 Tracking Speed Limitation; 3.3 Block Matching Method; 3.4 Improved Motion Energy Method Combined with the SSD Method
    Chapter 4  Visual Servo Control Structure: 4.1 Modeling of the Visual Servo System; 4.2 Servo Control Structures; 4.3 Feed-Forward Compensation Control
    Chapter 5  Experimental Equipment and System Architecture: 5.1 Overview of Hardware and Software; 5.2 Motor System Identification; 5.3 Camera Calibration
    Chapter 6  Conclusions and Suggestions
    References

    Full text publicly available on campus: 2003-07-31; off campus: 2003-07-31.