| Graduate Student: | 王國倫 Wang, Guo-Lun |
|---|---|
| Thesis Title: | 雙眼機械頭與機械手臂之手眼協調控制 (Hand-Eye Coordination of the Robotic Binocular Head and Manipulator) |
| Advisor: | 蔡清元 Tsay, Tsing-Iuan |
| Degree: | Master |
| Department: | College of Engineering - Department of Mechanical Engineering |
| Year of Publication: | 2002 |
| Academic Year of Graduation: | 90 |
| Language: | English |
| Pages: | 108 |
| Keywords (Chinese): | 頭手協調、主動式視覺、視覺伺服、末端閉迴路、離線校正 |
| Keywords (English): | hand-eye coordination, active vision, visual servo, endpoint closed loop, off-line calibration |
Robot manipulators are mainly used to execute fixed, repetitive procedures or dangerous tasks in factories with static environments. In recent years, sensor-based robotic systems have been developed to react appropriately to sudden environmental changes and to adapt quickly to new tasks. An active vision system is an effective sensory system because it mimics human vision, allowing non-contact measurement of the environment using a servomechanism that carries one or two cameras.
This thesis proposes a coordinated control structure and a set of control strategies for an integrated robotic system, composed of a robotic binocular head and a robot manipulator, to recognize a target object and perform a grasping task. The proposed hand-eye coordination control structure is a hybrid image-based/position-based look-and-move structure. Two image-based look-and-move control strategies enable the robotic head to saccade to and fixate on a target object. Three position-based look-and-move control strategies are presented for the manipulator: to drive the end-effector into the field of view so that an endpoint closed-loop (ECL) system is formed, to approach the target object, and to grasp it. Two off-line calibrations are required before operation. One is the calibration of the camera's intrinsic parameters; the other is the proposed head/manipulator calibration, required by the strategy for driving the end-effector into the field of view. Image processing algorithms are also proposed that rapidly detect the locations of the target object and the manipulator's end-effector, reducing the computation time for image processing.
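The image-based look-and-move strategies described above rest on the standard interaction-matrix (image Jacobian) relation between camera motion and image-feature motion. The thesis's own control laws are not reproduced here; the following is only a minimal sketch of one such iteration, assuming a pinhole camera, a single point feature, known depth `z` and focal length `f`, and a simple proportional gain, none of which are taken from the thesis itself.

```python
import numpy as np

def image_jacobian(u, v, z, f):
    """Interaction matrix for a point feature under a pinhole camera,
    relating the camera twist [vx vy vz wx wy wz] to pixel velocity."""
    return np.array([
        [-f / z, 0.0, u / z, u * v / f, -(f + u * u / f), v],
        [0.0, -f / z, v / z, f + v * v / f, -u * v / f, -u],
    ])

def look_and_move_step(feature, desired, z, f, gain=0.5):
    """One image-based look-and-move iteration: compute the image-space
    error and map it to a camera velocity command via the Moore-Penrose
    pseudo-inverse of the image Jacobian."""
    u, v = feature
    err = np.asarray(desired, dtype=float) - np.asarray(feature, dtype=float)
    L = image_jacobian(u, v, z, f)
    # The pseudo-inverse yields the minimum-norm camera twist that
    # drives the feature toward its desired image location.
    twist = gain * np.linalg.pinv(L) @ err
    return twist
```

In a look-and-move scheme the resulting twist is handed to the joint-level position controller between image updates, rather than being applied as a continuous velocity command.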
Finally, three sets of experiments are conducted to verify the theoretical derivations and the performance of the system. Three objects, a cube, a right-triangular prism, and a ball, are placed on a table, and one is selected as the target object in each set of experiments. The robotic head and manipulator coordinate to locate and grasp the target object, which is placed in the same position but at a different orientation in each experiment.