
Student: Chang, Ching-Jung (張慶榮)
Thesis Title: Pose Control of Mobile Manipulator with an Uncalibrated Eye-in-Hand Vision System (具未經校準眼在手視覺系統之移動式機械手臂之姿態控制)
Advisor: Tsay, Tsing-Iuan (蔡清元)
Degree: Master
Department: Department of Mechanical Engineering, College of Engineering
Publication Year: 2003
Graduation Academic Year: 91 (ROC calendar; 2002-2003)
Language: English
Number of Pages: 75
Keywords (Chinese): Mobile Manipulator, Vision-Guided, Automated Guided Vehicle (AGV), Mobile Platform, Eye-in-Hand
Keywords (English): Mobile Manipulator, Vision-Guided, Uncalibrated, AGV, Mobile Base, Eye-in-Hand
Abstract (in Chinese, translated):
    A mobile manipulator consists of a mobile platform, a robot manipulator, and a vision system, and is well suited to material handling on low-volume, high-variety production lines. Such a flexible transfer system not only saves manpower but also meets the cleanliness requirements of certain tasks; it is therefore widely used for material handling on the production lines of semiconductor fabs and LCD panel plants.
    When a mobile manipulator transfers material between workstations, the manipulator is carried to each station by the navigation control system of the mobile platform. Because the ground is uneven and the platform has positioning errors, position and orientation errors between the platform and the station are unavoidable once the mobile manipulator arrives. This study therefore uses an uncalibrated eye-in-hand vision system to guide the manipulator mounted on the mobile platform to grasp a workpiece placed on a workstation, and proposes a vision-guided control strategy based on selected image features and an image-based look-and-move control structure. The control law used in this structure relies mainly on an off-line, least-squares estimate of the transformation between image-feature space and Cartesian space. For image processing, an algorithm is proposed that rapidly finds the corners and centroid of an object without special lighting (a rough sketch of this computation follows the abstract).
    Finally, the positioning performance of the eye-in-hand manipulator is evaluated experimentally by controlling the end-effector to approach and grasp workpieces placed at various locations on a workstation. The results show that the proposed vision-guided control strategy allows the mobile manipulator to travel directly between any two workstations, without stopping in between to correct the positioning error of the mobile platform, and, after arrival, to perform pick-and-place tasks on uneven ground without special lighting.
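
The thesis does not reproduce its image-processing code in this record; purely as an illustration, the Python sketch below shows one common, moment-based way to obtain the centroid, area, and principal angle of a binary silhouette, which is the kind of computation described in Chapter 3. The function name, the toy input, and the use of second-order moments for orientation are assumptions of this sketch, not the author's implementation.

```python
import numpy as np

def centroid_and_principal_angle(binary):
    """Centroid, area, and principal (orientation) angle of a binary silhouette.

    `binary` is a 2-D array whose nonzero pixels belong to the object.
    The angle comes from the standard second-order central-moment formula.
    (Illustrative sketch only; not the thesis's algorithm.)
    """
    ys, xs = np.nonzero(binary)            # pixel coordinates of the object
    if xs.size == 0:
        raise ValueError("empty silhouette")

    # zeroth/first-order moments give the area and the centre of gravity
    area = xs.size
    cx, cy = xs.mean(), ys.mean()

    # second-order central moments
    mu20 = np.mean((xs - cx) ** 2)
    mu02 = np.mean((ys - cy) ** 2)
    mu11 = np.mean((xs - cx) * (ys - cy))

    # orientation of the principal axis, in radians
    angle = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)
    return (cx, cy), area, angle

if __name__ == "__main__":
    # toy example: an axis-aligned rectangle drawn into a blank image
    img = np.zeros((120, 160), dtype=np.uint8)
    img[40:80, 50:120] = 1
    (cx, cy), area, angle = centroid_and_principal_angle(img)
    print(f"centroid=({cx:.1f}, {cy:.1f}), area={area}, angle={np.degrees(angle):.1f} deg")
```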

Abstract (in English):
    Mobile manipulators, which consist of a mobile base and a robot manipulator equipped with a vision system, are appropriate for transferring small quantities of a range of different materials in production lines. Such a flexible material transfer system not only saves human resources, but also meets the cleanliness requirements associated with some tasks. Consequently, mobile manipulators are extensively used to transport material in semiconductor and LCD panel production lines.
    When transferring material from one station to another, a mobile manipulator is driven to each station by the guidance control system of its mobile base. Position and orientation errors of the mobile base relative to the station are inevitable, caused by the unevenness of the ground and by the positioning errors of the base. Hence, this study utilizes an uncalibrated eye-in-hand vision system to provide visual information for controlling the manipulator mounted on the mobile base so that it can pick up a workpiece located on the station. A vision-guided control strategy is proposed, based on selected image features and an image-based look-and-move control structure. The control law employed in this structure rests mainly on an off-line estimate of the transformation from feature space to Cartesian space, obtained with a least-squares estimation algorithm (an illustrative sketch of this estimation step follows the abstract). Image processing algorithms are also proposed that rapidly detect the corners and centroid of a silhouette captured by the vision system without any special lighting.
    Finally, the positioning performance of the eye-in-hand manipulator is evaluated experimentally by controlling the end-effector of the manipulator to approach and grasp the workpiece at various locations on a station. The experimental results reveal that the proposed vision-guided control strategy ensures that the mobile manipulator can travel from one station to another without in-between stops and can perform pick-and-place operations on non-planar ground without any special lighting.
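
No source code is given in this record; the sketch below is a minimal Python illustration, under an assumed linear model, of how a feature-space-to-Cartesian-space mapping could be estimated off-line by least squares from recorded (feature change, displacement) pairs and then used to generate incremental commands in an image-based look-and-move loop. The function names, the synthetic data, and the scalar gain are hypothetical; the thesis's actual eight-path off-line task and control law are detailed in Chapter 4.

```python
import numpy as np

def estimate_feature_to_cartesian(delta_features, delta_poses):
    """Least-squares estimate of a linear map T with delta_pose ≈ T @ delta_feature.

    delta_features : (N, m) array of recorded image-feature changes
    delta_poses    : (N, n) array of the Cartesian displacements that caused them
    (Illustrative sketch only; not the thesis's estimator.)
    """
    # Solve delta_features @ T.T ≈ delta_poses in the least-squares sense.
    T_transposed, *_ = np.linalg.lstsq(delta_features, delta_poses, rcond=None)
    return T_transposed.T                      # shape (n, m)

def look_and_move_step(T, f_current, f_desired, gain=0.5):
    """One look-and-move iteration: map the remaining feature error to an
    incremental Cartesian command for the end-effector."""
    feature_error = f_desired - f_current
    return gain * (T @ feature_error)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_T = rng.normal(size=(3, 4))           # hypothetical: 4 features -> x, y, yaw

    # synthetic "off-line task": small feature changes and the Cartesian
    # displacements that correspond to them, plus measurement noise
    dF = rng.normal(size=(200, 4))
    dX = dF @ true_T.T + 0.01 * rng.normal(size=(200, 3))

    T = estimate_feature_to_cartesian(dF, dX)
    cmd = look_and_move_step(T, f_current=np.zeros(4), f_desired=np.ones(4))
    print("estimated map error:", np.abs(T - true_T).max())
    print("incremental command:", cmd)
```

A plain least-squares fit is used here only because the abstract names a least-squares estimation algorithm; how the thesis parameterizes the transformation and collects the eight-path data is not reproduced in this record.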

Table of Contents:
Abstract (in Chinese) i
Abstract (in English) ii
Acknowledgements iii
Table of Contents iv
List of Tables vi
List of Figures vii
1 Introduction 1
1.1 Preface 1
1.2 Literature Survey 2
1.3 Motivation and Objective 4
1.4 Organization of Thesis 4
2 Background 5
2.1 Mobile Base 5
2.1.1 Mechanical Structure 5
2.1.2 Sensory Devices 6
2.1.3 Guidance Control System 7
2.2 Robot Manipulator 9
2.3 Vision System 9
2.4 Hardware Architecture of the Constructed Mobile Manipulator 10
3 Image Processing 18
3.1 Preprocessing Images 18
3.2 Identifying the Center of Gravity and the Area of the Quadrangle 19
3.3 Finding the Principal Angle of the Quadrangle 19
3.4 Determining the Corners of the Quadrangle 20
3.4.1 Edge Detection 20
3.4.2 Least-Squares Line Fitting 20
3.4.3 Determination of the Corners of the Quadrangle 22
4 Relative End-effector Control using Image-Based Look-and-Move Structure 26
4.1 Transformation from Feature Space to Cartesian Space 26
4.2 Selection of Image Features 27
4.3 Control Strategy for Approaching the Target 29
4.3.1 Trajectory Planning in Feature Space 29
4.3.2 Eight-Path Off-line Task Required by the LSE Algorithm to Approximate the Function 31
4.3.3 Control Strategy 34
5 Experimentation 42
5.1 Experimental Setup 42
5.2 Eight-Path Off-Line Task 43
5.3 Positioning Performance of the Eye-in-Hand Manipulator 46
5.4 Discussion 50
6 Conclusion 69
6.1 Summary 69
6.2 Future Improvements 70
Bibliography 72
Autobiography 75


Full text availability: on campus from 2009-08-25; off campus from 2009-08-25.