| Graduate Student: | 賴昌宏 Lai, Chang-Hong |
|---|---|
| Thesis Title: | 類人形機器人之設計與控制 Design and Control of an Anthropomorphic Robot |
| Advisor: | 蔡清元 Tsay, Tsing-Iuan |
| Degree: | Master |
| Department: | College of Engineering - Department of Mechanical Engineering |
| Year of Publication: | 2003 |
| Graduation Academic Year: | 91 (ROC calendar) |
| Language: | Chinese |
| Number of Pages: | 104 |
| Chinese Keywords: | 看而後動控制架構、輪式移動平台、雙眼機械頭、類人形機器人、人形機器人 |
| English Keywords: | look-and-move control structure, robotic binocular head, wheeled mobile base, humanoid robot, anthropomorphic robot |
Chinese Abstract (translated)

Over the past decade, advances in electronics and control technology have enabled robots to be designed not only for hazardous or automated tasks but also for friendly interaction with humans. The role of robots has thus gradually shifted from factory automation toward human interaction; for example, robots that assist people in everyday environments such as homes, offices, and hospitals are highly anticipated. Humanoid robots in particular, owing to their human-like appearance, approachable design, and intelligence suited to human living environments, have become widely anticipated, and various humanoid robots have been developed in recent years to meet consumer demand.
This study constructs an anthropomorphic robot as preliminary research in the field of humanoid robots. The robot consists mainly of a wheeled mobile base, a fixed torso mounted on the base, a 7-D.O.F. robot arm, a 7-D.O.F. robot hand, and a 5-D.O.F. binocular robotic head. A coordinated, visually guided control structure is proposed for the robot, divided into two main parts: (1) an image-based look-and-move control structure, which integrates two control strategies for controlling the saccade and fixation of the binocular head, and (2) a position-based look-and-move control structure, which, with the aid of the binocular head, integrates two control strategies for driving the robot arm to grasp a target object. The proposed image processing algorithms increase the perceptual capability of rapidly detecting a target object in a natural environment, thereby reducing the computational burden of image processing.
Finally, three sets of experiments verify the theoretical derivations and the performance of the anthropomorphic robot. In the experiments, three objects (a rectangular parallelepiped, a cup with a handle, and a bottle-shaped container with a square bottom) are placed on a table, and one of them is selected as the target object in each set. The results show that the robotic head and arm of the anthropomorphic robot can coordinate with each other to locate the target object and grasp it successfully.
Abstract
With advances in electronics and control technology, robots have, in the past decade, been designed not only to perform hazardous or automated tasks, but also to interact with humans in a friendly manner. The function of robotic systems has thus increasingly shifted from industrial automation to human interaction. For instance, robots that assist humans in daily environments such as offices, homes, and hospitals are highly desired. In particular, humanoid robots are widely anticipated owing to factors such as their anthropomorphic form, friendly design, and intelligence within human living environments. To fulfill these consumer demands, several humanoid robots have been developed in recent years.
This work constructs an anthropomorphic robot as preliminary research on humanoid robots. The anthropomorphic robot developed herein comprises mainly a wheeled mobile base, a fixed torso mounted on the mobile base, a 7-D.O.F. robot arm, a 7-D.O.F. robot hand, and a 5-D.O.F. robotic binocular head. A coordinated, visually guided control structure is then proposed for this anthropomorphic robot. The control structure can be divided into two major parts: (a) an image-based look-and-move control structure that integrates two control strategies used to control the saccade and fixation of the robotic binocular head, and (b) a position-based look-and-move control structure that, aided by the coordinated robotic head, integrates two control strategies employed to drive the robot arm to retrieve a target object. Image processing algorithms are also proposed to increase the perceptual capability of rapidly detecting the location of a target object in a natural environment, thus reducing the computational burden of image processing.
Finally, three sets of experiments are conducted to verify the theoretical derivations and the performance of the anthropomorphic robot. Three objects, i.e., a rectangular parallelepiped, a cup with a handle, and a bottle-shaped container with a square bottom, are placed on a table, with one selected as the target object in each set of experiments. Experimental results indicate that the robotic head and the robot arm of the anthropomorphic robot can coordinate with each other to locate and grasp the target object successfully.
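The look-and-move structure described above separates "look" (the binocular head estimates where the target is) from "move" (the arm takes a step toward that estimate, then the cycle repeats). A minimal sketch of that idea follows; all function names, gains, and the simplified kinematics are illustrative assumptions, not the thesis's actual controllers:

```python
import numpy as np

def fixate(gaze_angles, pixel_error, gain=0.2):
    """Image-based step: adjust the head's pan/tilt angles so the
    target's offset from the image centre (pixel_error) shrinks."""
    return np.asarray(gaze_angles, dtype=float) + gain * np.asarray(pixel_error, dtype=float)

def move_arm(ee_pos, target_pos, gain=0.5):
    """Position-based step: move the end-effector a fraction of the
    Cartesian error reported by the (already fixated) binocular head."""
    return np.asarray(ee_pos, dtype=float) + gain * (
        np.asarray(target_pos, dtype=float) - np.asarray(ee_pos, dtype=float))

def look_and_move(target_pos, ee_pos=(0.0, 0.0, 0.0), tol=1e-3, max_steps=100):
    """Outer loop: look (re-estimate the target pose), then move,
    and repeat until the end-effector is within tolerance."""
    pos = np.asarray(ee_pos, dtype=float)
    target = np.asarray(target_pos, dtype=float)
    for _ in range(max_steps):
        if np.linalg.norm(target - pos) < tol:  # "look": remaining error
            break
        pos = move_arm(pos, target)             # "move": proportional step
    return pos
```

Because each cycle closes the loop with a fresh pose estimate rather than commanding the whole trajectory open-loop, errors in any single estimate are corrected on later cycles, which is the practical appeal of the look-and-move approach over one-shot positioning.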