Student: 張維哲 (Chang, Wei-Che)
Thesis Title: 基於視覺伺服之未知物體輪廓循跡控制研究 (Study on Visual Servoing Based Contour Following Control of Objects with Unknown Geometric Models)
Advisor: 鄭銘揚 (Cheng, Ming-Yang)
Degree: Master
Department: 電機資訊學院 電機工程學系 (Department of Electrical Engineering, College of Electrical Engineering and Computer Science)
Year of Publication: 2013
Graduation Academic Year: 101 (ROC calendar, i.e. 2012-2013)
Language: Chinese
Pages: 90
Keywords (Chinese): 視覺伺服控制、循跡控制、深度估測、參數式曲線
Keywords (English): Visual Servoing Control, Contour Following Control, Depth Estimation, Parametric Curve
Abstract:
In recent years, robot manipulators have been widely used in industrial machining processes such as welding, polishing, deburring, and spray painting, and the precision of these applications depends largely on the performance of contour following control. In current contour following applications, however, the reference trajectory is usually obtained by manual drawing or taught in advance with a robot teach pendant, which is time-consuming and of limited accuracy. Moreover, the control performance of the robot manipulator is restricted by the accuracy of prior mechanical calibration. To address these issues, this thesis introduces visual servoing into the contour following problem. A camera mounted above the workpiece captures its image, and the detected object contour is described by a parametric curve that serves as the position command. Based on this parametric curve, a command interpolator is developed to generate the trajectory command in the image plane. The trajectory command is then transformed to the image plane of a camera mounted to the side of the workpiece to implement image-based visual servoing (IBVS), so that the end-point of the robot manipulator accurately follows the object contour. Because the image Jacobian used in IBVS requires accurate end-point depth information, this thesis further proposes a parametric-curve-based depth estimation method that supplies precise depth information and thereby improves visual servoing performance. Finally, the visual loop controller is designed under a dynamic look-and-move structure, and an inner-loop velocity controller for the robot manipulator is also designed, in which parametric-curve-based feedforward compensation is incorporated to reduce the tracking error. Experimental results show that the proposed visual servoing framework achieves satisfactory performance in contour following control of objects with unknown mathematical models.
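To make the flow of the approach concrete, below is a minimal sketch of two of the computations the abstract describes: fitting a parametric curve to the contour detected by the top camera, and computing an IBVS velocity command from the image Jacobian, whose entries depend on the end-point depth. This is an illustrative reconstruction, not code from the thesis: a generic periodic cubic B-spline (SciPy's splprep/splev) stands in for the PH quintic spline used in the thesis, the depth Z is assumed to be supplied externally (the role played by the thesis's parametric-curve-based depth estimation), and all function names are hypothetical.

```python
import numpy as np
from scipy.interpolate import splprep, splev

def contour_to_parametric_curve(points, n_samples=200):
    """Fit a closed parametric spline to contour points (N x 2 pixel
    coordinates from the top camera) and resample it uniformly in the
    curve parameter u in [0, 1).  A periodic cubic B-spline is used here
    as a stand-in for the PH quintic spline employed in the thesis."""
    tck, _ = splprep([points[:, 0], points[:, 1]], s=0.0, per=True)
    u = np.linspace(0.0, 1.0, n_samples, endpoint=False)
    xs, ys = splev(u, tck)
    return np.column_stack([xs, ys])  # trajectory command in the image plane

def interaction_matrix(x, y, Z):
    """Classical image Jacobian (interaction matrix) of a point feature at
    normalized image coordinates (x, y) observed at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0,      x / Z, x * y,       -(1.0 + x * x),  y],
        [0.0,      -1.0 / Z, y / Z, 1.0 + y * y, -x * y,         -x],
    ])

def ibvs_velocity(s, s_des, Z, gain=0.5):
    """Camera velocity screw (vx, vy, vz, wx, wy, wz) driving the tracked
    end-point feature s = (x, y) toward the desired feature s_des taken
    from the interpolated contour trajectory.  Z is the end-point depth;
    in the thesis it comes from the parametric-curve-based depth
    estimation, here it is simply an input."""
    L = interaction_matrix(s[0], s[1], Z)
    error = np.asarray(s, dtype=float) - np.asarray(s_des, dtype=float)
    # The 2x6 Jacobian is handled with the Moore-Penrose pseudoinverse.
    return -gain * np.linalg.pinv(L) @ error
```

In the full scheme, a command such as the output of ibvs_velocity would be produced by the outer (visual) loop of the dynamic look-and-move structure and then mapped into joint velocity commands for the inner-loop velocity controller.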