| Graduate Student: | 張庭育 Chang, Ting-Yu |
|---|---|
| Thesis Title: | 虛擬視覺伺服估測器及動態視覺伺服架構之研究 / Study on Virtual Visual Servoing Estimator and Dynamic Visual Servoing Scheme |
| Advisor: | 鄭銘揚 Cheng, Ming-Yang |
| Degree: | Master |
| Department: | College of Electrical Engineering and Computer Science - Department of Electrical Engineering |
| Year of Publication: | 2018 |
| Academic Year of Graduation: | 106 |
| Language: | Chinese |
| Number of Pages: | 108 |
| Chinese Keywords: | 基於影像之視覺伺服、關節型機械手臂、計算力矩控制、動態視覺伺服、虛擬視覺伺服、機器人教導 |
| English Keywords: | Image-based Visual Servoing, Industrial Manipulators, Computed Torque Control, Dynamic Visual Servoing, Virtual Visual Servoing, Teaching of Robot Arm |
Visual servoing offers advantages in a wide range of applications, but image noise, camera calibration errors, and hand-eye calibration errors can all degrade its control performance. Moreover, the sampling rate of the vision loop is generally much lower than that of the servo control loop. How to improve the efficiency, robustness, and response speed of a visual servoing scheme without increasing hardware cost is therefore an important research topic. However, most existing studies focus on obtaining a better-conditioned interaction matrix through different feature selection methods, or on achieving more efficient control through offline or online trajectory planning; only a few take the robot system dynamics into account and compensate for them to improve visual servoing performance. In view of this, this thesis investigates and improves the dynamic visual servoing scheme, its estimator, and its controller. The experimental platforms are a six-axis and a two-axis robot manipulator.
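For context, the interaction matrix mentioned above is the Jacobian relating image-feature velocities to camera velocity in classical image-based visual servoing (IBVS). A minimal sketch of the standard IBVS control law, following the tutorials cited below as [6] and [10] (not necessarily the exact formulation adopted in this thesis), is:

```latex
% Classical IBVS control law as given in the cited tutorials [6], [10].
% s(t): measured image features, s*: desired features,
% L_s: interaction matrix, v_c: commanded camera velocity, lambda > 0: gain.
\begin{align}
  \mathbf{e}(t) &= \mathbf{s}(t) - \mathbf{s}^{*}, \\
  \dot{\mathbf{s}} &= \mathbf{L}_{\mathbf{s}}\,\mathbf{v}_{c}, \\
  \mathbf{v}_{c} &= -\lambda\,\widehat{\mathbf{L}}_{\mathbf{s}}^{+}\,\mathbf{e}(t),
\end{align}
```

where $\widehat{\mathbf{L}}_{\mathbf{s}}^{+}$ is the Moore-Penrose pseudo-inverse of an estimate of the interaction matrix and $\lambda > 0$ is a control gain. Errors in this estimate (from image noise and calibration errors) and the low update rate of $\mathbf{e}(t)$ relative to the joint servo loop are exactly the issues identified in the abstract, which motivates the estimator and dynamic compensation studied in the thesis.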
[1] Wikipedia. Industry 4.0 - Wikipedia. Retrieved May. 5, 2018, from https://en.wikipedia.org/wiki/Industry_4.0
[2] 張維哲, Study on Contour Following Control of Unknown Objects Based on Visual Servoing, Master's thesis, Department of Electrical Engineering, National Cheng Kung University, Taiwan, 2013.
[3] 莊閔皓, Study on System Identification and Compliance Control of a Six-Axis Industrial Robot Manipulator, Master's thesis, Department of Electrical Engineering, National Cheng Kung University, Taiwan, 2016.
[4] C. Cai, N. Somani, and A. Knoll, “Orthogonal image features for visual servoing of a 6-DOF manipulator with uncalibrated stereo cameras,” IEEE Transactions on Robotics, vol. 32, pp.452-461, Apr. 2016.
[5] J. Hill and W. T. Park, “Real time control of a robot with a mobile camera,” in Proc. of the 9th ISIR, Washington, U.S.A., Mar. 1979, pp. 233-246.
[6] S. Hutchinson, G. D. Hager, and P. I. Corke, “A tutorial on visual servo control,” IEEE Transactions on Robotics and Automation, vol. 12, pp. 651-670, Oct. 1996.
[7] A. C. Sanderson and L. E. Weiss, “Adaptive visual servo control of robots,” in Robot vision, Berlin, Heidelberg, Springer, 1983.
[8] W. J. Wilson, C. C. Williams Hulls, and G. S. Bell, “Relative end-effector control using Cartesian position based visual servoing,” IEEE Transactions on Robotics and Automation, vol.12, pp.684-696, Oct. 1996.
[9] F. Chaumette, “Potential problems of stability and convergence in image-based and position-based visual servoing,” Lecture Notes in Control and Information Sciences, Springer, Germany, vol. 237, pp. 66-78, 1998.
[10] F. Chaumette and S. Hutchinson, “Visual servo control, Part I: Basic approaches,” IEEE Robotics & Automation Magazine, vol.13, pp. 82-90, Dec. 2006.
[11] E. Malis, F. Chaumette, and S. Boudet, “2½D visual servoing,” IEEE Transactions on Robotics and Automation, vol. 15, pp. 238-250, Apr. 1999.
[12] F. Chaumette and E. Malis, “2½D visual servoing: a possible solution to improve image-based and position-based visual servoings,” in Proc. of the IEEE Conf. on Robot. Autom., Apr. 2000, pp. 630-635.
[13] E. Malis and F. Chaumette, “2½D visual servoing with respect to unknown objects through a new estimation scheme of camera displacement,” International Journal of Computer Vision, vol. 37, pp.79-97, Jun 2000.
[14] P. I. Corke and S. A. Hutchinson, “A new hybrid image-based visual servo control scheme,” in Proc. IEEE Conf. Decision and Control, Sydney, NSW, Dec. 2000, pp. 2521-2526.
[15] N. Gans and S. Hutchinson, “An asymptotically stable switched system visual controller for eye in hand robots,” in Proc. IEEE/RSJ Int. Conf. Intelligent Robots Systems, Las Vegas, NV, Oct. 2003, pp. 735-742.
[16] F. Chaumette and S. Hutchinson, “Visual servo control, Part II: Advanced approaches,” IEEE Robotics & Automation Magazine, vol.14, pp.109-118, Mar. 2007.
[17] P. I. Corke and M. C. Good, “Dynamic effects in visual closed-loop systems,” IEEE Transactions on Robotics and Automation, vol. 12, pp.671-683. Oct. 1996.
[18] P. I. Corke, “Dynamic issues in robot visual-servo systems,” in Proc. of the Symp. Robotics Research, Herrsching, Germany, 1995, pp.488-498.
[19] O. Tahri and F. Chaumette, “Point-based and region-based image moments for visual servoing of planar objects,” IEEE Transactions on Robotics, vol.21, pp.1116-1127. Dec. 2005.
[20] D. Xu, J. Lu, P. Wang, Z. Zhang, and Z. Liang, “Partially decoupled image-based visual servoing using different sensitive features,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol.47, pp.2233-2243. Jan. 2017.
[21] R. Dahmouche, N. Andreff, Y. Mezouar, O. Ait-Aider, and P. Martinet, “Dynamic visual servoing from sequential regions of interest acquisition,” The International Journal of Robotics Research, vol.31, pp.520-537. Feb. 2012.
[22] M. Keshmiri, W. F. Xie, and A. Mohebbi, “Augmented image-based visual servoing of a manipulator using acceleration command,” IEEE Transactions on Industrial Electronics, vol.61, pp.5444-5452. Oct. 2014.
[23] F. Chaumette, S. Boukir, P. Bouthemy, and D. Juvin, “Structure from controlled motion,” IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 18, no. 5, pp. 492–504, 1996.
[24] A. De Luca, G. Oriolo, and P. R. Giordano, “Feature depth observation for image-based visual servoing: Theory and experiments,” Int. J. Robot. Res., vol. 27, no. 10, pp. 1093–1116, 2008.
[25] G. Chesi and Y. S. Hung, “Global path-planning for constrained and optimal visual servoing,” IEEE Transactions on Robotics, vol.23, pp.1050-1060. Oct. 2007.
[26] M. Keshmiri and W. F. Xie, “Image-based visual servoing using an optimized trajectory planning technique,” IEEE/ASME Transactions on Mechatronics, vol.22, pp.359-370. Aug. 2016.
[27] J. Armstrong Piepmeier, G.V. McMurray, and H. Lipkin, “A dynamic quasi-Newton method for uncalibrated visual servoing,” in Proc. of the 1999 IEEE International Conference on Robotics and Automation, Detroit, Michigan, May. 1999.
[28] J.A. Piepmeier, G.V. McMurray, and H. Lipkin, “Uncalibrated dynamic visual servoing,” IEEE Transactions on Robotics and Automation, vol.20, pp.143-147. Feb. 2004.
[29] Y. H. Liu, H. Wang, C. Wang, and K. K. Lam, “Uncalibrated visual servoing of robots using a depth-independent interaction matrix,” IEEE Transactions on Robotics, vol.22, pp.804-817. Aug. 2006.
[30] P. J. Sequeira Goncalves, L. F. Mendonca, J. M. C. Sousa, and J. R. Caldas Pinto, “Uncalibrated eye-to-hand visual servoing using inverse fuzzy models,” IEEE Transactions on Fuzzy Systems, vol.16, pp.341-353. Apr. 2008.
[31] I. Siradjuddin, L. Behera, T. Martin McGinnity, and S. Coleman, “Image-based visual servoing of a 7-DOF robot manipulator using an adaptive distributed fuzzy PD controller,” IEEE/ASME Transactions on Mechatronics, vol.19, pp.512-523. Apr. 2014.
[32] P. Jiang, Leon C. A. Bamforth, Z. Feng, John E. F. Baruch, and Y. Q. Chen, “Indirect iterative learning control for a discrete visual servo without a camera-robot model,” IEEE Transactions on Systems, Man, and Cybernetics - Part B: Cybernetics, vol.37, pp.863-876. Aug. 2007.
[33] T. Yüksel, “Intelligent visual servoing with extreme learning machine and fuzzy logic,” Expert Systems With Applications, vol.72, pp.344-356. Apr. 2017.
[34] G. B. Huang, Q. Y. Zhu, and C. K. Siew, “Extreme learning machine: Theory and applications,” Neurocomputing, vol.70, pp.489-501. Dec. 2006.
[35] P. Jiang and R. Unbehauen, “Robot visual servoing with iterative learning control,” IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, vol.32, pp.281-287. Mar. 2002.
[36] Y. Wang, H. Lang, and C. W. de Silva, “A hybrid visual servo controller for robust grasping by wheeled mobile robots,” IEEE/ASME Transactions on Mechatronics, vol.15, pp.757-769. Oct.2010.
[37] 賴彥均, Study on Image-Based Visual Servoing Applying Q-learning to a Mobile Vehicle System Equipped with a Robot Manipulator, Master's thesis, Department of Electrical Engineering, Taiwan, 2017.
[38] H. Shi, X. Li, K. S. Hwang, W. Pan, and G. Xu, “Decoupled visual servoing with fuzzy Q-learning,” IEEE Transactions on Industrial Informatics, vol.14, pp.241-252. Jan. 2018.
[39] Wikipedia. Reinforcement learning - Wikipedia. Retrieved May. 5, 2018, from https://en.wikipedia.org/wiki/Reinforcement_learning
[40] K. Watanabe, Y. Iwatani, and K. Hashimoto, “Image-based visual PID control of a micro helicopter using a stationary camera,” Intelligent & Robotic Systems, vol. 78, no.2-3, pp. 381-393, Jan. 2008.
[41] T. Yuksel, “IBVS with fuzzy sliding mode for robot manipulators,” in Proc. of the Int. Workshop Recent Advances in Sliding Modes, Istanbul, 2015, pp.1-6.
[42] 李哲良, Study on Image-Based Visual Servoing Applied to Tracking Control, Master's thesis, Department of Electrical Engineering, National Cheng Kung University, Taiwan, 2016.
[43] F. Janabi-Sharifi and M. Marey, “A Kalman-filter-based method for pose estimation in visual servoing,” IEEE Transactions on Robotics, vol.26, pp.939-947, Oct. 2010.
[44] C. Wang, C. Y. Lin, and M. Tomizuka, “Statistical learning algorithms to compensate slow visual feedback for industrial robots,” Journal of Dynamic Systems, Measurement, and Control, vol.137, Mar. 2015.
[45] M. Marshall and H. Lipkin, “Adaptive Kalman filter control law for visual servoing,” in Proc. of the International Conference on Collaboration Technologies and Systems (CTS), Orlando, FL, USA, 2016.
[46] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, 2nd ed., New York, NY : Cambridge Univ. Press, 2004.
[47] Wikipedia. Pinhole camera - Wikipedia. Retrieved May. 5, 2018, from https://en.wikipedia.org/wiki/Pinhole_camera
[48] 林潔君, Study on Vision-Based Object Grasping for an Industrial Robot Manipulator, Master's thesis, Department of Electrical Engineering, National Cheng Kung University, Taiwan, 2015.
[49] R. Tsai, “A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses,” IEEE Journal on Robotics and Automation, vol.3, pp.323-344, Aug. 1987.
[50] Z. Zhang, “A flexible new technique for camera calibration,” IEEE Transactions on pattern analysis and machine intelligence, vol.22, pp.1330-1334, Nov. 2000.
[51] Open Source Computer Vision. OpenCV: Camera Calibration. Retrieved May. 5, 2018, from https://docs.opencv.org/3.1.0/dc/dbb/tutorial_py_calibration.html
[52] Wikipedia. Camera resectioning - Wikipedia. Retrieved May. 5, 2018, from https://en.wikipedia.org/wiki/Camera_resectioning
[53] 蔡弘晉, Homography-Based 3D Model Reconstruction Applied to a Six-Axis Articulated Robot Manipulator, Master's thesis, Department of Electrical Engineering, National Cheng Kung University, Taiwan, 2014.
[54] K. S. Fu, R. C. Gonzalez, and C. S. G. Lee, “Robot arm kinematics,” in Robotics: Control, Sensing, Vision and Intelligence. New York, McGraw-Hill, 1987.
[55] M. W. Spong, S. Hutchinson, and M. Vidyasagar, “Rigid motions and homogeneous transformations,” in Robot Modeling and Control. Hoboken, John Wiley & Sons, 2006.
[56] Wikipedia. Skew-symmetric matrix - Wikipedia. Retrieved May. 5, 2018, from https://en.wikipedia.org/wiki/Skew-symmetric_matrix
[57] S. Remy et al., “Hand-eye calibration,” in Proc. of the IEEE/RSJ Int. Conf. Intelligent Robots and Syst., Grenoble, 1997, pp.1057-1065.
[58] Y. Motai and A. Kosaka, “Hand–eye calibration applied to viewpoint selection for robotic vision,” IEEE Trans. Ind. Electro., vol. 55, pp. 3731-3741, Mar. 2008.
[59] 江宗錡, Study on Hand-Eye Calibration of a Six-Axis Articulated Robot Manipulator, Master's thesis, Department of Electrical Engineering, National Cheng Kung University, Taiwan, 2014.
[60] C. Cai, N. Somani, S. Nair, D. Mendoza, and A. Knoll, “Uncalibrated stereo visual servoing for manipulators using virtual impedance control,” in Proc. of the 13th International Conference on Control Automation Robotics & Vision, Singapore, Dec. 2014, pp. 1888-1893.
[61] Wikipedia. Singular-value decomposition - Wikipedia. Retrieved May. 5, 2018, from https://en.wikipedia.org/wiki/Singular-value_decomposition
[62] The Open Academy. Introduction to Robotics. Retrieved May. 5, 2018, from https://theopenacademy.com/content/introduction-robotics
[63] J. J. Craig, Introduction to Robotics: Mechanics and Control, 3rd ed, Boston, Addison-Wesley, 1989.
[64] L. Sciavicco and B. Siciliano, Modeling and Control of Robot Manipulators, London, Springer Verlag, 1996.
[65] 吳如峰, Study on an Image-Based Visual Servoing Scheme for a Six-Axis Industrial Robot Manipulator, Master's thesis, Department of Electrical Engineering, National Cheng Kung University, Taiwan, 2015.
[66] 陳昭仁, Observer-Based Impedance Control and Passive Velocity Control Applied to an Arm Exercise/Rehabilitation Device, Master's thesis, Department of Electrical Engineering, National Cheng Kung University, Taiwan, 2013.
[67] 胡智皓, Study on External Force Estimation and Compliance Control of a Selective Compliance Assembly Robot Arm (SCARA), Master's thesis, Department of Electrical Engineering, National Cheng Kung University, Taiwan, 2016.
[68] A. Muis and K. Ohnishi, “Eye-to-hand approach on eye-in-hand configuration within real-time visual servoing,” IEEE/ASME Trans. Mechatron.,vol. 10, pp. 404-410, Sep. 2005.
[69] Wikipedia. Axis–angle representation - Wikipedia. Retrieved May. 5, 2018, from https://en.wikipedia.org/wiki/Axis-angle_representation
[70] Wikipedia. Euler angles - Wikipedia. Retrieved May. 5, 2018, from https://en.wikipedia.org/wiki/Euler_angles
[71] M. W. Spong, S. Hutchinson, and M. Vidyasagar, Robot Dynamics and Control, 2nd ed. New York, USA : John Wiley & Sons, 2004.
[72] R. E. Kalman, “A new approach to linear filtering and prediction problems,” Journal of basic Engineering, vol.82, pp.35-45, 1960.
[73] É. Marchand and F. Chaumette, “Virtual Visual Servoing: a framework for real-time augmented reality,” Computer Graphics Forum, vol.21, pp.289-297, 2002.
[74] Wikipedia. Augmented reality - Wikipedia. Retrieved May. 5, 2018, from https://en.wikipedia.org/wiki/Augmented_reality
[75] Wikipedia. Rodrigues' rotation formula - Wikipedia. Retrieved May. 5, 2018, from https://en.wikipedia.org/wiki/Rodrigues%27_rotation_formula
[76] J. L. Meriam and L. G. Kraige, Engineering mechanics : dynamics, 7th ed. John Wiley & Sons, 2012.
[77] A. Almagbile, J. Wang, and W. Ding, “Evaluating the performances of adaptive Kalman filter methods in GPS/INS integration,” Journal of Global Positioning Systems, vol.9, pp.33-40, 2010.
[78] H. Wang, Z. Deng, B. Feng, H. Maan, and Y. Xia, “An adaptive Kalman filter estimating process noise covariance,” Neurocomputing, vol.223, pp.12-17, Feb. 2017.
[79] C. H. An, C. G. Atkeson, J. Griffiths, and J. M. Hollerbach, “Experimental evaluation of feedforward and computed torque control,” IEEE Transactions on Robotics and Automation, vol.5, pp. 368–373, Jun. 1989.
[80] R.H. Brown, S.C. Schneider, and M.G. Mulligan, “Analysis of algorithms for velocity estimation from discrete position versus time data,” IEEE Transactions on Industrial Electronics, vol.39, pp.11-19, Feb. 1992.
[81] C. G. Atkeson, C. H. An, and J. M. Hollerbach, “Estimation of inertial parameters of manipulator loads and links,” International Journal of Robotics Research, vol. 5, pp.101-119, Sep. 1986.
[82] J. Swevers, W. Verdonck, and J. D. Schutter, “Dynamic Model Identification for Industrial Robots,” IEEE Control Systems, vol. 27, pp. 58-71, Oct. 2007.
[83] R. T. Farouki, B. K. Kuspa, C. Manni, and A. Sestini, “Efficient solution of the complex quadratic tridiagonal system for PH quintic splines,” Numer. Algorithms, vol. 90, pp. 35-60, 2001.
[84] 謝尚勳, Study on a Vision-Based PH Spline Planar Motion Trajectory Interpolator, Master's thesis, Department of Electrical Engineering, National Cheng Kung University, Taiwan, 2010.
[85] S. Garrido-Jurado, R. Muñoz-Salinas, F. J. Madrid-Cuevas and M. J. Marín-Jiménez, “Automatic generation and detection of highly reliable fiducial markers under occlusion,” Pattern Recognition, vol.47, pp.2280-2292, Jun. 2014.
[86] Open Source Computer Vision. OpenCV: Detection of ArUco Markers. Retrieved May. 5, 2018, from https://docs.opencv.org/3.1.0/d5/dae/tutorial_aruco_detection.html
On-campus full text: open access from 2023-06-19.