| Author: | 田益明 (Tien, Yi-Ming) |
|---|---|
| Thesis Title: | 基於控制里亞普諾夫函數之工業用機械手臂視覺伺服控制研究 (Study on Visual Servoing Control for Industrial Robot Manipulator based on Control Lyapunov Function) |
| Advisor: | 鄭銘揚 (Cheng, Ming-Yang) |
| Degree: | Master |
| Department: | College of Electrical Engineering and Computer Science - Department of Electrical Engineering |
| Year of Publication: | 2025 |
| Graduation Academic Year: | 113 (ROC calendar, 2024-2025) |
| Language: | Chinese |
| Number of Pages: | 136 |
| Chinese Keywords: | 視覺伺服、視覺追蹤、控制里亞普諾夫函數、工業用機器人 |
| English Keywords: | Visual Servoing, Visual Tracking, Control Lyapunov Function, Industrial Robot |
Abstract (translated from the Chinese): Owing to the rapid development of robotics technology, applications combining computer vision with robots have become a popular research topic. Among them, visual servoing shows significant advantages because it uses visual information as a feedback signal, giving the robot the ability to perceive changes in its external environment; it has been widely applied in fields such as industrial automation, medical surgery, and UAV guidance. This thesis aims to design a visual servoing control method for a class of six-degree-of-freedom robot manipulators using the control Lyapunov function technique, in which an objective function is designed to handle the effect that singularities of the interaction matrix between the image space and the manipulator's joint space have on control performance, while convergence is preserved through the control Lyapunov function constraint. In addition, a visual servoing disturbance observer is designed to estimate external disturbances such as target motion and kinematic errors and to incorporate them into the system dynamics, yielding a visual servoing control method capable of handling a moving target. Finally, the effectiveness of the proposed methods is verified through simulations and physical experiments.
Abstract (English): With the rapid advancement of robotics technology, the integration of vision systems with robotic platforms has become a widely studied topic. Visual servoing, in particular, has demonstrated significant advantages by utilizing visual information as feedback, enabling robots to perceive and respond to changes in their external environment. As a result, visual servoing has been widely applied in various domains, including industrial automation, surgical robotics, and UAV guidance. This thesis aims to develop a visual servoing structure for a 6-DOF robot manipulator using the control Lyapunov function (CLF) framework. Through the design of an objective function, the effect of singularities in the interaction matrix between the image space and the robot's joint space on control performance is addressed, while convergence is ensured by incorporating CLF constraints. Furthermore, a visual servoing disturbance observer is designed to estimate external disturbances such as target motion and kinematic uncertainties; this observer is incorporated into the visual servoing structure to enhance robustness when tracking a moving target. The proposed approaches are validated through simulations and experiments conducted on a real 6-DOF industrial robot manipulator.
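As a rough illustration of the kind of formulation the abstract describes (a sketch under assumed notation, not the thesis's own controller), CLF-based visual servoing is commonly posed as a quadratic program over the joint-velocity command. Below, $e$ is the image-feature error, $L_e$ the interaction matrix, $J(q)$ the manipulator Jacobian, $\hat{d}$ a disturbance-observer estimate of the target-motion and kinematic-error terms, and $\gamma$, $p$, $\delta$ are design parameters; the specific cost and constraint forms are illustrative assumptions.

```latex
% Illustrative CLF-QP sketch for image-based visual servoing with a
% disturbance estimate (assumed form; not the thesis's exact controller).
% Assumed image-error dynamics:  \dot{e} = L_e J(q)\,\dot{q} + d,
% where d collects target motion and kinematic errors.
\begin{align}
  V(e) &= \tfrac{1}{2}\,e^{\top}e
  && \text{candidate control Lyapunov function} \nonumber \\
  \dot{q}^{*} &= \underset{\dot{q},\,\delta}{\arg\min}\;
       \tfrac{1}{2}\,\lVert \dot{q} \rVert^{2} + p\,\delta^{2}
  && \text{objective on joint velocity and slack} \nonumber \\
  \text{s.t.}\;\;
  & e^{\top}\!\bigl(L_e J(q)\,\dot{q} + \hat{d}\bigr) \le -\gamma\,V(e) + \delta
  && \text{relaxed CLF decrease, } \dot{V} \le -\gamma V + \delta \nonumber
\end{align}
```

In sketches of this kind, the slack variable $\delta$ is one standard way to keep the program feasible when $L_e J(q)$ approaches a singular configuration, trading exact exponential decrease for bounded joint velocities, while the estimate $\hat{d}$ plays the role the abstract assigns to the visual servoing disturbance observer.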