
Author: Yang, Wei-Chun (楊唯駿)
Title: Design and Implementation of a Paddle Juggling Robot with an FPGA-based Real-time Stereo Vision System (基於 FPGA 的即時立體視覺系統之拍擊耍弄機器人的設計與實現)
Advisor: Ho, Ming-Tzu (何明字)
Degree: Master
Department: Department of Engineering Science, College of Engineering
Year of Publication: 2023
Graduation Academic Year: 111
Language: Chinese
Pages: 362
Chinese Keywords: 立體視覺, 拍擊耍弄, 腕關節型機器人, 卡門濾波器
English Keywords: stereo vision, paddle juggling, carpal wrist robot, Kalman filter
Access Count: Views: 97, Downloads: 0
    This thesis continues and improves the carpal wrist robot paddle juggling system developed in our laboratory, using stereo vision to guide the robot so that it strikes a ball continuously. The system consists of two subsystems: a stereo vision system and a hitting control system. The stereo vision system is built from two image vision systems, each comprising a fisheye camera and an image processing module; each image vision system identifies the ball from luminance information and, after image processing, outputs the target's coordinates in the image. The hitting control system converts the two pixel coordinates output by the stereo vision system into the target's position in space, uses an extended Kalman filter to estimate the ball's position and velocity, and applies point-mass projectile kinematics to predict the impact point and incident velocity, which determine the timing, posture, and speed with which the robot strikes the ball. In the implementation, the stereo vision system uses a Field Programmable Gate Array (FPGA) development board as its image processing core, with the image processing algorithms and peripheral functions written in the hardware description language Verilog. The hitting control system uses an STM32 microcontroller as its control core, with the communication protocol to the image processing modules, the extended Kalman filter, and the motor controllers written in C. After the two subsystems are integrated, the system can repeatedly strike the ball so that it rebounds to a fixed height, often achieving more than 1,500 consecutive hits, a large improvement over the 150 hits of the previous-generation system.

    The main objective of this thesis is to continue and improve the paddle juggling system with a carpal wrist robot developed by our laboratory. The paddle juggling system uses a stereo vision system to guide the robot to hit the ball continuously. It consists of two subsystems: the stereo vision system and the hitting control system. The stereo vision system consists of two image vision systems, each comprising a fisheye camera and an image processing module. Each image vision system identifies the ball according to its luminance information and outputs the pixel coordinates of the ball after image processing. The hitting control system transforms the two pixel coordinates from the stereo vision system into the position of the ball in space. An extended Kalman filter is then used to estimate the dynamics of the ball, and the predicted impact point and incident velocity are computed from these estimates. Based on the estimated dynamics, the carpal wrist robot determines the timing, velocity, position, and posture for hitting the ball. The image processing algorithms and peripheral functions of the image vision system are implemented on an FPGA-based board in Verilog, a hardware description language. The communication protocol, extended Kalman filter, and robot controller of the hitting control system are implemented on an STM32 microcontroller in C. The full integration of the two subsystems allows the ball to bounce back to a fixed height, and the ball can often be hit more than 1500 times consecutively. This is a great improvement over the 150 strokes of the previous-generation system.

    摘要 (Abstract)
    Extended Abstract
    誌謝 (Acknowledgments)
    目錄 (Table of Contents)
    圖目錄 (List of Figures)
    表目錄 (List of Tables)
    Chapter 1  Introduction
      1-1 Background and Motivation
      1-2 Objectives
      1-3 Research Procedure
      1-4 Literature Review
      1-5 Thesis Organization
    Chapter 2  Camera Models and Spatial Coordinate Computation
      2-1 Overview
      2-2 Intrinsic Parameters of the Pinhole Camera Model
      2-3 Projection Model and Intrinsic Parameters of Fisheye Cameras
        2-3-1 Characteristics and Geometric Projection Models of Fisheye Cameras
        2-3-2 Polynomial Fisheye Model
      2-4 Fisheye Image Distortion Correction
        2-4-1 Overview
        2-4-2 Digital Images
        2-4-3 Fisheye Distortion Correction of Digital Images
        2-4-4 Parameter Estimation and Distortion Correction with MATLAB
        2-4-5 Projection Model Conversion and PFET Intrinsic Parameter Estimation
      2-5 Extrinsic Parameters
        2-5-1 Overview
        2-5-2 Translation and Rotation of Coordinate Frames
        2-5-3 The PnP Problem
      2-6 Pinhole Camera Model and Computation of the Target's World Coordinates
    Chapter 3  The Carpal Wrist Robot and Its Motion Analysis
      3-1 Overview
      3-2 Introduction to the Carpal Wrist Robot
      3-3 Kinematic Analysis
        3-3-1 Forward Kinematics
        3-3-2 Inverse Kinematics
    Chapter 4  Fundamentals of Dynamics and Contact Mechanics
      4-1 Overview
      4-2 Newton's Laws of Motion
      4-3 Collisions
        4-3-1 Coefficient of Restitution
        4-3-2 Direct Central Impact
        4-3-3 Eccentric Impact
      4-4 Ball-Paddle Offset Caused by Eccentric Impact
      4-5 Applications of Contact Mechanics
        4-5-1 Hertz Contact Theory as a Continuous Contact-Force Model
        4-5-2 Equivalent Spring-Damper Model as a Continuous Contact-Force Model
    Chapter 5  Point-Mass Trajectory Prediction
      5-1 Overview
      5-2 Kalman Filter
      5-3 Point-Mass Projectile Kinematics
    Chapter 6  Hitting Controller Design
      6-1 Overview
      6-2 Coordinate Frame Definitions
      6-3 Juggling Function
      6-4 Computation of the Predicted Impact Point and Incident Velocity
      6-5 Hitting Strategies
        6-5-1 Vertical Hitting
        6-5-2 Distance-Corrected Hitting
      6-6 Path Planning of the Settling Motion
      6-7 Hitting-Time Analysis
      6-8 Hitting Procedure
      6-9 Mathematical Model and Parameter Identification of the Permanent-Magnet DC Motor
      6-10 PID Controller
    Chapter 7  Mechanism Design and Fabrication, and System Hardware Architecture
      7-1 Overview
      7-2 Design and Fabrication of the Carpal Wrist Robot
      7-3 Cameras
      7-4 DE0-CV Image Processing Module
      7-5 STM32F407VG-DISC1 Development Board (STM32 Microcontroller)
      7-6 PWM Motor Driver Module
    Chapter 8  Implementation of the Image Vision System
      8-1 Overview
      8-2 Image Capture
        8-2-1 Power-Enable Module
        8-2-2 SPI Register Configuration Module
        8-2-3 Image Deserialization and Pixel Arrangement Module
      8-3 Image Processing
        8-3-1 Demosaicing Module
        8-3-2 Color Space Conversion Module
        8-3-3 Binarization Module
        8-3-4 Fisheye Distortion Correction Module (Back-Projection Module)
        8-3-5 Image Morphology Module
        8-3-6 Region Selection Module
        8-3-7 Target Centroid Computation Module
      8-4 Peripheral Functions
        8-4-1 User Interface Module
        8-4-2 SDRAM Controller Module
        8-4-3 Image Output: SDRAM-UART Module
        8-4-4 Image Output: SDRAM-VGA Module
        8-4-5 Image Output: VGA Module
        8-4-6 Centroid Output: VGA Module
        8-4-7 Camera Synchronization Module
    Chapter 9  Experimental Results
      9-1 Overview
      9-2 Experimental Hardware Setup
      9-3 Stereo Vision System Results
        9-3-1 Image Processing Verification
        9-3-2 Ball Falling-Trajectory Verification
        9-3-3 Impact-Point Prediction Combined with Extended Kalman Filtering
      9-4 Carpal Wrist Robot Control Results
        9-4-1 Yaw Control Performance Verification
        9-4-2 Pitch Control Performance Verification
        9-4-3 Settling-Velocity Control Performance Verification
      9-5 Juggling System Results
        9-5-1 Hitting-Time Verification
        9-5-2 Hitting Performance Verification
    Chapter 10  Conclusions and Future Work
      10-1 Conclusions
      10-2 Future Work
    References
    Appendix A
    Appendix B


    On-campus access: available from 2028-08-17
    Off-campus access: available from 2028-08-17
    The electronic thesis has not yet been authorized for public release; for the print copy, please consult the library catalog.