Graduate Student: 吳佳沅 (Wu, Chia-Yuan)
Thesis Title: 以點雲分割技術進行焊道辨識與機械手臂研磨路徑規劃 (Weld Bead Recognition and Robotic Grinding Path Planning Based on Point Cloud Segmentation)
Advisor: 鍾俊輝 (Chung, Chun-Hui)
Degree: Master
Department: Department of Mechanical Engineering, College of Engineering
Year of Publication: 2025
Graduation Academic Year: 113
Language: Chinese
Number of Pages: 117
Chinese Keywords: 焊道辨識, 點雲處理, 深度學習, 機械手臂研磨加工, 路徑規劃
English Keywords: Weld Bead Recognition, Point Cloud Processing, Deep Learning, Robotic Grinding, Tool Path Planning
    Metal dust generated during manual grinding of weld beads is hazardous to human health, and robotic arms, owing to their high flexibility, have been introduced into the grinding process as a viable alternative. In robot-based weld bead grinding, even workpieces of identical design exhibit uncertainty in the geometry and position of their weld beads because of differences in welders and welding equipment, so a pre-planned grinding path cannot be aligned precisely and the grinding quality suffers. To raise the level of automation, many intelligent recognition techniques have been applied to identify the machining region; among them, 3D point clouds are used as the input for weld bead recognition and grinding path planning because they accurately represent the 3D geometric features of a workpiece. Most previous studies acquired workpiece point clouds with laser line scanners. Their drawbacks are that the scanning path must first be planned according to the distribution of the weld beads and that bead features used as grinding path points must be extracted with respect to the scanning direction; scanning a different type of weld bead requires replanning the scanning path, and for highly complex bead shapes the 3D information may be insufficient to extract bead features effectively. This study therefore proposes a highly adaptive weld bead recognition and grinding path planning method based on 3D point clouds and deep learning. In the first part, a depth camera directly scans the weld bead point cloud on a freeform workpiece, the point cloud is preprocessed with methods such as Poisson surface reconstruction, and the processed point clouds are used to build a dataset and train a PointNet model that recognizes weld beads directly on the entire workpiece point cloud. In the second part, morphological operations and B-spline curve fitting generate a smooth grinding path, the robot grinding posture is computed from vector cross products based on the workpiece surface normals, and a hand-eye calibration method based on point cloud registration transforms the grinding path from the depth camera coordinate system to the robot end-effector coordinate system. In addition, a curved-surface weld bead height estimation method is proposed for generating multi-pass grinding paths. The experimental results are as follows: (1) the trained PointNet model recognizes weld beads on workpiece point clouds with an average accuracy of 94.35%; (2) compared with the actual weld bead height, the mean absolute error (MAE) of the height estimation does not exceed 0.07 mm; (3) in the grinding validation experiments, the residual height after grinding of different weld bead shapes does not exceed 0.5 mm, confirming the effectiveness of the proposed grinding path planning method. The proposed weld bead recognition and feature extraction method requires no consideration of the shape or direction of the weld bead and is therefore highly adaptive compared with previous related studies.
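
    The preprocessing stage described above (Poisson surface reconstruction followed by clustering-based filtering) can be pictured with a minimal Python sketch. The sketch below assumes the Open3D library; the file names, the reconstruction depth, the resampling density, and the DBSCAN parameters are illustrative placeholders, not the settings used in the thesis.

        # Minimal preprocessing sketch (assumed Open3D API; parameters are illustrative).
        import numpy as np
        import open3d as o3d

        # Raw depth-camera scan of the workpiece (hypothetical file name).
        pcd = o3d.io.read_point_cloud("workpiece_scan.ply")

        # Normals are required by Poisson surface reconstruction.
        pcd.estimate_normals(
            search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=2.0, max_nn=30))

        # Poisson surface reconstruction smooths sensor noise and fills small holes.
        mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)

        # Resample the reconstructed surface into an evenly distributed point cloud.
        resampled = mesh.sample_points_poisson_disk(number_of_points=20000)

        # DBSCAN clustering drops stray background clusters; keep the largest cluster,
        # which is taken to be the workpiece.
        labels = np.array(resampled.cluster_dbscan(eps=3.0, min_points=20))
        keep = np.where(labels == np.bincount(labels[labels >= 0]).argmax())[0]
        workpiece = resampled.select_by_index(keep)

        o3d.io.write_point_cloud("workpiece_clean.ply", workpiece)

    The cleaned point cloud would then be labeled and used to build the dataset for the PointNet segmentation model mentioned in the abstract.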

    Manual grinding of weld beads generates metal dust that is harmful to human health. Robotic arms, with their high flexibility, offer a safer and more efficient alternative. However, variations in weld bead geometry and position, even for identical parts, can result from differences in welders and welding equipment. These variations may cause misalignment with pre-planned grinding paths and degrade the final quality. To enhance automation, 3D point clouds have been widely used to recognize weld beads because of their ability to capture precise geometric features. Traditional methods often rely on laser line scanners, which require predefined scanning paths and are limited in handling complex or varying bead shapes. This study proposes a highly adaptive approach to weld bead recognition and grinding path planning using 3D point clouds and deep learning. A depth camera captures the weld bead point cloud on freeform surfaces, which is then preprocessed and used to train a PointNet model to recognize beads directly from the entire workpiece point cloud. Smooth grinding paths are generated using morphological operations and B-spline fitting. Robot tool postures are computed based on surface normals, and hand-eye calibration is used to align the path from the camera coordinate system to the robot coordinate system. A weld bead height estimation method is also introduced to enable multi-pass grinding. Experiments show that the trained PointNet model achieved an average accuracy of 94.35% in weld bead recognition on workpiece point clouds, the height estimation method yielded a mean absolute error of less than 0.07 mm relative to the actual bead height, and in the grinding validation tests the residual height after grinding remained under 0.5 mm for various weld bead shapes. These results demonstrate the effectiveness and adaptability of the proposed method, which requires no prior knowledge of bead shape or direction.
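
    As a rough illustration of the path-smoothing and posture-computation step, the sketch below fits a cubic B-spline through ordered weld-bead path points with SciPy and builds a tool frame from the workpiece surface normal and the path tangent via cross products. The function name, the frame convention (tool axis along the negative surface normal, x-axis along the feed direction), and the way normals are paired with the resampled points are assumptions made for illustration, not details taken from the thesis.

        # Sketch of B-spline path smoothing and posture generation (assumed conventions).
        import numpy as np
        from scipy.interpolate import splprep, splev

        def grinding_poses(path_pts, surface_normals, n_samples=200, smooth=1.0):
            """path_pts, surface_normals: (N, 3) arrays in the depth-camera frame."""
            # Fit a smoothing cubic B-spline through the ordered path points.
            tck, _ = splprep(path_pts.T, s=smooth, k=3)
            u = np.linspace(0.0, 1.0, n_samples)
            pts = np.array(splev(u, tck)).T              # smoothed path points
            tangents = np.array(splev(u, tck, der=1)).T  # feed direction along the path

            # Pair each resampled point with a surface normal (nearest index; illustrative).
            idx = np.linspace(0, len(surface_normals) - 1, n_samples).round().astype(int)
            normals = surface_normals[idx]

            poses = []
            for p, t, n in zip(pts, tangents, normals):
                z = -n / np.linalg.norm(n)               # tool axis presses into the surface
                x = t - np.dot(t, z) * z                 # feed direction, orthogonalized
                x /= np.linalg.norm(x)
                y = np.cross(z, x)                       # completes a right-handed frame
                R = np.column_stack((x, y, z))           # rotation: tool frame -> camera frame
                poses.append((p, R))
            return poses

    Each resulting (position, rotation) pair would still have to be transformed from the camera frame to the robot end-effector frame through the hand-eye calibration described above.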

    Abstract
    Acknowledgements
    Table of Contents
    List of Tables
    List of Figures
    Chapter 1  Introduction
      1.1  Research Background
      1.2  Literature Review
        1.2.1  Machining Path Planning for Robotic Arms
        1.2.2  Weld Bead Recognition and Measurement
        1.2.3  Robot-Based Weld Bead Grinding and Machining Systems
      1.3  Research Objectives
      1.4  Thesis Structure
    Chapter 2  Point Cloud Processing, Deep Learning, and Path Point Generation Methods
      2.1  3D Point Cloud Data
        2.1.1  Point Cloud Acquisition Equipment
      2.2  Point Cloud Data Preprocessing
        2.2.1  Poisson Surface Reconstruction
        2.2.2  DBSCAN Clustering Algorithm (Density-Based Spatial Clustering of Applications with Noise)
      2.3  PointNet
      2.4  Grinding Path Point Generation Methods
        2.4.1  Point Cloud Imaging and Weld Bead Feature Extraction
        2.4.2  Path Point Sorting Algorithm
        2.4.3  Cubic B-Spline Fitting
        2.4.4  Douglas-Peucker Algorithm
        2.4.5  Voxel Filtering and Bivariate Spline Surface Fitting
        2.4.6  3D Path Point Projection
    Chapter 3  Experimental Planning and Setup
      3.1  Experimental Materials and Equipment
      3.2  Point Cloud Data Preprocessing
        3.2.1  Poisson Surface Reconstruction and Point Cloud Resampling
        3.2.2  Point Cloud Filtering
      3.3  PointNet Model Training and Weld Bead Recognition
        3.3.1  Dataset Labeling and Data Augmentation
        3.3.2  Model Training
        3.3.3  Weld Bead Recognition
      3.4  Robotic Grinding Path Planning
        3.4.1  Point Cloud Imaging and Weld Bead Skeleton Extraction
        3.4.2  Path Point Sorting
        3.4.3  Grinding Path Smoothing
        3.4.5  Voxel Filtering and Workpiece Surface Fitting
        3.4.6  3D Path Point Projection
        3.4.7  Weld Bead Height Estimation
        3.4.8  Robot Grinding Posture Calculation
      3.5  Multi-Pass Grinding Path Output
    Chapter 4  Weld Bead Recognition and Grinding Path Planning Results
      4.1  Point Cloud Preprocessing Results
      4.2  PointNet Weld Bead Recognition Model
        4.2.1  Model Training Results
        4.2.2  Model Testing Results
        4.2.3  Summary of PointNet Model Performance
      4.3  Weld Bead Feature Extraction and Path Point Sorting Results
      4.4  Path Point Smoothing and Extrapolation Results
      4.5  Path Point Projection Results
      4.6  Weld Bead Height Estimation Results
      4.7  Robot Multi-Pass Grinding Path and Posture Generation Results
    Chapter 5  Robotic Grinding System Integration
      5.1  Robotic Grinding System
      5.2  Hand-Eye Calibration
      5.3  Feature Point Extraction for Hand-Eye Calibration
        5.3.1  Tool Center Point Calibration
        5.3.2  Feature Point Extraction in the Robot Coordinate System
        5.3.3  Feature Point Extraction in the Depth Camera Coordinate System
        5.3.4  Point Cloud Registration and Coordinate Transformation
      5.4  Grinding Setup and Robot Motion Simulation
      5.5  Weld Bead Grinding Results
    Chapter 6  Conclusions and Future Work
      6.1  Conclusions
      6.2  Future Work
    References


    Full-text access: available on campus from 2027-08-25; available off campus from 2027-08-25.
    The electronic thesis has not yet been authorized for public release; please consult the library catalog for the print copy.