| Author: | 盧宣文 Lu, Hsuan-Wen |
|---|---|
| Thesis title: | 發展核密度動態集成技術於預測保養 Kernel-Density Dynamic Ensemble Technique for Predictive Maintenance |
| Advisor: | 李家岩 Lee, Chia-Yen |
| Degree: | Master |
| Department: | College of Engineering - Engineering Management Graduate Program (on-the-job class) |
| Year of publication: | 2020 |
| Graduation academic year: | 108 |
| Language: | Chinese |
| Number of pages: | 48 |
| Keywords (Chinese): | 權重更新, 動態調整, 集成方法, 預測保養, 推斷信心指標 (weight updating, dynamic adjustment, ensemble methods, predictive maintenance, inference confidence index) |
| Keywords (English): | PdM, weighted majority, Inference Confidence Index, ensemble |
In recent years, prognostics and health management (PHM) has been widely applied in manufacturing. One of its important issues is predictive maintenance (PdM), whose purpose is to determine whether equipment or components are still in a healthy state so that maintenance can be carried out in time. In many past studies, exponential models were used to predict the remaining useful life (RUL); however, because of the way the exponential model is computed, it may fail to react in time when the aging feature rises abruptly, leading to inaccurate predictions. Time-series models have also been proposed to build the prediction mechanism, but they likewise produce prediction errors when some observations are missing. This study proposes a dynamic weight-adjustment mechanism together with an inference confidence index. Based on a data-analytics framework, important statistical features are extracted from the data, and several machine-learning algorithms are used to build prediction models. Each model is assigned a weight according to its predictive ability, and the weights are adjusted dynamically as the time axis moves forward, making the combined prediction more robust. While the weights are being adjusted, an Inference Confidence Index (ICI) is established to assess the similarity of the models' predictions; combined with a decision module, it completes the predictive maintenance system.
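The abstract outlines the core mechanism: several prediction models are combined with weights that are updated as new observations arrive, and an Inference Confidence Index (ICI) quantifies how closely the models agree. The thesis's exact update rule and ICI formula are not reproduced here, so the sketch below is only illustrative: the multiplicative penalty `beta`, the error tolerance `tol`, the coefficient-of-variation-based ICI, the scikit-learn base models, and the synthetic degradation data are all assumptions made for demonstration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR


class DynamicWeightedEnsemble:
    """Sketch of a dynamically weighted regression ensemble with an
    agreement-based confidence index. The multiplicative penalty `beta`
    and the coefficient-of-variation ICI are illustrative assumptions,
    not the thesis's exact formulas."""

    def __init__(self, models, beta=0.9):
        self.models = models                      # pre-trained regressors
        self.weights = np.ones(len(models))       # one weight per model
        self.beta = beta                          # penalty for large errors

    def predict(self, x):
        """Return the weighted prediction and the individual predictions."""
        preds = np.array([m.predict(x.reshape(1, -1))[0] for m in self.models])
        w = self.weights / self.weights.sum()
        return float(w @ preds), preds

    def inference_confidence(self, preds):
        """ICI-style agreement score in (0, 1]; near 1 when models agree."""
        cv = np.std(preds) / (abs(np.mean(preds)) + 1e-8)
        return 1.0 / (1.0 + cv)

    def update(self, preds, y_true, tol):
        """Penalise models whose absolute error exceeds `tol`, then
        renormalise, so weights drift toward recently accurate models."""
        self.weights *= np.where(np.abs(preds - y_true) > tol, self.beta, 1.0)
        self.weights /= self.weights.sum()


# Toy degradation stream: feature = time index, target = health index.
rng = np.random.default_rng(0)
t = np.arange(200, dtype=float).reshape(-1, 1)
health = 100 - 0.3 * t.ravel() + rng.normal(0, 1, 200)

models = [LinearRegression(), SVR(), RandomForestRegressor(random_state=0)]
for m in models:
    m.fit(t[:150], health[:150])                  # offline training window

ens = DynamicWeightedEnsemble(models)
for x_t, y_t in zip(t[150:], health[150:]):       # online monitoring window
    y_hat, preds = ens.predict(x_t)
    ici = ens.inference_confidence(preds)         # low ICI -> models disagree
    ens.update(preds, y_t, tol=2.0)               # dynamic weight adjustment
print("final weights:", np.round(ens.weights, 3))
```

A multiplicative penalty of this kind follows the weighted-majority family of update rules; in practice the tolerance, the penalty factor, and the ICI threshold feeding the decision module would be tuned on historical maintenance data.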