| Author: | Chen, Hui-Chao (陳惠昭) |
|---|---|
| Thesis title: | Using Mega-Trend-Diffusion Technique to Improve the Forecasting Accuracy of Meta-Model Method (使用整體趨勢擴展技術提升多模式整合法之預測準確率) |
| Advisor: | Li, Der-Chiang (利德江) |
| Degree: | Master |
| Department: | Department of Industrial and Information Management, College of Management |
| Year of publication: | 2014 |
| Academic year of graduation: | 102 (2013) |
| Language: | Chinese |
| Pages: | 63 |
| Keywords (Chinese): | 多模式整合法, 預測數值整合法, 整體趨勢擴展技術 |
| Keywords (English): | mega-trend-diffusion, numerical prediction ensemble method, meta-model ensemble method |
Traditional pattern-recognition systems rely on a single learning algorithm, which cannot accommodate data exhibiting different behavioral patterns, and machine-learning research has therefore developed vigorously over the past few decades. In the last decade, single-algorithm development has reached a bottleneck without yielding a learner suited to data of all behavioral characteristics, so many researchers have combined multiple learning algorithms to obtain further improvement, an approach known as the multi-model (meta-model) ensemble method. These studies, however, concentrate mainly on the model-combination process; comparatively little work addresses how to combine the resulting predictions. In classification, outputs can be combined by majority voting, but for numerical prediction most studies, such as Yu (2011), simply take the average of the models' outputs. Building on the mega-trend-diffusion (MTD) technique, this study therefore proposes a well-founded procedure for generating predictions under small-sample learning: each model's prediction is computed under a minimum-error objective so as to obtain the robustness of a multi-model system. The combined models are linear regression, a back-propagation neural network, support vector regression, and the M5' model tree. To validate the method, it is compared with the simple average and with each individual model's predictions on three case studies. The empirical results show that the proposed approach does reduce the models' mean absolute percentage error.
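The mega-trend-diffusion step that the abstract builds on estimates, from a small sample, a plausible lower and upper bound of the population domain before artificial samples are generated. A minimal sketch of that bound computation, following the formulation in Li et al. (2007), is given below; the function name and the membership threshold of 1e-20 are assumptions of this sketch (not the thesis's actual implementation), and the example data are invented for illustration.

```python
import math

def mtd_bounds(samples):
    """Mega-trend-diffusion domain bounds for a small sample
    (after Li et al., 2007). Assumes data fall on both sides
    of the range midpoint."""
    x_min, x_max = min(samples), max(samples)
    center = (x_min + x_max) / 2.0                         # midpoint of the observed range
    n_lower = sum(1 for x in samples if x < center)        # points below the midpoint
    n_upper = sum(1 for x in samples if x > center)        # points above the midpoint
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    skew_l = n_lower / (n_lower + n_upper)                 # left-skewness weight
    skew_u = n_upper / (n_lower + n_upper)                 # right-skewness weight
    # Diffusion distance derived from a normal membership cut at 1e-20
    spread = -2.0 * var * math.log(1e-20)
    lower = center - skew_l * math.sqrt(spread / n_lower)
    upper = center + skew_u * math.sqrt(spread / n_upper)
    # The diffused domain must at least cover the observed data
    return min(lower, x_min), max(upper, x_max)

# Toy sample with most points below the range midpoint:
# the lower bound diffuses farther outward than the upper one
lo, hi = mtd_bounds([12.0, 15.0, 14.0, 18.0, 30.0])
print(round(lo, 2), round(hi, 2))
```

The asymmetric skewness weights are the point of MTD: the side of the midpoint that holds more observations is pushed farther out, so artificial samples drawn between the bounds follow the data's overall trend instead of a symmetric spread.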
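Since the thesis benchmarks its result-combination scheme against the simple average of four base models under mean absolute percentage error (MAPE), that baseline comparison can be sketched as follows. The hold-out targets and per-model predictions below are invented numbers and the model names are placeholders; in the thesis the predictions come from trained linear regression, back-propagation network, SVR, and M5' models.

```python
def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical hold-out targets and per-model predictions (illustrative only)
actual = [100.0, 120.0, 90.0, 110.0]
model_preds = {
    "linear_regression": [ 98.0, 125.0, 88.0, 115.0],
    "bpnn":              [105.0, 118.0, 95.0, 108.0],
    "svr":               [ 97.0, 122.0, 91.0, 112.0],
    "m5_prime":          [102.0, 116.0, 87.0, 113.0],
}

# Simple-average ensemble: the baseline the thesis compares against
ensemble = [sum(p) / len(model_preds) for p in zip(*model_preds.values())]

for name, preds in model_preds.items():
    print(f"{name}: MAPE = {mape(actual, preds):.2f}%")
print(f"simple average: MAPE = {mape(actual, ensemble):.2f}%")
```

On this toy data the averaged prediction happens to beat every individual model, which is the kind of robustness the meta-model ensemble aims for; on real data any single model could of course win, which is why the thesis replaces the plain average with an MTD-guided combination.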
Yeh, I.-C. (2003). 類神經網路模式應用與實作 [Application and practice of neural network models].
Acharya, N., U. C. Mohanty and L. Sahoo (2013). "Probabilistic multi-model ensemble prediction of Indian summer monsoon rainfall using general circulation models: A non-parametric approach." Comptes Rendus Geoscience 345(3): 126-135.
Breiman, L. (1996). "Bagging predictors." Machine Learning 24(2): 123-140.
Breiman, L., J. H. Friedman, R. A. Olshen and C. J. Stone (1984). Classification and Regression Trees. Belmont, CA: Wadsworth International Group.
Bryll, R., R. Gutierrez-Osuna and F. Quek (2003). "Attribute bagging: improving accuracy of classifier ensembles by using random feature subsets." Pattern Recognition 36(6): 1291-1302.
Byon, E., A. K. Shrivastava and Y. Ding (2010). "A classification procedure for highly imbalanced class sizes." IIE Transactions 42(4): 288-303.
Chikamoto, Y., M. Kimoto, M. Ishii, T. Mochizuki, T. T. Sakamoto, H. Tatebe, Y. Komuro, M. Watanabe, T. Nozawa, H. Shiogama, M. Mori, S. Yasunaka and Y. Imada (2013). "An overview of decadal climate predictability in a multi-model ensemble by climate model MIROC." Climate Dynamics 40(5-6): 1201-1222.
Cortes, C. and V. Vapnik (1995). "Support-vector networks." Machine Learning 20(3): 273-297.
Dietterich, T. G. (2000). Ensemble Methods in Machine Learning. Proceedings of the First International Workshop on Multiple Classifier Systems, Springer-Verlag: 1-15.
Drucker, H., C. J. Burges, L. Kaufman, A. Smola and V. Vapnik (1997). "Support vector regression machines." Advances in neural information processing systems: 155-161.
Efron, B. and R. J. Tibshirani (1993). An Introduction to the Bootstrap. New York: Chapman & Hall.
Huang, C. (1997). "Principle of information diffusion." Fuzzy Sets and Systems 91(1): 69-90.
Huang, C. and C. Moraga (2004). "A diffusion-neural-network for learning from small samples." International Journal of Approximate Reasoning 35(2): 137-161.
Jang, J. S. R. (1993). "ANFIS: adaptive-network-based fuzzy inference system." IEEE Transactions on Systems, Man and Cybernetics, 23(3): 665-685.
Jha, A., R. Chauhan, M. Mehra, H. R. Singh and R. Shankar (2012). "miR-BAG: bagging based identification of microRNA precursors." PLoS One 7(9): e45782.
Kuncheva, L. I. (2004). Combining Pattern Classifiers: Methods and Algorithms. Hoboken, NJ: Wiley.
Li, D.-C., C.-J. Chang, C.-C. Chen and W.-C. Chen (2012a). "A grey-based fitting coefficient to build a hybrid forecasting model for small data sets." Applied Mathematical Modelling 36(10): 5101-5108.
Li, D.-C., C.-W. Liu and W.-C. Chen (2012b). "A multi-model approach to determine early manufacturing parameters for small-data-set prediction." International Journal of Production Research 50(23): 6679-6690.
Li, D.-C., C.-S. Wu, T.-I. Tsai and F. M. Chang (2006). "Using mega-fuzzification and data trend estimation in small data set learning for early FMS scheduling knowledge." Computers & Operations Research 33(6): 1857-1869.
Li, D.-C., C.-S. Wu, T.-I. Tsai and Y.-S. Lina (2007). "Using mega-trend-diffusion and artificial samples in small data set learning for early flexible manufacturing system scheduling knowledge." Computers & Operations Research 34(4): 966-982.
Li, D.-C., C. Wu and F. M. Chang (2005). "Using data-fuzzification technology in small data set learning to improve FMS scheduling accuracy." The International Journal of Advanced Manufacturing Technology 27(3-4): 321-328.
Osawa, T., H. Mitsuhashi, Y. Uematsu and A. Ushimaru (2011). "Bagging GLM: Improved generalized linear model for the analysis of zero-inflated data." Ecological Informatics 6(5): 270-275.
Quinlan, J. R. (1992). Learning with continuous classes. Proceedings of the 5th Australian Joint Conference on Artificial Intelligence: 343-348.
Reformat, M. and R. Yager (2008). "Building ensemble classifiers using belief functions and OWA operators." Soft Computing 12(6): 543-558.
Roiger, R. and M. Geatz (2003). Data Mining: A Tutorial-Based Primer. New York: Addison Wesley.
Sánchez A, V. D. (2003). "Advanced support vector machines and kernel methods." Neurocomputing 55(1–2): 5-20.
Todorovski, L. and S. Džeroski (2000). "Combining multiple models with meta decision trees." Machine Learning 50(3): 223-249.
Wang, Y. and I. H. Witten (1997). Inducing model trees for continuous classes. Proceedings of the Poster Papers of the Ninth European Conference on Machine Learning.
Yu, Q. (2011). "Weighted bagging: a modification of AdaBoost from the perspective of importance sampling." Journal of Applied Statistics 38(3): 451-463.
Ma, Y. and V. Cherkassky (2003). Multiple model classification using SVM-based approach. Proceedings of the International Joint Conference on Neural Networks 4: 1581-1586.
Campus access: publicly available from 2024-12-31.