
Student: Hsu, Bo-Kai (許博凱)
Thesis Title: The Development of Incremental Support Vector Machines with Monotonicity Constraints (遞增式單調性限制支援向量機之建構)
Advisor: Li, Sheng-Tun (李昇暾)
Degree: Master
Department: Institute of Information Management, College of Management
Year of Publication: 2017
Graduating Academic Year: 105 (ROC calendar)
Language: English
Number of Pages: 53
Keywords: Support Vector Machines, Monotonic Prior Knowledge, Online Incremental Method, Incremental Strategy
    Monotonicity-constrained support vector machines, which take expert knowledge into account, have been proposed in recent years. This knowledge-oriented technique has been shown to help SVMs cope with noisy data and obtain more useful results. However, because the SVM with monotonicity constraints is solved by quadratic programming over a large matrix, training on large datasets incurs a considerable computational cost.
    This study therefore applies an incremental strategy to solve the problem on large datasets: the dataset is treated as streaming data, and the data points are trained one at a time. We propose an incremental monotonicity-constrained support vector machine that handles large and highly complex data with this incremental strategy, increasing the feasibility of applying SVMs to massive data.
    Our experimental results show that the monotonicity-constrained SVM solved with the incremental strategy trains faster while preserving the accuracy expected of the original model. The streaming-data experiments further show that, because much less training data is required and no retraining is needed, the incremental monotonicity-constrained SVM gains an even larger training-time advantage as the dataset grows.

    The RMC-SVM, which incorporates experts' knowledge into SVMs based on the monotonic properties of real-world problems, has been proposed recently. This knowledge-oriented technique helps SVMs deal with noisy data and obtain more useful results. However, solving SVMs with monotonicity constraints requires even more computational time than standard SVMs, especially on large-scale datasets.
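    For readers unfamiliar with the formulation, the following is a generic sketch of how monotonicity constraints are typically attached to the soft-margin SVM primal in the monotonic-classification literature; it is an illustration only and not necessarily the exact RMC-SVM model developed in this thesis. Here \phi denotes the feature map, C and \xi the usual penalty and slack terms, and x_p \preceq x_q dominance on the monotone attributes.

    % Sketch: soft-margin SVM primal with generic monotonicity constraints.
    \[
    \begin{aligned}
    \min_{w,\,b,\,\xi}\quad & \tfrac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n}\xi_{i} \\
    \text{s.t.}\quad & y_{i}\bigl(w^{\top}\phi(x_{i})+b\bigr) \ge 1-\xi_{i}, \qquad \xi_{i}\ge 0, \quad i=1,\dots,n, \\
    & w^{\top}\phi(x_{p}) \le w^{\top}\phi(x_{q}) \quad \text{whenever } x_{p}\preceq x_{q}\ \text{on the monotone attributes.}
    \end{aligned}
    \]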
    Therefore, in this research, an incremental strategy is formulated to handle large datasets: a large-scale dataset is treated as streaming data, and the solution is constructed continuously, one point at a time. This study proposes an incremental RMC-SVM to increase the feasibility of using RMC-SVMs in real-world applications.
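    To make the streaming view concrete, here is a minimal sketch of one-point-at-a-time training. It uses scikit-learn's SGDClassifier with partial_fit purely as a stand-in for online incremental learning; it is not the OI-MCSVM proposed in this thesis, it enforces no monotonicity constraints, and the data_stream generator is a hypothetical toy source.

    # Minimal illustration of training on a data stream one point at a time.
    # SGDClassifier with hinge loss behaves like a linear SVM trained by SGD;
    # partial_fit stands in for incremental learning. This is NOT the thesis's
    # OI-MCSVM and it enforces no monotonicity constraints.
    import numpy as np
    from sklearn.linear_model import SGDClassifier

    rng = np.random.default_rng(0)

    def data_stream(n_points, n_features=5):
        # Hypothetical generator yielding one (x, y) pair at a time,
        # simulating streaming data with a toy labelling rule.
        for _ in range(n_points):
            x = rng.normal(size=n_features)
            y = int(x.sum() > 0)
            yield x, y

    clf = SGDClassifier(loss="hinge")
    classes = np.array([0, 1])          # class labels must be declared up front

    for x, y in data_stream(1000):
        # Each arriving point updates the current model in place; earlier
        # points are never revisited, which keeps the per-step cost low.
        clf.partial_fit(x.reshape(1, -1), np.array([y]), classes=classes)

    # Evaluate on a fresh batch drawn from the same toy distribution.
    X_test = rng.normal(size=(200, 5))
    y_test = (X_test.sum(axis=1) > 0).astype(int)
    print("held-out accuracy:", clf.score(X_test, y_test))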
    Our experiments show that the training time of the incremental RMC-SVM is lower than that of the RMC-SVM while both yield similar classification results. In the streaming case, the OI-MCSVM performs better as the dataset grows, since far less data has to be trained at each step and previously seen data points do not need to be retrained.

    Table of Contents
    Abstract (Chinese)
    Abstract (English)
    Acknowledgements
    Contents
    List of Tables
    List of Figures
    Chapter 1  Introduction
        1.1  Background and Motivation
        1.2  Research Objectives
        1.3  Organization of Research
    Chapter 2  Literature Review
        2.1  Support Vector Machines (SVMs)
            2.1.1  Evolution of SVMs
            2.1.2  Applications of SVMs
        2.2  Classification with Monotonicity Constraints
        2.3  Online Incremental Method with SVM
    Chapter 3  Research Methodology
        3.1  Concept of Monotonicity
            3.1.1  Definition of Monotonicity
            3.1.2  Constructing the Monotonicity Constraints
        3.2  Construction of the Regularized Monotonic SVM Model
        3.3  Online Incremental Method
            3.3.1  Learning Prototypes (LPs)
            3.3.2  Learning Support Vectors (LSVs)
        3.4  Online Incremental MCSVM
    Chapter 4  Experimental Results and Analysis
        4.1  Experiment Steps
        4.2  Data Collection
        4.3  Performance Measures
        4.4  Experimental Evaluation
    Chapter 5  Conclusions and Suggestions
        5.1  Contributions
        5.2  Managerial Implications
        5.3  Recommendations for Future Work
    References


    Full-text availability: on campus, open access from 2022-12-31; off campus, not available.
    The electronic thesis has not been authorized for public release; please consult the library catalog for the print copy.