
Author: Hsin, Cheng-Ta (辛政達)
Title: A Study of Monotonic One-class Extreme Learning Machine (單調性單類別極限學習機之研究)
Advisor: Li, Sheng-Tun (李昇暾)
Degree: Master
Department: College of Management - Department of Industrial and Information Management
Year of Publication: 2020
Academic Year of Graduation: 108
Language: English
Pages: 44
Keywords (Chinese): 單類別分類, 極限學習機, 單調性限制式
Keywords (English): One-class Classification, Extreme Learning Machine, Monotonicity Constraints
Views: 92; Downloads: 0

    Classification techniques have become very popular because they can make predictions from the characteristics and relationships of data. Among them, one-class classification is typically used for outlier detection or for data whose positive and negative classes are imbalanced. A monotonic classification problem is one in which a monotonic relation exists between some input features and the decision label. The extreme learning machine (ELM) is one of the most commonly used neural network models for classification problems. Because its hidden-layer weights are assigned randomly, ELM can greatly reduce computation time; however, because of its training error, ELM is not a good model for monotonic classification problems. In past research, monotonic classification was often handled by adding monotonicity constraints, but this increases computational complexity. Therefore, this study aims to satisfy the monotonicity constraints by adjusting the network weight structure of the ELM.
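As background, a standard ELM trains only its output layer: the hidden-layer weights are drawn once at random, and the output weights come from a single least-squares solve. The following sketch illustrates that generic scheme only; the hidden-layer size, sigmoid activation, and toy regression target are illustrative choices, not details taken from this thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, T, n_hidden=50):
    """Basic ELM: random, untrained hidden weights; least-squares output weights."""
    n_features = X.shape[1]
    # Hidden-layer parameters are drawn once at random and never updated.
    W = rng.normal(size=(n_features, n_hidden))
    b = rng.normal(size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))  # sigmoid hidden activations
    # Output weights minimize ||H beta - T||^2 via the Moore-Penrose pseudoinverse.
    beta = np.linalg.pinv(H) @ T
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy regression check: learn y = x1 + x2 from random samples.
X = rng.uniform(-1, 1, size=(200, 2))
T = X.sum(axis=1)
W, b, beta = elm_fit(X, T)
pred = elm_predict(X, W, b, beta)
mse = float(np.mean((pred - T) ** 2))  # small on this easy toy target
```

The single pseudoinverse solve is what gives ELM its speed advantage over iterative backpropagation, at the cost of having no control over how individual weights relate to the inputs.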
    This study satisfies the monotonicity constraints by altering how the network weights are expressed, so that the ELM can properly handle monotonic data. Furthermore, we generate a decision function to perform one-class classification. Through experiments, we find that the proposed model shows good stability on monotonic classification problems.
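The abstract does not specify the exact weight reparameterization, so the sketch below uses one plausible construction under stated assumptions: non-negative hidden weights, an increasing activation, and output weights kept non-negative by projected gradient descent (the projection stands in for the thesis's reparameterization). With all three in place the network output is non-decreasing in every feature, and a one-class decision can be made by thresholding the deviation from a constant target; the target value, hidden-layer size, and learning rate here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_monotone_oc_elm(X, n_hidden=30, target=1.0, epochs=500, lr=0.1):
    """One-class ELM whose raw output is non-decreasing in every feature.

    Monotonicity holds by construction: non-negative hidden weights,
    increasing (sigmoid) activation, and non-negative output weights.
    """
    W = np.abs(rng.normal(size=(X.shape[1], n_hidden)))  # non-negative, fixed
    b = rng.normal(size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    T = np.full(len(X), target)
    beta = np.zeros(n_hidden)
    for _ in range(epochs):  # projected gradient descent on ||H beta - T||^2
        grad = H.T @ (H @ beta - T) / len(X)
        beta = np.maximum(beta - lr * grad, 0.0)  # projection keeps beta >= 0
    return W, b, beta

def score(X, W, b, beta, target=1.0):
    """Deviation from the one-class target: larger means more anomalous."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.abs(H @ beta - target)

# Train on "normal" samples only, then pick a rejection threshold.
X = rng.uniform(0, 1, size=(100, 3))
W, b, beta = fit_monotone_oc_elm(X)
thr = np.quantile(score(X, W, b, beta), 0.95)  # flag the top 5% as outliers

# Monotonicity check: raising any feature never lowers the raw output.
f = lambda Z: (1.0 / (1.0 + np.exp(-(Z @ W + b)))) @ beta
lo = rng.uniform(0, 1, size=(50, 3))
hi = lo + rng.uniform(0, 1, size=(50, 3))      # hi >= lo elementwise
print(bool(np.all(f(hi) >= f(lo) - 1e-12)))    # True by construction
```

Because monotonicity comes from the weight structure rather than from extra constraints in the optimization, the fit remains a cheap unconstrained-style solve, which mirrors the complexity argument made in the abstract.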

    Abstract (Chinese) I
    Abstract II
    Acknowledgements III
    Contents V
    List of Tables VII
    List of Figures VII
    Chapter 1 Introduction 1
      1.1 Background and Motivation 1
      1.2 Research Objectives 3
      1.3 Research Process 3
    Chapter 2 Literature Review 5
      2.1 Extreme Learning Machine 5
        2.1.1 Introduction of ELM 5
        2.1.2 Mathematics of ELM 6
        2.1.3 Applications of ELM 8
      2.2 One-class Classification 9
      2.3 Monotonic Classification 11
      2.4 Definition of Monotonic Classification 12
    Chapter 3 Research Methodology 14
      3.1 Problem Definition 14
      3.2 Model Construction 14
        3.2.1 Monotonicity Constraints 15
        3.2.2 Decision Function of One-class ELM 16
      3.3 Model of Monotonic One-class ELM 18
    Chapter 4 Experimental Results and Analysis 20
      4.1 Experiment Design 20
      4.2 Experimental Datasets 21
      4.3 Predictive Evaluation Indicators 23
      4.4 Model Performance 25
    Chapter 5 Conclusion and Future Work 39
      5.1 Conclusion and Contributions 39
      5.2 Recommendations for Future Work 40
    References 42

    Bartlett, P. L. (1998). The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network. IEEE Transactions on Information Theory, 44(2), 525-536.
    Ben‐David, A., Sterling, L., & Pao, Y. H. (1989). Learning and classification of monotonic ordinal concepts. Computational Intelligence, 5(1), 45-49.
    Bishop, C. M. (1994). Novelty detection and neural network validation. IEE Proceedings - Vision, Image and Signal Processing, 141(4), 217-222.
    Bonchi, F., Giannotti, F., Mazzanti, A., & Pedreschi, D. (2003). ExAMiner: Optimized level-wise frequent pattern mining with monotone constraints. Paper presented at the Third IEEE International Conference on Data Mining.
    Breiman, L., Friedman, J., Olshen, R., & Stone, C. (1984). Classification and Regression Trees.
    Cano, J.-R., Gutiérrez, P. A., Krawczyk, B., Woźniak, M., & García, S. (2019). Monotonic classification: An overview on algorithms, performance measures and data sets. Neurocomputing, 341, 168-182.
    Cao-Van, K. (2003). Supervised ranking: from semantics to algorithms. Doctoral dissertation, Ghent University.
    Chen, C.-C., & Li, S.-T. (2014). Credit rating with a monotonicity-constrained support vector machine model. Expert Systems with Applications, 41(16), 7235-7247.
    Daniels, H., & Velikova, M. (2010). Monotone and partially monotone neural networks. IEEE Transactions on Neural Networks, 21(6), 906-917.
    Daniels, H. A., & Velikova, M. V. (2006). Derivation of monotone decision models from noisy data. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 36(5), 705-710.
    Forsyth, P., & Labahn, G. (2019). ε-monotone Fourier methods for optimal stochastic control in finance. Journal of Computational Finance, 22(4).
    García‐Zattera, M. J., Mutsvari, T., Jara, A., Declerck, D., & Lesaffre, E. (2010). Correcting for misclassification for a monotone disease process with an application in dental research. Statistics in Medicine, 29(30), 3103-3117.
    Gelfand, A. E., & Mallick, B. K. (1995). Bayesian analysis of proportional hazards models built from monotone functions. Biometrics, 843-852.
    Gutiérrez, P. A., & García, S. (2016). Current prospects on ordinal and monotonic classification. Progress in Artificial Intelligence, 5(3), 171-179.
    Hu, Q., Che, X., Zhang, L., Zhang, D., Guo, M., & Yu, D. (2012). Rank entropy-based decision trees for monotonic classification. IEEE Transactions on Knowledge and Data Engineering, 24(11), 2052-2064.
    Huang, G.-B., & Chen, L. (2007). Convex incremental extreme learning machine. Neurocomputing, 70(16-18), 3056-3062.
    Huang, G.-B., Chen, L., & Siew, C. K. (2006). Universal approximation using incremental constructive feedforward networks with random hidden nodes. IEEE Trans. Neural Networks, 17(4), 879-892.
    Huang, G.-B., Wang, D. H., & Lan, Y. (2011). Extreme learning machines: a survey. International Journal of Machine Learning and Cybernetics, 2(2), 107-122.
    Huang, G.-B., Zhou, H., Ding, X., & Zhang, R. (2011). Extreme learning machine for regression and multiclass classification. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 42(2), 513-529.
    Huang, G.-B., Zhu, Q.-Y., & Siew, C.-K. (2004). Extreme learning machine: a new learning scheme of feedforward neural networks. Proceedings of the IEEE International Joint Conference on Neural Networks, 2, 985-990.
    Huang, G.-B., Zhu, Q.-Y., & Siew, C.-K. (2006). Extreme learning machine: theory and applications. Neurocomputing, 70(1-3), 489-501.
    Khan, S. S., & Madden, M. G. (2009). A survey of recent trends in one class classification. Paper presented at the Irish conference on artificial intelligence and cognitive science.
    Khan, S. S., & Madden, M. G. (2014). One-class classification: taxonomy of study and review of techniques. The Knowledge Engineering Review, 29(3), 345-374.
    Lang, B. (2005). Monotonic multi-layer perceptron networks as universal approximators. Paper presented at the International conference on artificial neural networks.
    Leng, Q., Qi, H., Miao, J., Zhu, W., & Su, G. (2015). One-class classification with extreme learning machine. Mathematical Problems in Engineering, 2015.
    Li, K.-L., Huang, H.-K., Tian, S.-F., & Xu, W. (2003). Improving one-class SVM for anomaly detection. Paper presented at the Proceedings of the 2003 International Conference on Machine Learning and Cybernetics (IEEE Cat. No. 03EX693).
    Liang, N.-Y., Huang, G.-B., Saratchandran, P., & Sundararajan, N. (2006). A fast and accurate online sequential learning algorithm for feedforward networks. IEEE Transactions on Neural Networks, 17(6), 1411-1423.
    Parra, L., Deco, G., & Miesbach, S. (1996). Statistical independence and novelty detection with information preserving nonlinear maps. Neural Computation, 8(2), 260-269.
    Perdisci, R., Gu, G., & Lee, W. (2006). Using an Ensemble of One-Class SVM Classifiers to Harden Payload-based Anomaly Detection Systems. Paper presented at the ICDM.
    Shin, H. J., Eom, D.-H., & Kim, S.-S. (2005). One-class support vector machines—an application in machine fault detection and classification. Computers & Industrial Engineering, 48(2), 395-408.
    Sill, J. (1998). Monotonic networks. Paper presented at the Advances in neural information processing systems.
    Tax, D. M., & Duin, R. P. (1999). Support vector domain description. Pattern Recognition Letters, 20(11-13), 1191-1199.
    Tax, D. M., & Duin, R. P. (2001). Uniform object generation for optimizing one-class classifiers. Journal of Machine Learning Research, 2(Dec), 155-173.
    Tax, D. M. J. (2002). One-class classification: Concept learning in the absence of counter-examples.
    Tissera, M. D., & McDonnell, M. D. (2016). Deep extreme learning machines: supervised autoencoding architecture for classification. Neurocomputing, 174, 42-49.
    Wang, Q., Lopes, L. S., & Tax, D. M. (2004). Visual object recognition through one-class learning. Paper presented at the International Conference Image Analysis and Recognition.
    Xia, F., Zhang, W., Li, F., & Yang, Y. (2008). Ranking with decision tree. Knowledge and Information Systems, 17(3), 381-395.
    Zhang, H., & Zhang, Z. (1999). Feedforward networks with monotone constraints. Paper presented at the IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No. 99CH36339).
    Zhu, H., Tsang, E. C., Wang, X.-Z., & Ashfaq, R. A. R. (2017). Monotonic classification extreme learning machine. Neurocomputing, 225, 205-213.

    On-campus access: available from 2025-06-16
    Off-campus access: not available
    The electronic thesis has not yet been authorized for public release; for the print copy, please consult the library catalog.