
Student: 莊惠棋 (Chuang, Hui-Chi)
Title: Monotonic Prior Knowledge in Support Vector Learning: Modeling Regression and Efficient, Scalable Classification (具單調性先驗知識之支持向量學習法:迴歸建模與高效可擴展分類研究)
Advisor: 李昇暾 (Li, Sheng-Tun)
Degree: Doctoral
Department: Institute of Information Management, College of Management
Year of publication: 2025 (academic year 113)
Language: English
Pages: 92
Keywords: Prior knowledge, Monotonicity constraints, Support vector machine, Support vector regression, Machine learning
Abstract (translated from the Chinese original):
    In contemporary machine learning and data mining applications, effectively incorporating domain prior knowledge to improve model interpretability and practical applicability has become an important topic in artificial intelligence research. Monotonicity is a common and representative prior structure, describing a consistent directional relationship between input and output variables, and it arises widely in application domains such as credit scoring, medical diagnosis, and risk prediction. Embedding such structural knowledge into model training helps improve predictive accuracy, keeps results consistent with expert logic, and strengthens the trustworthiness of models in practical decision-making.
    This research develops a theoretically rigorous and scalable support vector learning framework that systematically integrates monotonic prior knowledge. First, a Regularized Monotonicity-Constrained Support Vector Regression model (RMC-SVR) is proposed: it embeds monotonicity conditions, expressed as partial-order inequalities, into SVR and combines them with Tikhonov regularization, ensuring convexity and solution uniqueness while preserving monotonic consistency. To meet the demands of high-dimensional and large-scale data processing, a Parallelized Box-Constrained Conjugate Gradient SVM model (PBCCG-RMC-SVM) is further proposed; it adopts a parallelizable optimization strategy in place of traditional quadratic programming, substantially reducing training time while maintaining classification performance and monotonicity compliance.
    The experiments cover multiple synthetic and real-world datasets. The results show that the proposed models outperform conventional support vector methods in predictive performance, degree of monotonicity preservation, and computational efficiency, demonstrating strong generalization ability and application potential. Overall, the methods proposed in this dissertation not only advance knowledge-guided learning in nonlinear and high-dimensional settings, but also provide a solid foundation for constructing monotonic learning models that are both theoretically sound and practically feasible, offering substantial value for intelligent decision-making and automated analytics systems.
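As a rough illustration of the constraint-plus-regularization idea behind RMC-SVR, the sketch below fits a linear model with Tikhonov (ridge) regularization and projects the coefficients of monotone-increasing features onto the nonnegative orthant; for a linear model, a nonnegative coefficient is equivalent to the fitted function being nondecreasing in that feature. This is a simplified stand-in, using squared loss and coefficient-sign constraints rather than the dissertation's ε-insensitive loss and partial-order inequality constraints, and `fit_monotone_ridge` is a hypothetical helper, not code from the dissertation.

```python
import numpy as np

def fit_monotone_ridge(X, y, increasing, lam=0.01, lr=0.01, steps=2000):
    """Squared-loss fit with Tikhonov (ridge) regularization; after each
    gradient step, coefficients of features listed in `increasing` are
    projected onto [0, inf), which for a linear model guarantees the
    fitted function is nondecreasing in those features."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(steps):
        r = X @ w + b - y                    # residuals
        w -= lr * (X.T @ r / n + lam * w)    # ridge gradient step
        b -= lr * r.mean()
        w[increasing] = np.maximum(w[increasing], 0.0)  # projection
    return w, b

# Toy data: the target genuinely increases in feature 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=200)
w, b = fit_monotone_ridge(X, y, increasing=[0])
```

The projection step is what distinguishes this from plain ridge regression: monotonicity is enforced throughout training rather than checked afterwards.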

    In modern machine learning, embedding domain-specific prior knowledge is essential for improving both interpretability and practical relevance. A particularly valuable form of such knowledge is monotonicity, where input-output relationships follow consistent directional trends. This property is commonly observed in real-world applications such as credit scoring and medical diagnostics.
    This dissertation develops the systematic integration of monotonic prior knowledge into support vector learning frameworks, bridging the gap between statistical learning and expert reasoning. Two novel models are introduced. First, the Regularized Monotonicity-Constrained Support Vector Regression (RMC-SVR) incorporates monotonicity constraints as partial-order inequalities within the SVR structure and applies Tikhonov regularization to ensure convexity and solution uniqueness. Second, to address scalability to large datasets, the Parallelized Box-Constrained Conjugate Gradient RMC-SVM (PBCCG-RMC-SVM) is proposed. This algorithm adopts a parallelizable optimization strategy that enhances computational efficiency without sacrificing predictive performance or monotonicity adherence.
    Extensive experiments on synthetic and real-world datasets validate the proposed models, demonstrating improved prediction accuracy, stronger monotonicity adherence, and greater computational efficiency compared to conventional approaches. Overall, this study advances knowledge-integrated learning by offering a theoretically sound and practically scalable approach to high-dimensional, nonlinear, and data-intensive modeling.
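The box-constrained optimization at the heart of PBCCG-RMC-SVM can be illustrated in miniature: for a quadratic objective subject only to box constraints, projection onto the feasible set is an elementwise clip, which keeps each iteration cheap. The sketch below uses plain projected gradient descent rather than the parallelized conjugate-gradient scheme, and it omits the equality constraint that the standard SVM dual carries (which can be removed when the bias term is absorbed into the objective); it is an assumption-laden toy, not the dissertation's algorithm.

```python
import numpy as np

def box_qp_projected_gradient(Q, c, lo, hi, steps=500):
    """Minimize 0.5 * a'Qa + c'a subject to lo <= a <= hi.
    Projection onto a box is an elementwise clip, so each iteration
    costs one matrix-vector product plus O(n) work."""
    a = np.clip(np.zeros_like(c), lo, hi)
    step = 1.0 / np.linalg.eigvalsh(Q).max()  # 1/L, L = Lipschitz const.
    for _ in range(steps):
        a = np.clip(a - step * (Q @ a + c), lo, hi)
    return a

# Tiny SVM-dual-like problem: min 0.5 a'Qa - 1'a,  0 <= a <= C.
Q = np.array([[2.0, 0.5], [0.5, 1.0]])
c = -np.ones(2)
a = box_qp_projected_gradient(Q, c, np.zeros(2), np.full(2, 10.0))
```

Because the projection decomposes coordinate-wise, this kind of iteration parallelizes naturally, which is the structural property the parallelized conjugate-gradient approach exploits at scale.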

    Table of Contents
    Front matter: Abstract (Chinese), Abstract (English), Acknowledgements, Contents, List of Tables, List of Figures, List of Notations
    Chapter 1 Introduction
        1.1 Background and Motivation
        1.2 Research Objectives
        1.3 Research Organization
    Chapter 2 Literature Review
        2.1 Support Vector Machine and Support Vector Regression
        2.2 Knowledge-Guided Machine Learning with Monotonic SVMs
            2.2.1 Data Preprocessing-Based Approach
            2.2.2 Model-Based Approach
            2.2.3 Support Vector-Based Approach
        2.3 Parallel Strategy of MC-SVM
        2.4 Chapter Summary
    Chapter 3 Monotonic Regression Modelling: The RMC-SVR Framework
        3.1 Definition of Monotonicity
        3.2 The Construction of Monotonicity Constraints
        3.3 Derivation and Solution of the Dual Problem in RMC-SVR
        3.4 Chapter Summary
    Chapter 4 Scalable Monotonic Classification
        4.1 Derivation of the RMC-SVM Model
        4.2 Parallelized Box-Constrained CG Algorithm
        4.3 Chapter Summary
    Chapter 5 Experimental Results and Analysis
        5.1 Experiment Design
        5.2 Performance Evaluation
        5.3 Experimental Analysis for RMC-SVR
            5.3.1 Synthetic Datasets
            5.3.2 Real-World Datasets
            5.3.3 Summary of Experimental Analysis
        5.4 Experimental Analysis for PBCCG-RMC-SVM
            5.4.1 WDBC
            5.4.2 Spam email
            5.4.3 PD 600
            5.4.4 Summary of Experimental Analysis
        5.5 Chapter Summary
    Chapter 6 Conclusion and Future Work
        6.1 Research Contributions
        6.2 Managerial Implications
        6.3 Limitations and Future Work
    References
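Monotonicity adherence, one of the evaluation criteria cited in the abstracts (cf. Section 3.1, Definition of Monotonicity), can be measured with a simple pairwise check: among input pairs that are ordered on the monotone features, count how often the predictions respect that order. The helper below is a hypothetical illustration of such a metric, not the dissertation's exact measure.

```python
def monotonicity_adherence(X, preds, increasing):
    """Fraction of comparable pairs (x_i <= x_j componentwise on the
    features listed in `increasing`) whose predictions respect the
    order; 1.0 means no monotonicity violations were found."""
    ok = total = 0
    for i in range(len(X)):
        for j in range(len(X)):
            if i != j and all(X[i][k] <= X[j][k] for k in increasing):
                total += 1
                if preds[i] <= preds[j]:
                    ok += 1
    return ok / total if total else 1.0

X = [[1.0], [2.0], [3.0]]
good = monotonicity_adherence(X, [0.1, 0.5, 0.9], increasing=[0])  # no violations
bad = monotonicity_adherence(X, [0.5, 0.1, 0.9], increasing=[0])   # one violated pair
```

The quadratic pair loop is fine for small evaluation sets; on large datasets one would typically sample pairs instead of enumerating them all.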


    Full-text availability: embargoed until 2030-08-27 (both on-campus and off-campus). The electronic thesis has not yet been authorized for public release; consult the library catalog for the print copy.