
Author: Chien, Bo-Shiang (簡柏翔)
Title: Screening Procedure for Gaussian Process Models in Computer Experiments (在電腦實驗下針對高斯過程模型之變數篩選程序)
Advisor: Chen, Ray-Bing (陳瑞彬)
Degree: Master
Department: College of Management - Department of Statistics
Year of Publication: 2021
Academic Year of Graduation: 109 (2020–2021)
Language: English
Pages: 46
Keywords (Chinese): 變數篩選, 高斯過程模型, 電腦實驗
Keywords (English): Screening, Gaussian Process Models, Computer Experiments

    In recent decades, computer experiments have received widespread attention and have played an increasingly important role in solving a wide range of scientific and engineering problems. Among the models used in computer experiments, the Gaussian process is the most common. In this thesis, we are interested in variable screening problems for Gaussian process models in computer experiments. To reduce the computational cost, we adopt a gradient-based approach instead of a Bayesian approach to construct our variable screening algorithms. In addition, to account for the mean function and the correlation function of the Gaussian process model simultaneously, we propose several gradient selection methods to standardize the two types of gradient values. On this basis, we propose two variable screening algorithms: the Forward Screening Procedure (FSP) and the Batchwise Backward Screening Procedure (BBSP). Lastly, two different simulation studies are used to illustrate the performance of the proposed approaches.
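    The gradient-based screening idea described above can be sketched minimally as follows. This is an illustrative sketch, not the thesis's actual FSP/BBSP algorithms: the squared-exponential kernel with fixed scale parameters, the toy test function, and the 20%-of-maximum selection threshold are all assumptions introduced here, and the sketch scores variables only by the gradient of the predictive mean rather than standardizing mean-function and correlation-function gradients as the thesis proposes.

```python
import numpy as np

rng = np.random.default_rng(0)

def se_kernel(A, B, theta):
    """Squared-exponential correlation with per-dimension scales theta (assumed form)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2 * theta).sum(-1)
    return np.exp(-d2)

def fit_gp(X, y, theta, nugget=1e-6):
    """Weight vector alpha of the GP predictive mean m(x) = k(x, X) @ alpha."""
    K = se_kernel(X, X, theta) + nugget * np.eye(len(X))
    return np.linalg.solve(K, y)

def mean_gradient(x, X, alpha, theta):
    """Analytic gradient of the predictive mean at x for the SE kernel."""
    k = se_kernel(x[None, :], X, theta)[0]                 # (n,)
    dk = -2.0 * theta * (x[None, :] - X) * k[:, None]      # (n, d): d k(x, x_i) / dx_j
    return dk.T @ alpha                                    # (d,)

# Toy computer experiment: only inputs x0 and x1 are active; x2..x4 are inert.
d, n = 5, 150
X = rng.uniform(size=(n, d))
y = 3.0 * X[:, 0] + np.sin(2.0 * X[:, 1])

theta = np.full(d, 1.0)          # fixed for the sketch; in practice estimated
alpha = fit_gp(X, y, theta)

# Score each input by its average absolute gradient over random test sites.
sites = rng.uniform(size=(50, d))
scores = np.mean([np.abs(mean_gradient(s, X, alpha, theta)) for s in sites], axis=0)

# Screening step: keep variables whose score clears a (hypothetical) threshold.
active = [j for j in range(d) if scores[j] > 0.2 * scores.max()]
print("scores:", np.round(scores, 3))
print("selected variables:", active)
```

    In this sketch the active inputs x0 and x1 receive large gradient scores while the inert inputs score near zero, which is the separation the forward and batchwise backward procedures exploit when adding or dropping variables.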

    Abstract (Chinese) i
    Abstract (English) ii
    Acknowledgements iii
    Table of Contents iv
    List of Tables v
    List of Figures vi
    Chapter 1 Introduction 1
    Chapter 2 Gaussian Process Models 4
      2.1 Gaussian Process Model 4
      2.2 Noise Gaussian Process Model 7
    Chapter 3 Forward Screening Procedure 8
      3.1 Gradient for the Noise Gaussian Process Model 9
      3.2 Gradient Selection Methods 12
      3.3 The Algorithm of Forward Screening Procedure 16
      3.4 Numerical Results of FSP 20
        3.4.1 Simulation Data from Gaussian Process Model 21
        3.4.2 Artificial Landscapes Data 25
    Chapter 4 Batchwise Backward Screening Procedure 34
      4.1 The Algorithm of Batchwise Backward Screening Procedure 34
      4.2 Numerical Results of BBSP 38
        4.2.1 Simulation Data from Gaussian Process Model 38
        4.2.2 Artificial Landscapes Data 40
    Chapter 5 Conclusion 42
    References 44
    Appendix A Matrix Calculus Formulas for Deriving the Gradients 45
    Appendix B Simulation Results of Zhang's Approach 46

    Andrianakis, I., & Challenor, P. G. (2012). The effect of the nugget on Gaussian process emulators of computer models. Computational Statistics & Data Analysis, 56(12), 4215–4228.

    Dellaportas, P., Forster, J. J., & Ntzoufras, I. (2002). On Bayesian model and variable selection using MCMC. Statistics and Computing, 12(1), 27–36.

    Eberhart, R., & Kennedy, J. (1995). Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks (Vol. 4, pp. 1942–1948).

    Fan, J., & Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. Journal of the American Statistical Association, 96(456), 1348–1360.

    Forrester, A., Sobester, A., & Keane, A. (2008). Engineering design via surrogate modelling: a practical guide. John Wiley & Sons.

    George, E. I., & McCulloch, R. E. (1993). Variable selection via Gibbs sampling. Journal of the American Statistical Association, 88(423), 881–889.

    Guyon, I., & Elisseeff, A. (2003). An introduction to variable and feature selection. Journal of Machine Learning Research, 3(Mar), 1157–1182.

    Hocking, R. R. (1976). The analysis and selection of variables in linear regression. Biometrics, 32(1), 1–49.

    Huang, H., Lin, D. K., Liu, M.-Q., & Zhang, Q. (2020). Variable selection for kriging in computer experiments. Journal of Quality Technology, 52(1), 40–53.

    Lee, K.-J., Chen, R.-B., & Wu, Y. N. (2016). Bayesian variable selection for finite mixture model of linear regressions. Computational Statistics & Data Analysis, 95, 1–16.

    Linkletter, C., Bingham, D., Hengartner, N., Higdon, D., & Ye, K. Q. (2006). Variable selection for Gaussian process models in computer experiments. Technometrics, 48(4), 478–490.

    Sacks, J., Schiller, S. B., & Welch, W. J. (1989). Designs for computer experiments. Technometrics, 31(1), 41–47.

    Sacks, J., Welch, W. J., Mitchell, T. J., & Wynn, H. P. (1989). Design and analysis of computer experiments. Statistical Science, 4(4), 409–423.

    Singh, S., Kubica, J., Larsen, S., & Sorokina, D. (2009). Parallel large scale feature selection for logistic regression. In Proceedings of the 2009 SIAM International Conference on Data Mining (pp. 1172–1183).

    Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society: Series B (Methodological), 58(1), 267–288.

    Zhang, F. (2018). Bayesian indicator variable selection in gaussian process for computer experiments (Unpublished master's thesis). National Cheng Kung University, Tainan.

    Zou, H. (2006). The adaptive lasso and its oracle properties. Journal of the American Statistical Association, 101(476), 1418–1429.

    Full text available from 2026-07-25 (both on and off campus). The electronic thesis has not yet been authorized for public release; for the print copy, consult the library catalog.