
Author: Zhang, Fan (張凡)
Thesis Title: Bayesian Indicator Variable Selection in Gaussian Process for Computer Experiments (在電腦實驗中針對高斯過程之貝式指標型變數選擇法)
Advisor: Chen, Ray-Bing (陳瑞彬)
Degree: Master
Department: College of Management - Department of Statistics
Year of Publication: 2018
Graduation Academic Year: 106
Language: English
Pages: 30
Keywords: Variable Selection, Indicator, Bayesian Approach, Gaussian Process, Computer Experiments
Over the past three decades, computer experiments have received broad attention and have played an increasingly important role in solving various scientific and engineering problems. This thesis proposes variable selection algorithms based on the Gaussian process model, considering not only the mean function but also the correlation structure among the variables. To this end, indicator variables are introduced to denote whether the corresponding input variables are active. On this basis, two variable selection algorithms with hierarchical priors for Gaussian process models are proposed, and they are applied and examined on both simulated and real data.

    In the past three decades, the analysis of computer experiments has received a lot of attention and has played an increasingly important role in solving different scientific and engineering problems. In this thesis, we are interested in variable selection problems for Gaussian process models. Here we not only focus on the mean function, but also take the covariance structure into account. To accomplish this goal, indicator variables are added to the model to denote whether the input variables are active. Two Bayesian variable selection algorithms are proposed. In addition to the simulation studies, several real examples are used to illustrate the proposed methods.
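    The thesis's specific hierarchical priors and sampling algorithms are not reproduced in this record. As a rough illustration of the indicator idea described in the abstract, the sketch below shows how 0/1 indicators can switch input variables in and out of a Gaussian correlation function and a simple kriging predictor. All function names and the constant-mean predictor are assumptions for illustration, not the thesis's actual code.

```python
import numpy as np

def gp_correlation(X1, X2, theta, gamma):
    """Gaussian correlation with 0/1 indicator variables gamma:

        R(x, x') = exp(-sum_j gamma_j * theta_j * (x_j - x'_j)^2)

    Inactive inputs (gamma_j = 0) drop out of the correlation entirely.
    """
    theta = np.asarray(theta, dtype=float)
    gamma = np.asarray(gamma, dtype=float)
    # Per-dimension squared distances, shape (n1, n2, d).
    diff2 = (X1[:, None, :] - X2[None, :, :]) ** 2
    # Weighted sum over dimensions, then exponentiate.
    return np.exp(-np.tensordot(diff2, gamma * theta, axes=([2], [0])))

def gp_predict(X, y, Xnew, theta, gamma, mu=0.0, nugget=1e-8):
    """Simple kriging predictor with a constant mean mu.

    A small nugget keeps the correlation matrix numerically invertible.
    """
    R = gp_correlation(X, X, theta, gamma) + nugget * np.eye(len(X))
    r = gp_correlation(Xnew, X, theta, gamma)
    return mu + r @ np.linalg.solve(R, y - mu)

# With gamma = [1, 0], the second input is inactive: points that differ
# only in x2 are perfectly correlated, and predictions ignore x2.
X = np.array([[0.0, 0.0], [1.0, 3.0]])
y = np.array([1.0, 2.0])
theta, gamma = [1.0, 1.0], [1, 0]
same_x1 = gp_correlation(np.array([[0.0, 0.0]]),
                         np.array([[0.0, 5.0]]), theta, gamma)
pred = gp_predict(X, y, X, theta, gamma)
```

    In a Bayesian treatment along the lines the abstract describes, the `gamma` vector would itself receive a prior and be sampled, so the posterior frequency of `gamma_j = 1` measures how likely variable `j` is to be active.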

    摘要 (Chinese Abstract) i
    Abstract ii
    Acknowledgements iii
    Table of Contents iv
    List of Tables v
    List of Figures vi
    Chapter 1. Introduction 1
    Chapter 2. Gaussian Process Model 4
      2.1 Building the Gaussian Process Model 4
      2.2 Gaussian Process Prediction 5
    Chapter 3. Variable Selection in Gaussian Process with Hierarchical Prior 7
      3.1 Single Indicator for Variable Selection in Gaussian Process 7
      3.2 Two Indicators for Variable Selection in Gaussian Process 10
    Chapter 4. Simulations and Comparison 13
      4.1 Simulation Results of Algorithm 1 14
      4.2 Simulation Results of Algorithm 2 19
    Chapter 5. Real Examples 20
    Chapter 6. Conclusion 28
    References 29


    Full text available on campus: 2020-12-31
    Full text available off campus: 2020-12-31