
Graduate Student: Lai, Wei-Ting (賴暐婷)
Thesis Title: Variational Bayesian Approaches for Structure Selection
Advisor: Chen, Ray-Bing (陳瑞彬)
Degree: Doctoral
Department: Department of Statistics, College of Management
Year of Publication: 2022
Graduation Academic Year: 110
Language: English
Number of Pages: 61
Keywords: Variational Bayesian inference, MCMC algorithm, dynamic network

    In this thesis, we develop variational Bayesian (VB) methods for structure selection problems with respect to three different models: the mixture linear regression model, the network autoregressive / vector autoregressive (NAR/VAR) model, and the NAR-DeGARCH model. The VB method attempts to directly obtain the best approximation of the posterior density, which can be used to automatically identify the dynamic structure of a given model. Compared with Markov chain Monte Carlo (MCMC)-based sampling selection methods, the VB method achieves higher computational efficiency by sacrificing a small amount of estimation accuracy. Simulations with different model setups are used to demonstrate the performance of the proposed VB methods, and two real-data examples further illustrate the efficiency of the models.
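    The selection mechanism the abstract describes — a variational approximation to the posterior whose inclusion probabilities automatically identify the active structure — can be sketched with a generic coordinate-ascent update for spike-and-slab selection in a plain linear regression. This is a minimal illustration in the spirit of Carbonetto and Stephens (2012), not the thesis's actual algorithm; the function name and the fixed hyperparameters `pi`, `tau2`, and `sigma2` are hypothetical.

    ```python
    import numpy as np

    def cavi_spike_slab(X, y, pi=0.1, tau2=1.0, sigma2=1.0, n_iter=50):
        """Coordinate-ascent VI for y = X beta + eps with a spike-and-slab
        prior: beta_j = 0 w.p. 1 - pi, else beta_j ~ N(0, tau2).
        q(beta_j) mixes a point mass at 0 (weight 1 - alpha_j) with
        N(mu_j, s2_j) (weight alpha_j); alpha_j is the inclusion probability."""
        n, p = X.shape
        xtx = np.sum(X**2, axis=0)          # x_j' x_j for each predictor
        alpha = np.full(p, pi)              # variational inclusion probabilities
        mu = np.zeros(p)                    # conditional posterior means
        Xb = X @ (alpha * mu)               # current fitted values E[X beta]
        logit_pi = np.log(pi / (1 - pi))
        for _ in range(n_iter):
            for j in range(p):
                Xb -= X[:, j] * (alpha[j] * mu[j])     # drop j's contribution
                s2 = 1.0 / (xtx[j] / sigma2 + 1.0 / tau2)
                mu[j] = s2 * X[:, j] @ (y - Xb) / sigma2
                logit = logit_pi + 0.5 * np.log(s2 / tau2) + mu[j]**2 / (2 * s2)
                alpha[j] = 1.0 / (1.0 + np.exp(-logit))
                Xb += X[:, j] * (alpha[j] * mu[j])     # restore with new values
        return alpha, mu

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))
    beta = np.zeros(10)
    beta[:3] = [2.0, -1.5, 1.0]             # only the first 3 predictors active
    y = X @ beta + 0.5 * rng.standard_normal(200)
    alpha, mu = cavi_spike_slab(X, y)
    print(np.round(alpha, 2))               # high alpha_j flags selected predictors
    ```

    Each sweep costs O(np), which is the source of the efficiency gain over MCMC sampling that the abstract mentions: the same inclusion probabilities that MCMC would estimate by averaging draws come out of a deterministic fixed-point iteration.
    
    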

    Table of Contents
    Abstract (in Chinese) i
    Abstract ii
    Acknowledgements iii
    Table of Contents iv
    List of Tables vi
    List of Figures vii
    Chapter 1. Introduction 1
    Chapter 2. Variational Bayesian Inference for Structure Selection in Network Autoregression Models / Vector Autoregression Models 5
    2.1. Network Autoregression Model / Vector Autoregression Model 5
    2.2. Structure Assumption 6
    2.3. Variational Approximation Algorithm in NAR/VAR Model 7
    2.3.1. The Bayesian Structure Selection Algorithm 7
    2.3.2. The Variational Inference Procedure 8
    2.4. Simulations 13
    2.4.1. Medium-Sized Example 13
    2.4.2. Large-Sized Example 14
    2.4.3. Simulation Results 15
    2.4.4. Simulations with a Large Number of Lags 22
    2.5. Empirical Study 25
    Chapter 3. Variational Bayesian Inference for NAR-DeGARCH Model 30
    3.1. NAR-DeGARCH Model 30
    3.2. Variational Approximation Algorithm in NAR-DeGARCH Model 30
    3.2.1. The Variational Inference Procedure 31
    3.3. Simulations 33
    3.3.1. Simulation Results 33
    3.4. Empirical Study 35
    Chapter 4. Variational Bayesian Inference for Mixture Linear Regression Models 41
    4.1. Mixture Linear Regression Model 41
    4.2. Variational Approximation Algorithm in the Mixture Linear Regression Model 42
    4.2.1. The Variational Inference Procedure 42
    4.3. Simulations 47
    4.3.1. Simulation Results 47
    4.3.2. Simulations with Larger Sample Size and Different p 49
    Chapter 5. Conclusion 51
    Appendix A. Appendix 53
    A.1. Nonzero Coefficient Matrices in the Simulations for All Cases 53
    A.2. MAPEs and NMSEs for In-Sample Training and Out-of-Sample Forecasting in Real Data 58
    References 60


    On-campus access: available from 2024-07-31
    Off-campus access: available from 2024-07-31