| Author: | 鄭佳鈴 Cheng, Chia-Ling |
|---|---|
| Thesis title: | 應用類別樹高斯過程模型於多目標最佳化 (Multi-Objective Optimization Based on Category Tree Gaussian Process) |
| Advisor: | 陳瑞彬 Chen, Ray-Bing |
| Degree: | Master |
| Department: | College of Management - Department of Statistics |
| Year of publication: | 2023 |
| Graduation academic year: | 111 |
| Language: | Chinese |
| Pages: | 44 |
| Keywords (Chinese): | 電腦實驗、高斯過程、定量及定性因子 |
| Keywords (English): | Computer experiments, Gaussian process, Quantitative and qualitative factors |
Computer experiments are widely used in many engineering fields. Beyond a single response, the underlying computer code can generate multiple responses, and a common goal is then to optimize all of these objectives simultaneously, so multi-objective optimization (MOO) has become increasingly important. On the other hand, in practice it is common to face problems involving both quantitative and qualitative factors, i.e., a mixed-input structure, and several modeling frameworks have been proposed for this setting. When the number of level combinations (categories) generated by the qualitative factors is large, however, the commonly used modeling approaches can suffer from over-parameterization. Here we integrate these two scenarios; that is, we study the multi-objective optimization problem with mixed inputs. Since the category tree Gaussian process (ctGP) model has performed well as a surrogate for single-response problems with mixed inputs, we modify it for multiple responses and pair it with corresponding multi-objective acquisition criteria to examine its feasibility for MOO problems. Several numerical experiments and a cooling system design problem, based on heat-sink data for electronic components simulated with the ANSYS software, are used to demonstrate the performance of the proposed method.
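To make the overall workflow concrete, the following is a minimal sketch, not the thesis's ctGP model, of Gaussian-process surrogate modeling with mixed inputs followed by an acquisition-based choice of the next run. It encodes each qualitative level with a one-hot vector inside a single squared-exponential kernel and uses a plain expected-improvement criterion for one scalar objective; the toy objective `f`, the length scale, and all other settings are assumptions for illustration only (a tree-structured kernel over categories and a multi-objective criterion such as expected hypervolume improvement would replace these pieces in the actual method).

```python
import numpy as np
from math import erf

def encode(x, level, n_levels):
    """Concatenate continuous inputs with a one-hot code of the qualitative level."""
    one_hot = np.zeros(n_levels)
    one_hot[level] = 1.0
    return np.concatenate([np.atleast_1d(x), one_hot])

def sq_exp_kernel(A, B, length=0.5):
    """Squared-exponential kernel between row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * length ** 2))

def gp_posterior(X, y, Xs, noise=1e-6):
    """Posterior mean and variance of a zero-mean GP at candidate points Xs."""
    K = sq_exp_kernel(X, X) + noise * np.eye(len(X))
    Ks = sq_exp_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(sq_exp_kernel(Xs, Xs)) - (v ** 2).sum(axis=0)
    return mu, np.maximum(var, 1e-12)

def expected_improvement(mu, var, best):
    """Expected improvement over the current best value (minimization)."""
    s = np.sqrt(var)
    z = (best - mu) / s
    Phi = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))
    phi = np.exp(-z ** 2 / 2.0) / np.sqrt(2.0 * np.pi)
    return (best - mu) * Phi + s * phi

# Hypothetical cheap stand-in for an expensive simulator: one continuous
# factor in [0, 1] and one qualitative factor with two levels.
def f(x, level):
    return (x - 0.3) ** 2 + (0.5 if level == 1 else 0.0)

rng = np.random.default_rng(0)
xs = rng.uniform(0, 1, 8)
levels = rng.integers(0, 2, 8)
X = np.array([encode(x, l, 2) for x, l in zip(xs, levels)])
y = np.array([f(x, l) for x, l in zip(xs, levels)])

# Score a candidate grid over both levels and pick the next run by maximizing EI.
cand = [(x, l) for x in np.linspace(0, 1, 101) for l in (0, 1)]
Xc = np.array([encode(x, l, 2) for x, l in cand])
mu, var = gp_posterior(X, y, Xc)
ei = expected_improvement(mu, var, y.min())
next_x, next_level = cand[int(np.argmax(ei))]
print(next_x, next_level)
```

The one-hot encoding is exactly the construction that over-parameterizes when the number of categories is large, which is the problem the tree-structured ctGP kernel is designed to avoid.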
Campus access: full text available from 2028-09-14.