
Author: Fu, Sheng-Yang (傅聖揚)
Thesis title (Chinese): 考慮單一隨機限制式下之離散型模擬最佳化演算法
Thesis title (English): Discrete Optimization via Simulation Algorithm Considering Single Stochastic Constraint
Advisor: Tsai, Shing-Chih (蔡青志)
Degree: Master
Department: Department of Industrial and Information Management, College of Management
Year of publication: 2011
Graduation academic year: 99 (ROC calendar)
Language: Chinese
Number of pages: 101
Keywords (Chinese): 模擬最佳化、排序和選擇程序、隨機限制式、基因演算法
Keywords (English): Optimization via simulation, Ranking and selection, Stochastic constraint, Genetic algorithm
Abstract (Chinese):

Optimization via Simulation (OvS) selects, from many different simulated systems, the system with the best performance, and is used to analyze the complex stochastic problems that arise in practice. Within OvS, Ranking and Selection (R&S) procedures identify, at a prescribed confidence level, the system with the best expected performance among a finite and small number of simulated systems. Past research on R&S has mainly pursued optimization of a stochastic objective and has rarely considered stochastic constraints. Andradóttir and Kim (2010) therefore developed a Feasibility Determination Procedure (FDP) that, on a statistical basis, identifies the systems that are feasible or near-feasible with respect to a stochastic constraint. However, when the number of systems or stochastic constraints is large, the number of samples required by the experiment grows and the sampling cost increases substantially.

In view of this, Boesel and Nelson (2003) proposed the Stochastic Genetic Algorithm (SGA) to handle stochastic-objective problems with a very large number of systems. This study extends that setting to problems with both a stochastic objective function and a stochastic constraint, and develops a new Stochastic Constrained Genetic Algorithm (SCGA) to solve them; we prove that its solution is globally convergent when the simulation sampling effort is sufficiently large. Because genetic algorithms converge slowly, this study also proposes another heuristic genetic algorithm that combines a grouping procedure with a constraint-handling technique so that, under a time limit, a near-optimal solution can be found in a systematic way, although its global convergence cannot be guaranteed.
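
As a concrete illustration of the feasibility-determination idea described above, the following is a minimal sketch of how simulation output on a secondary (constraint) performance measure could be used to flag a system as apparently feasible against a threshold. It is not the statistically guaranteed procedure of Andradóttir and Kim (2010); the function names, the plain normal-approximation confidence bound, and all parameter values are illustrative assumptions.

```python
import math
import random
from statistics import mean, stdev

def simulate_constraint_measure(system, n_reps, seed=None):
    """Run n_reps independent replications of `system` and return the observed
    values of the secondary (constraint) performance measure. `system` is a
    hypothetical callable returning one noisy observation per call."""
    rng = random.Random(seed)
    return [system(rng) for _ in range(n_reps)]

def looks_feasible(observations, threshold, z=1.645):
    """Declare a system 'apparently feasible' if an upper confidence bound on
    its mean constraint value lies below the threshold. This is a one-shot
    normal-approximation check, not a fully sequential guarantee."""
    n = len(observations)
    upper_bound = mean(observations) + z * stdev(observations) / math.sqrt(n)
    return upper_bound <= threshold

# Example: a toy system whose constraint measure is N(0.9, 0.2^2),
# checked against a service-level style constraint E[G(x)] <= 1.0.
toy_system = lambda rng: rng.gauss(0.9, 0.2)
obs = simulate_constraint_measure(toy_system, n_reps=50, seed=1)
print(looks_feasible(obs, threshold=1.0))
```

A procedure with a formal statistical guarantee would keep sampling each candidate until the required confidence is reached, which is exactly why the sampling cost becomes prohibitive when many systems or constraints must be screened.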

Abstract (English):

Optimization via simulation (OvS) is the process of optimizing the expected performance of a discrete-event, stochastic system through computer simulation. Ranking and Selection (R&S) is one branch of OvS that focuses on finding the best among a finite number of simulated alternatives with respect to an expected performance measure. In most of the relevant R&S literature, however, only one performance measure is considered, and very little work has been done on handling stochastic constraints. Andradóttir and Kim (2010) present a feasibility determination procedure that determines the feasibility of candidate systems in the presence of a single stochastic constraint (a secondary performance measure) and finds the best feasible system based on a primary performance measure. However, if the number of stochastic constraints or simulated systems is too large, the feasibility determination procedure becomes inefficient (i.e., it incurs a prohibitive sampling cost).

To solve a problem with a stochastic objective function and a huge solution space, Boesel and Nelson (2003) present the Stochastic Genetic Algorithm (SGA) to control statistical error within a heuristic search procedure. We extend their procedure to solve problems with a single stochastic objective and a single stochastic constraint. We propose an algorithm called the Stochastic Constrained Genetic Algorithm (SCGA), which guarantees global convergence as the simulation effort goes to infinity. Because of the slow convergence rate inherent in genetic algorithms, we also provide a heuristic algorithm that combines a grouping procedure with a constraint-handling technique to find a near-optimal solution in a reasonable amount of time, although it cannot guarantee global convergence.
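
To make the flavor of a simulation-based constrained genetic algorithm concrete, here is a compressed, self-contained sketch; it is not the SCGA or the heuristic algorithm developed in the thesis. The penalty-based constraint handling, truncation selection, integer "blend" crossover, and all parameter values are assumptions made only for illustration.

```python
import random
from statistics import mean

def estimate(perf_fn, x, n_reps, rng):
    """Average n_reps noisy simulation observations of perf_fn at x."""
    return mean(perf_fn(x, rng) for _ in range(n_reps))

def constrained_ga(objective, constraint, threshold, domain, rng,
                   pop_size=20, generations=30, n_reps=10,
                   penalty=1e3, mutate_prob=0.2):
    """Toy GA minimizing an estimated objective subject to an estimated
    stochastic constraint E[constraint(x)] <= threshold. Infeasible
    candidates are penalized rather than discarded."""
    lo, hi = domain
    pop = [rng.randint(lo, hi) for _ in range(pop_size)]

    def fitness(x):
        obj = estimate(objective, x, n_reps, rng)
        con = estimate(constraint, x, n_reps, rng)
        return obj + penalty * max(0.0, con - threshold)

    for _ in range(generations):
        parents = sorted(pop, key=fitness)[:pop_size // 2]  # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = (a + b) // 2                      # integer "blend" crossover
            if rng.random() < mutate_prob:            # small random mutation
                child = min(hi, max(lo, child + rng.randint(-2, 2)))
            children.append(child)
        pop = children
    return min(pop, key=fitness)

# Example on a toy discrete problem: minimize E[(x - 7)^2 + noise]
# subject to E[x / 10 + noise] <= 1.0 over x in {0, ..., 20}.
rng = random.Random(0)
best = constrained_ga(
    objective=lambda x, r: (x - 7) ** 2 + r.gauss(0, 1),
    constraint=lambda x, r: x / 10 + r.gauss(0, 0.1),
    threshold=1.0, domain=(0, 20), rng=rng)
print(best)
```

In this toy version every fitness call re-simulates the candidate, whereas the procedures in the thesis explicitly control how the simulation budget is allocated and how statistical error is contained.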

Table of Contents:
Chinese Abstract i
English Abstract ii
Acknowledgements iv
Table of Contents v
List of Figures ix
List of Tables xi
Chapter 1 Introduction 1
1.1 Research Background and Motivation 1
1.2 Research Objectives 4
1.3 Thesis Organization 5
Chapter 2 Literature Review 6
2.1 Optimization via Simulation (OvS) 6
2.1.1 Ranking and Selection (R&S) 8
2.1.1.1 Indifference-Zone Procedures 8
2.1.1.2 Subset Selection Procedures 9
2.1.1.3 Feasibility Determination Procedures 11
2.1.2 Discrete OvS (DOvS) 14
2.2 Grouping Procedure 16
2.3 Constraint-Handling Techniques 18
2.4 Summary 22
Chapter 3 Research Methodology 24
3.1 Research Problem and Assumptions 24
3.2 Globally Convergent Constrained DOvS Algorithm (GCDA) 26
3.2.1 Research Framework 26
3.2.1.1 Screening Phase 28
3.2.1.2 Global Search Phase (SCGA) 29
3.2.1.3 Proof of Global Convergence of SCGA 37
3.2.1.4 Selection Phase 37
3.2.1.5 GCDA Flow Structure 37
3.3 Heuristic Constrained DOvS Algorithm (HCDA) 39
3.3.1 Research Framework 39
3.3.1.1 Constructing the Initial Population 41
3.3.1.2 Evaluating Solution Performance 41
3.3.1.3 Grouping Procedure 41
3.3.1.4 Fitness Assignment 41
3.3.1.5 Parent Selection 42
3.3.1.6 Crossover Operator 42
3.3.1.7 Mutation Operator 43
3.3.1.8 Elite Population 43
3.3.1.9 Forming the Offspring Population 44
3.3.1.10 Stopping Criterion 44
3.3.1.11 Feasibility Determination 44
3.3.1.12 R&S 44
3.3.1.13 HCDA Flow Structure 44
Chapter 4 Experimental Design and Analysis 46
4.1 Experimental Evaluation 46
4.1.1 Multimodal Function Problem 48
4.1.2 Singular Function Problem 49
4.1.3 Parameter Settings 50
4.2 Experimental Results 53
4.2.1 Multimodal Problem: High Feasible-Solution Ratio, Low Variance 53
4.2.2 Multimodal Problem: High Feasible-Solution Ratio, High Variance 56
4.2.3 Multimodal Problem: Moderate Feasible-Solution Ratio, Low Variance 59
4.2.4 Multimodal Problem: Moderate Feasible-Solution Ratio, High Variance 62
4.2.5 Multimodal Problem: Low Feasible-Solution Ratio, Low Variance 65
4.2.6 Multimodal Problem: Low Feasible-Solution Ratio, High Variance 68
4.2.7 Singular Problem: High Feasible-Solution Ratio, Low Variance 71
4.2.8 Singular Problem: High Feasible-Solution Ratio, High Variance 74
4.2.9 Singular Problem: Moderate Feasible-Solution Ratio, Low Variance 77
4.2.10 Singular Problem: Moderate Feasible-Solution Ratio, High Variance 80
4.2.11 Singular Problem: Low Feasible-Solution Ratio, Low Variance 83
4.2.12 Singular Problem: Low Feasible-Solution Ratio, High Variance 86
Chapter 5 Conclusions and Future Research 89
5.1 Summary and Contributions 89
5.2 Future Research Directions 94
Appendix 95
References 97

List of Figures:
Figure 2.1 Optimization via simulation framework 7
Figure 2.2 Grouping procedure algorithm 18
Figure 3.1 Service level function 25
Figure 3.2 Stochastic universal sampling selection 32
Figure 3.3 Golden section search, Step 1 35
Figure 3.4 Golden section search, Step 2 35
Figure 3.5 Golden section search, Step 3 35
Figure 3.6 Golden section search, Step 4 36
Figure 3.7 GCDA flow structure 38
Figure 3.8 HCDA flow structure 45
Figure 4.1 Performance plot, multimodal problem (high feasible ratio, low variance) (budget criterion) 54
Figure 4.2 Performance plot, multimodal problem (high feasible ratio, high variance) (budget criterion) 57
Figure 4.3 Performance plot, multimodal problem (moderate feasible ratio, low variance) (budget criterion) 60
Figure 4.4 Performance plot, multimodal problem (moderate feasible ratio, high variance) (budget criterion) 63
Figure 4.5 Performance plot, multimodal problem (low feasible ratio, low variance) (budget criterion) 66
Figure 4.6 Performance plot, multimodal problem (low feasible ratio, high variance) (budget criterion) 69
Figure 4.7 Performance plot, singular problem (high feasible ratio, low variance) (budget criterion) 72
Figure 4.8 Performance plot, singular problem (high feasible ratio, high variance) (budget criterion) 75
Figure 4.9 Performance plot, singular problem (moderate feasible ratio, low variance) (budget criterion) 78
Figure 4.10 Performance plot, singular problem (moderate feasible ratio, high variance) (budget criterion) 81
Figure 4.11 Performance plot, singular problem (low feasible ratio, low variance) (budget criterion) 84
Figure 4.12 Performance plot, singular problem (low feasible ratio, high variance) (budget criterion) 87

List of Tables:
Table 3.1 SUS results 31
Table 4.1 GCDA parameter settings 51
Table 4.2 HCDA parameter settings 51
Table 4.3 FSCA parameter settings 52
Table 4.4 AK parameter settings 52
Table 4.5 Multimodal problem parameter settings 52
Table 4.6 Singular problem parameter settings 52
Table 4.7 Multimodal problem: high feasible ratio, low variance (budget criterion) 55
Table 4.8 Multimodal problem: high feasible ratio, low variance (improvement criterion) 56
Table 4.9 Multimodal problem: high feasible ratio, high variance (improvement criterion) 57
Table 4.10 Multimodal problem: high feasible ratio, high variance (budget criterion) 58
Table 4.11 Multimodal problem: moderate feasible ratio, low variance (improvement criterion) 60
Table 4.12 Multimodal problem: moderate feasible ratio, low variance (budget criterion) 61
Table 4.13 Multimodal problem: moderate feasible ratio, high variance (improvement criterion) 62
Table 4.14 Multimodal problem: moderate feasible ratio, high variance (budget criterion) 64
Table 4.15 Multimodal problem: low feasible ratio, low variance (improvement criterion) 66
Table 4.16 Multimodal problem: low feasible ratio, low variance (budget criterion) 67
Table 4.17 Multimodal problem: low feasible ratio, high variance (improvement criterion) 68
Table 4.18 Multimodal problem: low feasible ratio, high variance (budget criterion) 70
Table 4.19 Singular problem: high feasible ratio, low variance (improvement criterion) 72
Table 4.20 Singular problem: high feasible ratio, low variance (budget criterion) 73

Alrefaei, M., Andradóttir, S., 1999. A simulated annealing algorithm with constant temperature for discrete stochastic optimization. Management Science 45, 748-764.

Alrefaei, M., Andradóttir, S., 2001. A modification of the stochastic ruler method for discrete stochastic optimization. European Journal of Operational Research 133, 160-182.

Andradóttir, S., 1995. A method for discrete stochastic optimization. Management Science 41, 1946-1961.

Andradóttir, S., Kim, S. H., 2010. Fully sequential procedures for comparing constrained systems via simulation. Naval Research Logistics 57, 403-421.

Atlason, J., Epelman, M. A., Henderson, S. G., 2004. Call center staffing with simulation and cutting plane methods. Annals of Operations Research 127, 333-358.

Avriel, M., 1976. Nonlinear Programming: Analysis and Methods. Prentice-Hall, Englewood Cliffs, N.J.

Baker, J. E., 1987. Reducing bias and inefficiency in the selection algorithm. In: Proceedings of the Second International Conference on Genetic Algorithms, 14-21.

Barton, R. R., Meckesheimer, M., 2006. Metamodel-based simulation optimization. Handbooks in Operations Research and Management Science: Simulation, Elsevier.

    Batur, D., Kim, S. H., 2008. Finding feasible systems in the presence of constraints on multiple performance measures. ACM Transactions on Modeling and Computer Simulation 20, Article 13.

Bazaraa, M. S., Sherali, H. D., Shetty, C. M., 2006. Nonlinear Programming: Theory and Algorithms. John Wiley and Sons, New York.

    Bechhofer, R. E., 1954. A single-sample multiple decision procedure for ranking means of normal populations with known variances. Annals of Mathematical Statistics 25, 16-39.

Boesel, J., 1999. Search and selection for large-scale stochastic optimization. Doctoral dissertation, Department of IEMS, Northwestern University, Evanston, IL.

    Boesel, J., Nelson, B. L., Ishii, N., 2003. A framework for simulation-optimization software. IIE Transactions 35, 221-230.

Boesel, J., Nelson, B. L., Kim, S. H., 2003. Using ranking and selection to "clean up" after simulation optimization. Operations Research 51, 814-825.

Cezik, M. T., L'Ecuyer, P., 2008. Staffing multiskill call centers via linear programming and simulation. Management Science 54, 310-323.

Chen, S. Y., Rajan, S. D., 1998. Improving the efficiency of genetic algorithms for frame designs. Engineering Optimization 30, 281-307.

    Coello, C. A. C., 2002. Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: A survey of the state of the art. Computer Methods in Applied Mechanics and Engineering 191, 1245-1287.

    De Jong, K. A., 1975. An Analysis of the Behavior of a Class of Genetic Adaptive Systems. Doctoral Dissertation, The University of Michigan.

    Fu, M. C., 2002. Optimization for simulation: Theory vs. Practice. INFORMS Journal on Computing 14, 192-215.

Gong, W. B., Ho, Y. C., Zhai, W., 1999. Stochastic comparison algorithm for discrete optimization with estimation. SIAM Journal on Optimization 10, 384-404.

Hadj-Alouane, A. B., Bean, J. C., 1997. A genetic algorithm for the multiple-choice integer program. Operations Research 45, 92-101.

    Herrera, F., Lozano, M., Verdegay, J. L., 1998. A learning process for fuzzy control rules using genetic algorithms. Fuzzy Sets and Systems 100, 143-158.

    Homaifar, A., Lai, S. H. Y., Qi, X., 1994. Constrained optimization via genetic algorithms. Simulation 62, 242-254.

    Hong, L. J., Nelson, B. L., 2006. Discrete optimization via simulation using COMPASS. Operations Research 54, 115-129.

    Hong, L. J., Nelson, B. L., 2009. A Brief Introduction to Optimization via Simulation. Winter Simulation Conference Proceedings, 75-85.

Kim, S. H., Nelson, B. L., 2001. A fully sequential procedure for indifference-zone selection in simulation. ACM Transactions on Modeling and Computer Simulation 11, 251-273.

Kim, S. H., Nelson, B. L., 2005. Selecting the best system. Chapter 17 in Handbooks in Operations Research and Management Science: Simulation, Elsevier.

Kim, S. H., 2005. Comparison with a standard via fully sequential procedures. ACM Transactions on Modeling and Computer Simulation 15, 1-20.

Lee, L. H., Lee, C. U., Tan, Y. P., 2007. A multi-objective genetic algorithm for robust flight scheduling using simulation. European Journal of Operational Research 177, 1948-1968.

Michalewicz, Z., 1995. Genetic algorithms, numerical optimization and constraints. Proceedings of the Sixth International Conference on Genetic Algorithms, 151-158.

Michalewicz, Z., 1996. Genetic Algorithms + Data Structures = Evolution Programs, 3rd Ed. Springer, Berlin.

    Nelson, B. L., Swann, J., Goldsman, D., Song, W., 2001. Simple procedures for selecting the best simulated system when the number of alternatives is large. Operations Research 49, 950-963.

    Olafsson, S., 2006. Metaheuristics. Handbooks in Operations Research and Management Science VII, 633-654.

    Pichitlamken, J., Nelson, B. L., 2003. A combined procedure for optimization via simulation. ACM Transactions on Modeling and Computer Simulation 13, 155-179.

Pichitlamken, J., Nelson, B. L., 2006. A sequential procedure for neighborhood selection-of-the-best in optimization via simulation. European Journal of Operational Research 173, 283-298.

Ray, T., Kang, S. K., 2000. An evolutionary algorithm for constrained optimization. In: D. Whitley, D. Goldberg, E. Cantú-Paz (Eds.), Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2000), Morgan Kaufmann, San Francisco, CA, 771-777.

Rinott, Y., 1978. On two-stage selection procedures and related probability-inequalities. Communications in Statistics - Theory and Methods 7, 799-811.

    Runarsson, T. P., Yao, X., 2000. Stochastic ranking for constrained evolutionary optimization. IEEE Transactions on Evolutionary Computation 4, 284-294.

Shi, L., Olafsson, S., 2000. Nested partitions method for stochastic optimization. Methodology and Computing in Applied Probability 2, 271-291.

    Xu, J., Hong, L. J., Nelson, B. L., 2010. Industrial strength COMPASS: A comprehensive algorithm and software for optimization via simulation. ACM Transactions on Modeling and Computer Simulation 20, 1-29.

Full-text availability: on campus from 2020-01-01; off campus from 2020-01-01.