| Author: | 林詩珊 Lin, Shih-shan |
|---|---|
| Thesis title: | 數種解決全域最佳化的隨機演算法之計算複雜度分析 Computational Complexity for Various Types of Stochastic Algorithms for Global Optimization |
| Advisor: | 許瑞麟 Sheu, Ruey-lin |
| Degree: | Master |
| Department: | College of Science - Department of Mathematics |
| Year of publication: | 2008 |
| Academic year of graduation: | 96 |
| Language: | English |
| Pages: | 46 |
| Keywords (Chinese): | 非齊次卜松過程 (nonhomogeneous Poisson processes), 記錄值 (record values) |
| Keywords (English): | record values, nonhomogeneous Poisson processes, Pure Adaptive Search (PAS), Hesitant Adaptive Search (HAS), Pure Random Search (PRS) |
本篇論文分析三種解決全域最佳化的隨機演算法,包括 Pure Random Search (PRS)、Pure Adaptive Search (PAS) 與 Hesitant Adaptive Search (HAS)。令 y 代表給定的目標值,另以 N(y) 代表演算法得到小於或等於 y 值所需的計算次數,則我們要關切的是期望值 E[N(y)],此期望值可作為演算法計算複雜度之度量指標。我們發現對於 PRS,期望值 E[N(y)] 與 p(y) 成反比,其中 p(y) 是從可行域任選一點其值小於或等於 y 的機率。對於 PAS,其計算次數平均值是 PRS 計算次數平均值取對數。而 HAS 的計算次數期望值是 p(y) 和 b(y) 的函數,其中 b(y) 是從可行域任選一點其值嚴格小於 y 的機率。若 b(y) 等於 1,則 PAS 變成是 HAS 的特例。
This thesis analyzes three stochastic algorithms for global optimization on a continuous domain: the Pure Random Search (PRS), the Pure Adaptive Search (PAS), and the Hesitant Adaptive Search (HAS). Let $\overline{y}$ denote a given target value, and let $N(\overline{y})$ be the number of iterations needed to achieve a value less than or equal to $\overline{y}$. We are concerned with the expectation $E[N(\overline{y})]$, which serves as a measure of the computational complexity of an algorithm. We find that for PRS, $E[N(\overline{y})]$ is inversely proportional to $p(\overline{y})$, where $p(\overline{y})$ is the probability of selecting a point whose objective function value is at most $\overline{y}$. For PAS, $E[N(\overline{y})]$ is the logarithm of that for PRS. For HAS, the expected number of iterations is a function of $p(\overline{y})$ and $b(\overline{y})$, where $b(\overline{y})$ is the probability of choosing a point whose objective function value is strictly less than $\overline{y}$. When $b(\overline{y})$ equals one, the expected number coincides with that of PAS.
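The contrast between the PRS and PAS complexities can be checked numerically. The following sketch (not part of the thesis; the test problem, function names, and constants are illustrative assumptions) simulates both algorithms on $f(x)=x$ over $[0,1]$ with a uniform sampler, so that $p(\overline{y})=\overline{y}$. PRS draws independently from the whole domain, while PAS draws uniformly from the current improving region.

```python
import random

def prs_iterations(y_target, trials=200_000, seed=1):
    """Pure Random Search on f(x) = x over [0, 1]:
    average number of independent uniform draws until a value <= y_target."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        n = 1
        while rng.random() > y_target:  # each draw succeeds with prob. p = y_target
            n += 1
        total += n
    return total / trials

def pas_iterations(y_target, trials=200_000, seed=2):
    """Pure Adaptive Search on f(x) = x over [0, 1]:
    each draw is uniform on the current improving region (0, best)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        best, n = 1.0, 0
        while best > y_target:
            best = rng.uniform(0.0, best)  # sample the level set {x : f(x) < best}
            n += 1
        total += n
    return total / trials

p = 0.1  # p(y) = P(U <= 0.1) for U ~ Uniform(0, 1)
print(prs_iterations(p))  # close to 1/p = 10
print(pas_iterations(p))  # close to 1 + ln(1/p), roughly 3.30
```

With $p(\overline{y}) = 0.1$, the PRS estimate hovers near $1/p = 10$ draws while the PAS estimate stays near $1 + \ln(1/p) \approx 3.30$, illustrating the inverse-proportional versus logarithmic behavior stated above.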