
Graduate Student: Lu, Tzyy-Chyang (陸子強)
Thesis Title: A Study on Quantum Evolutionary Algorithm for Global Optimization (量子進化演算法於全域最佳化之研究)
Advisor: Juang, Jyh-Ching (莊智清)
Degree: Doctor
Department: College of Electrical Engineering and Computer Science - Department of Electrical Engineering
Year of Publication: 2011
Graduation Academic Year: 100 (ROC calendar, i.e. 2011-2012)
Language: English
Number of Pages: 115
Chinese Keywords: 量子進化演算法, 最佳化
English Keywords: Quantum Evolutionary Algorithm, Optimization
    Exploiting the characteristics of quantum bits, this dissertation proposes four quantum evolutionary algorithms to solve different optimization problems. First, for modeling problems in which the data contain outliers, a technique that combines the quantum evolutionary algorithm with neural networks is proposed to realize on-line data training and outlier removal. In the simulations, the construction of a compressor characteristic map is used as an example to verify the feasibility of the proposed algorithm. For numerical optimization problems, this dissertation proposes partitioning the solution space and using quantum bits to express the selection probability of each subspace. In the proposed method, the quantum bits evolve continuously, so that the probability of selecting good subspaces keeps increasing. Simulation experiments confirm that the proposed algorithm has a stronger global search capability than other algorithms. Moreover, for more complex function optimization problems, this dissertation further proposes a quantum-bit overlapping strategy, in which quantum bits cooperate to accelerate the search for the optimal solution region and to avoid premature convergence. Finally, for classification problems, this dissertation proposes using quantum bits to construct optimal neural networks so as to improve classification accuracy. In the proposed method, the network structure is represented by probabilistic quantum bits; this not only alleviates the common mapping problem but also avoids the risk of prematurely discarding promising network structures during evolution. In addition, the quantum bits partition the weight space into several subspaces, and a "jumping" mechanism is used to accelerate the search for the optimal weights. In the simulations, four real-world data sets are used as examples to verify the superiority of the proposed algorithms.
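The region-based encoding at the heart of the RQEA — quantum bits expressing the selection probability of each subspace of the solution space — can be sketched as follows. This is a minimal, hypothetical Python illustration of the general idea; the function names and the 1-D setup are illustrative and are not the dissertation's actual implementation:

```python
import random

# A quantum bit is a pair (alpha, beta) with alpha^2 + beta^2 = 1.
# Observing it yields 0 with probability alpha^2 and 1 with probability beta^2.
def observe(qubit):
    alpha, _ = qubit
    return 0 if random.random() < alpha ** 2 else 1

# Suppose a 1-D search interval [lo, hi] is split into 2**n regions.
# A string of n quantum bits then encodes a probability distribution over
# regions: each observation picks one region index.
def pick_region(qubits, lo, hi):
    bits = [observe(q) for q in qubits]
    index = int("".join(map(str, bits)), 2)
    width = (hi - lo) / 2 ** len(qubits)
    return lo + index * width, lo + (index + 1) * width

# Initially each qubit is (1/sqrt(2), 1/sqrt(2)): all regions equally likely.
# As evolution biases the amplitudes, good regions get sampled more often.
qubits = [(0.5 ** 0.5, 0.5 ** 0.5)] * 3        # 8 equally likely regions
sub_lo, sub_hi = pick_region(qubits, -5.0, 5.0)
assert -5.0 <= sub_lo < sub_hi <= 5.0
```

The point of the representation is that no region is ever hard-excluded: a region's selection probability can shrink, but every region remains reachable, which is what gives the method its global-search character.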

    In this dissertation, the quantum evolutionary algorithm (QEA) is employed to solve various global optimization problems. First, a structure that combines neural networks and QEA, called the NN-QEA, is proposed to establish a nonlinear map when the data are subject to outliers. The effectiveness and applicability of the NN-QEA are demonstrated by modeling a compressor characteristic map. Second, a region-based QEA, called the RQEA, is proposed for solving numerical optimization problems. In the proposed algorithm, the feasible solution space is decomposed into regions through quantum bits. As the search progresses from one generation to the next, the quantum bits evolve gradually to increase the probability of selecting regions that yield good fitness values. The RQEA is applied to a series of numerical optimization problems, and the simulations show that its results are better than those obtained by state-of-the-art QEA variants and by DEahcSPX. Third, to prevent premature convergence and to increase the overall search speed on complex numerical optimization problems, an overlapping-based QEA, called the quantum-inspired space search algorithm (QSSA), is proposed to encode the regions. In the quantum space, each group comprises several regions, so the optimal region is more likely to be found. Fourth, a quantum-based algorithm for designing artificial neural networks (ANNs), called QNN, is proposed. The aim is to design an ANN with few connections and high classification performance by simultaneously optimizing the network structure and the connection weights. QNN uses a probabilistic quantum bit representation to codify the network. As a result, the connectivity bits do not indicate the actual links but the probability that the connections exist, thus alleviating mapping problems and reducing the risk of throwing away a potential candidate. In addition, in the proposed model, each weight space is decomposed into subspaces in terms of quantum bits. Thus, the algorithm performs a "jumping", region-by-region exploration and evolves gradually to find promising subspaces for further exploitation, which helps speed up the evolution of the network and avoid premature convergence. The proposed model is tested on four benchmark problems, namely the iris, breast cancer, heart, and diabetes problems. The simulation results show that the proposed algorithm produces compact ANN structures with good generalization ability compared with other algorithms.
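The probabilistic connectivity encoding described for QNN — each candidate connection carries an existence probability rather than a hard on/off bit — can be illustrated with a rough, hypothetical sketch. The network shape, probabilities, and function names below are invented for illustration and are not taken from the dissertation:

```python
import random

def observe(prob_exist):
    """Sample one connectivity bit: 1 keeps the link, 0 drops it."""
    return 1 if random.random() < prob_exist else 0

# Hypothetical small feedforward net: each edge has a probability of
# existing. One candidate structure is one joint observation of all edges.
conn_probs = {
    ("x1", "h1"): 0.9, ("x1", "h2"): 0.2, ("x2", "h1"): 0.7,
    ("h1", "y"): 0.95, ("h2", "y"): 0.5,
}

def sample_structure(probs):
    return {edge for edge, p in probs.items() if observe(p)}

structure = sample_structure(conn_probs)
# Even a link with p = 0.2 still appears in some samples, so a potentially
# useful connection is never discarded outright early in the evolution.
```

This is the sense in which the probabilistic bits "reduce the risk of throwing away a potential candidate": pruning is soft, expressed as a shrinking probability rather than an irreversible deletion.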

    Contents

    摘要 (Chinese Abstract)
    Abstract
    Acknowledgement
    Contents
    List of Figures
    List of Tables
    List of Abbreviations
    Chapter 1 Introduction
        1.1 Motivation
        1.2 Review of QEA
            1.2.1 Representation and Observation
            1.2.2 Rotation Gate
        1.3 Dissertation Contribution
        1.4 Dissertation Organization
    Chapter 2 On-line Outliers Detection by Neural Network with Quantum Evolutionary Algorithm
        2.1 Introduction
        2.2 NN-QEA
        2.3 An Application Example
        2.4 Summary
    Chapter 3 Quantum Evolutionary Algorithm for Global Numerical Optimization—Part I: A Region-based Strategy
        3.1 Introduction
        3.2 RQEA for Global Numerical Optimization
            3.2.1 Problem Definition
            3.2.2 Partition of Solution Space
            3.2.3 Gaussian Mutation
            3.2.4 RQEA for Numerical Optimization
        3.3 Numerical Simulations and Results
            3.3.1 Benchmark Functions
            3.3.2 Parameter Settings
            3.3.3 Results of RQEA
            3.3.4 Scalability Study
        3.4 Summary
    Chapter 4 Quantum Evolutionary Algorithm for Global Numerical Optimization—Part II: An Overlapping Strategy
        4.1 Introduction
        4.2 Quantum-inspired Space Search Algorithm (QSSA)
            4.2.1 Overlapping Strategy and Quantum Bit Representation
            4.2.2 Quantum-Inspired Space Search Algorithm
        4.3 Numerical Simulations and Results
            4.3.1 Parameter Settings
            4.3.2 Simulation Results and Comparisons
            4.3.3 Partition Quality Study
            4.3.4 Scalability Study and Comparisons
        4.4 Summary
    Chapter 5 A Quantum-Based Algorithm for Evolving Artificial Neural Networks
        5.1 Introduction
        5.2 QNN
            5.2.1 Network Architecture
            5.2.2 Objective Function
            5.2.3 Representation of Solutions
            5.2.4 Quantum-Based Algorithm
        5.3 Numerical Simulations and Results
            5.3.1 Simulation Parameters
            5.3.2 Simulation Results and Comparisons
            5.3.3 Analysis of Weight Space Partitioning
        5.4 Summary
    Chapter 6 Conclusions
        6.1 Conclusions
        6.2 Recommendations for Future Research
    References
    Appendix A
    Appendix B
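Section 1.2.2 of the contents reviews the rotation gate, the standard amplitude-update operator in QEA (after Han and Kim). As a minimal sketch, assuming the conventional 2x2 rotation form — the step size and sign convention below are illustrative, not the dissertation's tuned settings:

```python
import math

def rotate(alpha, beta, delta_theta):
    """Apply the QEA rotation gate U(dtheta) to one qubit (alpha, beta).
    The rotation steers the amplitudes toward the better of the two basis
    states while preserving alpha^2 + beta^2 = 1."""
    c, s = math.cos(delta_theta), math.sin(delta_theta)
    return c * alpha - s * beta, s * alpha + c * beta

# Start from the unbiased qubit: P(0) = P(1) = 0.5.
alpha, beta = 1 / math.sqrt(2), 1 / math.sqrt(2)
# A small positive rotation shifts probability mass toward basis state 1.
alpha, beta = rotate(alpha, beta, 0.05)
assert abs(alpha ** 2 + beta ** 2 - 1.0) < 1e-12
```

In a full QEA the sign of the rotation angle is chosen per bit by comparing the observed individual against the best solution found so far, so repeated small rotations gradually concentrate the distribution on promising bit patterns.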

