| Author: | 王俊凱 (Wang, Jiun-Kai) |
|---|---|
| Title: | 最小群聚結構鑑別之自我調節群聚演算法及其應用 (A Self-Regulating Clustering Algorithm for Identification of Minimal Cluster Configuration and Its Applications) |
| Advisor: | 王振興 (Wang, Jeen-Shing) |
| Degree: | Master |
| Department: | College of Electrical Engineering and Computer Science, Department of Electrical Engineering |
| Year of Publication: | 2004 |
| Academic Year of Graduation: | 92 |
| Language: | English |
| Number of Pages: | 54 |
| Keywords (Chinese): | 函數近似、徑向基類神經網路、群聚演算法 |
| Keywords (English): | function approximation, radial basis function networks, clustering algorithm |
Clustering analysis can be regarded as the problem of finding structure in a collection of unlabeled data. A substantial difficulty in performing a cluster analysis is determining the number of clusters in the data set. To address this difficulty, a great deal of research effort has been directed toward equipping clustering algorithms with systematic frameworks that automatically reveal the cluster configuration of a data set. From a review of the clustering literature, these frameworks can be classified into growing, merging, and splitting mechanisms. Each of these mechanisms has its own significance and plays a distinct role in revealing the cluster configuration during clustering analysis. In this study, a self-regulating clustering algorithm (SRCA) has been developed to identify the cluster configuration without a priori knowledge of the given data set. Growing, merging, and splitting mechanisms with online operation capability are integrated into the proposed SRCA, enabling it to identify a suitable cluster configuration. In addition, a novel idea for cluster boundary estimation has been proposed to effectively maintain the resultant clusters with compact hyper-elliptic boundaries. A virtual cluster spread coupled with a regulating vector enables the proposed SRCA to reveal a compact cluster configuration, which may be close to the true one if it exists. Finally, the effectiveness of the proposed SRCA was validated with two applications: frequently cited benchmark examples and function approximation problems solved using radial basis function networks.
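The growing and merging mechanisms described above can be illustrated with a toy one-pass procedure. This is a minimal sketch, not the thesis's SRCA: the distance thresholds `grow_dist` and `merge_dist` are hypothetical parameters, and the splitting mechanism, virtual cluster spread, and elliptic boundary estimation are all omitted.

```python
import numpy as np

def online_cluster(data, grow_dist=2.0, merge_dist=0.5):
    """Illustrative one-pass clustering with growing and merging.

    Simplifying assumptions (not from the thesis): a new cluster is
    grown when a sample lies farther than `grow_dist` from every
    existing center, and two clusters are merged when their centers
    drift within `merge_dist` of each other.
    """
    centers, counts = [], []
    for x in data:
        x = np.asarray(x, dtype=float)
        if not centers:
            centers.append(x.copy())
            counts.append(1)
            continue
        dists = [np.linalg.norm(x - c) for c in centers]
        j = int(np.argmin(dists))
        if dists[j] > grow_dist:
            # Growing: no existing cluster covers this sample.
            centers.append(x.copy())
            counts.append(1)
        else:
            # Assign the sample and update the winning center online.
            counts[j] += 1
            centers[j] += (x - centers[j]) / counts[j]
        # Merging: fuse any pair of centers that drifted too close.
        merged = True
        while merged:
            merged = False
            for a in range(len(centers)):
                for b in range(a + 1, len(centers)):
                    if np.linalg.norm(centers[a] - centers[b]) < merge_dist:
                        total = counts[a] + counts[b]
                        centers[a] = (counts[a] * centers[a]
                                      + counts[b] * centers[b]) / total
                        counts[a] = total
                        del centers[b], counts[b]
                        merged = True
                        break
                if merged:
                    break
    return centers
```

On two well-separated blobs this sketch grows exactly two clusters; the full SRCA additionally splits ill-fitting clusters and regulates each cluster's hyper-elliptic boundary rather than relying on fixed spherical thresholds.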