| Author: | Chiu, Yi-I (邱伊禕) |
|---|---|
| Thesis Title: | Hypersphere Distribution Discriminant Analysis (以高維球做輔助的判別分析法) |
| Advisors: | Luo, Ching-Hsing (羅錦興); Chung, Pau-Choo (詹寶珠) |
| Degree: | Master |
| Department: | Department of Electrical Engineering, College of Electrical Engineering and Computer Science (電機資訊學院) |
| Year of Publication: | 2012 |
| Academic Year of Graduation: | 100 |
| Language: | English |
| Pages: | 30 |
| Chinese Keywords: | discriminant analysis, linear dimensionality reduction |
| English Keywords: | dimensionality reduction |
In this thesis we propose a novel linear discriminant analysis method for dimensionality reduction. Each pair of within-class samples is assigned a weight according to the number and distribution of samples from other classes in the surrounding space. The resulting weight matrix determines whether a pair of within-class samples may be projected close together in the new space. This approach alleviates the class-mixing problem of earlier discriminant analysis methods, which constrain the projection solely by the distances between within-class samples, and it yields better results both in pattern-recognition experiments and on two-dimensional data distributions.
Many supervised linear dimensionality reduction methods face a tradeoff between preserving within-class multimodality and achieving better between-class separation. These algorithms tend to preserve the neighborhood structure of the original space and leave the rest of the decision to the optimization process. In this thesis, we propose Hypersphere Distribution Discriminant Analysis (HDDA), which determines how samples of the same class are projected by defining a new within-class affinity matrix based on the distribution of nearby samples from different classes (heteropoints). When many heteropoints lie in the neighborhood space between a pair of within-class samples, the pair should be projected apart to avoid class mixing; otherwise, the pair may be projected together or apart, whichever yields better accuracy. By considering both the distribution of heteropoints and the distance between within-class pairs, HDDA achieves strong results compared with state-of-the-art methods.
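This record gives only a high-level description of HDDA, so the sketch below is illustrative rather than the thesis's actual algorithm. It assumes (neither assumption is stated here) that the "neighborhood space between a pair" is the hypersphere whose diameter is the segment joining the two samples, and that the within-class affinity decays exponentially with the heteropoint count inside that sphere; the projection then follows the usual graph-embedding recipe of solving a generalized eigenproblem. The function name `hdda_projection` and the decay parameter `gamma` are hypothetical, and a fuller implementation might also add an explicit repulsion term for heteropoint-heavy pairs rather than merely weakening their attraction.

```python
import numpy as np
from scipy.linalg import eigh

def hdda_projection(X, y, dim, gamma=1.0):
    """Illustrative HDDA-style linear projection (not the thesis's exact method).

    X : (n, d) samples; y : (n,) integer class labels; dim : target dimension.
    Assumed: the neighborhood space of a within-class pair is the hypersphere
    whose diameter is the segment between the two samples, and the pair's
    affinity is exp(-gamma * heteropoint_count) inside that sphere.
    """
    n, d = X.shape
    W = np.zeros((n, n))  # within-class affinity matrix
    for i in range(n):
        for j in range(i + 1, n):
            if y[i] != y[j]:
                continue  # affinities are defined only for within-class pairs
            center = (X[i] + X[j]) / 2.0
            radius = np.linalg.norm(X[i] - X[j]) / 2.0
            inside = np.linalg.norm(X - center, axis=1) <= radius
            k = np.sum(inside & (y != y[i]))       # heteropoints in the sphere
            W[i, j] = W[j, i] = np.exp(-gamma * k)  # many heteropoints -> weak pull

    # Graph-embedding objective: pull high-affinity pairs together
    # relative to the total scatter of the data.
    D = np.diag(W.sum(axis=1))
    L = D - W                                  # graph Laplacian of W
    Sw = X.T @ L @ X                           # affinity-weighted within scatter
    St = np.cov(X, rowvar=False) * (n - 1)     # total scatter
    # Smallest generalized eigenvectors of Sw v = lambda * St v keep
    # high-affinity pairs close after projection (jitter keeps St definite).
    vals, vecs = eigh(Sw, St + 1e-8 * np.eye(d))
    return vecs[:, :dim]                       # (d, dim) projection matrix
```

Usage would follow the standard pattern for linear embeddings: `P = hdda_projection(X, y, dim=2)` and then `X @ P` gives the reduced representation, in which same-class pairs with few intervening heteropoints are drawn together.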