| Graduate Student: | 陳俊維 Chen, Chun-Wei |
|---|---|
| Thesis Title: | 基於串聯式特徵選取及支援向量機之快速分類法 (SVM Based Fast Classification Using Cascade Feature Selection) |
| Advisor: | 謝明得 Shieh, Ming-Der |
| Degree: | Master |
| Department: | College of Electrical Engineering and Computer Science, Department of Electrical Engineering |
| Year of Publication: | 2012 |
| Academic Year of Graduation: | 100 (ROC calendar) |
| Language: | English |
| Pages: | 48 |
| Chinese Keywords: | 特徵選取 (feature selection), 支援向量機 (support vector machine) |
| Foreign Keywords: | feature selection, SVM, support vector machine |
The support vector machine (SVM) is a state-of-the-art margin-maximizing classifier that has been applied in many fields. However, a key issue in implementing the classifier in hardware is that the memory for storing support vectors cannot be bounded to a fixed size: the memory size depends on the total number of support vectors, which can be as large as the number of samples in the entire training dataset. Training dataset sizes vary widely across application domains; even within the same domain (e.g., image processing), the data granularity can be as small as a single pixel or as large as a video sequence. Data granularity is proportional to the difficulty of acquiring training data, and this difficulty in turn affects the size of the training dataset ultimately used.

Although many studies have focused on reducing the total number of support vectors, most of them fail to maintain classification accuracy. In this thesis, we propose a method based on cascade feature selection to reduce the total number of support vectors: several linear classifiers are applied to subsets of the training dataset to lower the overall complexity, thereby reducing the support vector count. Experimental results confirm that the proposed algorithm effectively reduces the total number of support vectors while achieving classification accuracy close to that of the conventional radial basis function SVM (RBF-SVM).
The support vector machine (SVM) is a state-of-the-art large-margin classifier that has been applied in many fields. A major issue in developing SVM hardware classifiers is the unbounded memory required to store support vectors: the memory size depends on the number of support vectors, which is upper-bounded by the number of training samples. The size of the training dataset varies widely across applications; even within the same field, such as image processing, data granularity ranges from a single pixel to an entire video sequence. Data granularity is proportional to the difficulty of data collection, and this difficulty in turn affects the training dataset size.
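The memory bound described above can be observed directly by counting the support vectors a trained RBF-kernel SVM must retain. The sketch below (an illustration, not part of the thesis; the dataset and parameters are assumed) uses scikit-learn to show that the stored support-vector array, and hence the classifier's inference-time memory footprint, scales with the training set:

```python
# Illustration (not from the thesis): the number of support vectors an
# RBF-kernel SVM stores determines its memory footprint at inference time.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Synthetic binary classification data (assumed example parameters).
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

clf = SVC(kernel="rbf", gamma="scale")
clf.fit(X, y)

# clf.support_vectors_ holds one row per support vector; its row count is
# upper-bounded by the number of training samples, so worst-case memory
# approaches n_samples * n_features kernel-expansion terms.
n_sv = clf.support_vectors_.shape[0]
print(f"support vectors stored: {n_sv} of {X.shape[0]} training samples")
```

Because every support vector must be kept for kernel evaluation, a hardware classifier cannot fix this buffer's size in advance, which motivates support vector reduction.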
Many techniques have been proposed to reduce the number of support vectors; however, most of them degrade classification accuracy. In this work, we propose a novel support vector reduction method based on cascade feature selection, in which several linear classifiers are applied to segments of the dataset to reduce the overall complexity. Simulation results demonstrate that the proposed algorithm not only reduces the number of support vectors but also achieves accuracy comparable to that of a traditional radial basis function (RBF) SVM classifier.
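One way to picture the cascade idea is a cheap linear stage that decides confident samples early, with only ambiguous samples falling through to the costly kernel stage. The sketch below is a minimal reading of that principle, not the thesis's exact algorithm; the margin threshold and data are assumed for illustration:

```python
# Minimal cascade sketch (not the thesis's exact algorithm): a linear stage
# handles confident samples cheaply; only ambiguous ones reach the RBF SVM.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC, LinearSVC

X, y = make_classification(n_samples=600, n_features=10, random_state=1)
X_train, y_train = X[:400], y[:400]
X_test, y_test = X[400:], y[400:]

linear = LinearSVC(dual=False).fit(X_train, y_train)       # cheap stage
rbf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)  # fallback stage

MARGIN = 1.0  # assumed threshold: samples inside this margin are "ambiguous"
scores = linear.decision_function(X_test)
confident = np.abs(scores) >= MARGIN

pred = np.empty_like(y_test)
pred[confident] = (scores[confident] > 0).astype(int)  # early linear decision
if (~confident).any():
    pred[~confident] = rbf.predict(X_test[~confident])  # kernel fallback

accuracy = (pred == y_test).mean()
print(f"{confident.mean():.0%} decided by the linear stage, accuracy {accuracy:.2f}")
```

Since only the fallback stage needs stored support vectors, shrinking the fraction of samples that reach it (or the data segment it is trained on) directly shrinks the kernel stage's memory and computation, which is the effect the proposed method targets.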