
Author: Chiang, Min-Hsiu (江旻修)
Title: The Acceleration of Domain Adaptation Algorithm and Its Applications
Advisor: Liu, Gi-Ren (劉聚仁)
Degree: Master
Department: Department of Mathematics (Master's and Doctoral Program in Applied Mathematics), College of Science
Year of publication: 2023
Academic year of graduation: 112 (ROC calendar)
Language: English
Number of pages: 63
Keywords: Domain adaptation, Kernel method, Machine learning, Reduced kernel matrix, Sleep stage
In this thesis, we investigate the problem of dataset shift, in particular covariate shift, and discuss several domain adaptation algorithms, such as the kernel-based transfer component analysis and the non-kernel-based subspace alignment. Because the computation time of kernel-based algorithms increases dramatically as the amount of data grows, we use the reduced kernel matrix technique to accelerate transfer component analysis. To observe how these algorithms perform in different scenarios, we test them on several toy datasets. Finally, we apply these algorithms to sleep stage classification to examine their performance on real-world data.
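As a rough illustration of the reduced kernel matrix idea mentioned above, the sketch below builds a rectangular kernel matrix from a random subset of landmark points instead of the full square kernel matrix, the reduced-kernel construction popularized by reduced support vector machines. The function names, the Gaussian kernel choice, and the landmark count are illustrative assumptions, not the thesis's exact rTCA formulation.

```python
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    sq_dists = (
        np.sum(A**2, axis=1)[:, None]
        + np.sum(B**2, axis=1)[None, :]
        - 2.0 * A @ B.T
    )
    return np.exp(-gamma * sq_dists)

def reduced_kernel(X, n_landmarks=200, gamma=1.0, rng=None):
    """Rectangular kernel matrix K(X, X_sub) over a random row subset of X:
    n x m instead of n x n, so downstream eigen-decompositions only involve
    m x m matrices."""
    rng = np.random.default_rng(rng)
    idx = rng.choice(len(X), size=min(n_landmarks, len(X)), replace=False)
    return gaussian_kernel(X, X[idx], gamma=gamma), idx

# Usage sketch on hypothetical pooled source + target features.
X = np.random.randn(5000, 10)
K_tilde, landmark_idx = reduced_kernel(X, n_landmarks=200)
print(K_tilde.shape)   # (5000, 200) instead of (5000, 5000)
```

A kernel-based method such as TCA can then form its optimization problem from this n x m block rather than the full n x n kernel matrix, which is where the acceleration comes from.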

In this thesis, we discuss the problem of dataset shift, especially covariate shift, and then introduce domain adaptation algorithms such as transfer component analysis (TCA), which is based on the kernel method, and subspace alignment (SA), which aims to find a transformation of the coordinate system. Because the computation time of kernel-based algorithms increases dramatically with the number of data points, we utilize the reduced kernel matrix technique to accelerate the TCA algorithm. Toy examples representing different types of covariate shift are examined in detail to compare the performance of the domain adaptation algorithms and to demonstrate the effectiveness of the proposed algorithm. Finally, we apply these algorithms to a real-data application: sleep stage classification.
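To make the coordinate-system view of subspace alignment concrete, here is a minimal numpy sketch, assuming the standard formulation in which the top-d PCA basis of the source domain is mapped onto that of the target domain through the alignment matrix M = Us^T Ut. The variable names, the subspace dimension d, and the toy data are assumptions for illustration, not the thesis's implementation.

```python
import numpy as np

def pca_basis(X, d):
    """Top-d principal directions (as columns) of the centered data X."""
    Xc = X - X.mean(axis=0)
    # Rows of Vt are the right singular vectors, i.e. principal directions.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:d].T                      # shape (D, d)

def subspace_alignment(Xs_feat, Xt_feat, d=5):
    """Project source and target data into aligned d-dimensional subspaces.

    The source basis Us is rotated onto the target basis Ut through the
    alignment matrix M = Us^T Ut, so both domains are expressed in the
    target subspace's coordinate system.
    """
    Us = pca_basis(Xs_feat, d)
    Ut = pca_basis(Xt_feat, d)
    M = Us.T @ Ut                        # (d, d) alignment matrix
    Zs = Xs_feat @ Us @ M                # aligned source representation
    Zt = Xt_feat @ Ut                    # target representation
    return Zs, Zt

# Usage sketch: a simple covariate shift (translation) between domains.
rng = np.random.default_rng(0)
Xs = rng.normal(size=(300, 10))
Xt = rng.normal(size=(300, 10)) + 2.0
Zs, Zt = subspace_alignment(Xs, Xt, d=3)
print(Zs.shape, Zt.shape)                # (300, 3) (300, 3)
```

A classifier trained on Zs can then be evaluated on Zt, which is the typical way such alignment methods are compared on covariate-shifted data.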

Abstract (Chinese)
Abstract
Contents
List of Tables
List of Figures
1 Introduction
2 Dataset Shift
  2.1 Covariate shift
3 Domain Adaptation Algorithms
  3.1 Subspace alignment
  3.2 Transfer component analysis
    3.2.1 Kernel method
    3.2.2 Maximum mean discrepancy (MMD)
    3.2.3 Reducing the discrepancy
    3.2.4 Preserving data property
    3.2.5 TCA algorithm
  3.3 Reduced TCA
4 Experiment on Toy Examples
  4.1 Toy example 1: translation
  4.2 Toy example 2: translation and rotation
  4.3 Toy example 3: the effect of rotation angle
  4.4 Toy example 4: the comparison between TCA and rTCA
5 Experiment on the Sleep Stage Classification
  5.1 Algorithm of sleep stage classification model
  5.2 Dataset Description
  5.3 Experiment Result
    5.3.1 Sleep-EDF SC and Sleep-EDF ST
    5.3.2 MASS-SS1 and DREAMS
6 Conclusion
References
Appendix A: Various versions of the convolution theorem
  A.1 Proof of Theorem 1
  A.2 Proof of Proposition 2
Appendix B: Description of EEG Datasets
  B.1 Sleep-EDF Sleep Cassette (SC)
  B.2 Sleep-EDF Sleep Telemetry (ST)
  B.3 DREAMS
  B.4 MASS-SS1

