| Author: | 陳俊儒 Chen, Jun-Ru |
|---|---|
| Title: | Federated Adaptive Cluster: A Federated Cluster Algorithm with Automatic Clustering (Federated Adaptive Cluster:可自動分群的聯盟式集群算法) |
| Advisor: | 賴槿峰 Lai, Chin-Feng |
| Degree: | Master |
| Department: | College of Engineering - International Master Program on Intelligent Manufacturing |
| Publication Year: | 2022 |
| Graduation Academic Year: | 110 (AY 2021-2022) |
| Language: | English |
| Pages: | 58 |
| Keywords: | federated learning, clustering, Non-IID data |
As IoT devices proliferate, more and more of them participate in machine learning, which raises the risk of privacy leakage. Federated learning was proposed to protect privacy, but its practical deployment faces four challenges: communication cost, data heterogeneity, device heterogeneity, and privacy and security. This thesis proposes a clustering algorithm for federated learning that addresses data heterogeneity, and further proposes an algorithm that reduces the overhead of clustering. The approach clusters clients automatically rather than requiring the number of clusters to be fixed in advance. The results show that the algorithm performs strongly under specific client distributions and significantly improves the final accuracy of the trained model.
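To make the automatic-clustering idea in the abstract concrete, the snippet below is a minimal sketch, not the thesis's actual Federated Adaptive Cluster procedure: it assumes clients are grouped by agglomerative (hierarchical) clustering of their flattened weight updates under a cosine-distance cutoff, so the number of clusters emerges from a threshold rather than being specified up front. The function name `cluster_clients` and the `threshold` parameter are hypothetical names introduced here for illustration.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

def cluster_clients(client_updates, threshold=0.5):
    """Group clients by the similarity of their model updates.

    client_updates: list of 1-D numpy arrays, one flattened weight
    update per client. The number of clusters is not fixed in
    advance; it falls out of the cosine-distance threshold.
    """
    X = np.stack(client_updates)              # shape: (num_clients, num_params)
    dists = pdist(X, metric="cosine")         # condensed pairwise distance matrix
    tree = linkage(dists, method="average")   # agglomerative clustering
    labels = fcluster(tree, t=threshold, criterion="distance")
    clusters = {}
    for client_id, label in enumerate(labels):
        clusters.setdefault(label, []).append(client_id)
    return clusters

# Toy example: two Non-IID groups of simulated client updates.
rng = np.random.default_rng(seed=0)
group_a = [1.0 + rng.normal(0.0, 0.1, size=100) for _ in range(5)]
group_b = [-1.0 + rng.normal(0.0, 0.1, size=100) for _ in range(5)]
print(cluster_clients(group_a + group_b))     # expected: two clusters of five
```

In a sketch like this, the threshold is the knob that replaces a hard-coded cluster count: lowering it splits clients into more, tighter groups, while raising it merges them.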
On-campus full text: available from 2027-07-26.