
Graduate Student: Tien, Ching (田晴)
Thesis Title: Aspect Extraction and User Preference Prediction Using Probabilistic Matrix Factorization and Hierarchical Attention Networks (結合機率矩陣分解法以及階層式注意力網路之擷取商品面向與使用者喜好預測)
Advisor: Liu, Ren-Shiou (劉任修)
Degree: Master
Department: College of Management, Institute of Information Management
Year of Publication: 2020
Academic Year of Graduation: 108
Language: Chinese
Number of Pages: 58
Keywords (Chinese): Deep Learning, Matrix Factorization, Attention Mechanism
Keywords (English): Deep Learning, matrix factorization, attention mechanism
    In recent years, e-commerce has matured, and it has become increasingly popular for users to rely on online reviews when making purchase decisions. However, as review text on the web grows rapidly, we cannot quickly determine which aspects a user cares about and how much the user likes them; the entire review must be read to obtain that information. Businesses likewise must spend considerable time going through consumer reviews to identify the aspects consumers care about and their degree of preference, so that products can be improved in the future.

    For example, a review may state, "This phone has a very high resolution, but its performance is poor." The entity is the phone, and resolution and performance are its aspects; a user may rate different aspects differently. This study therefore adopts the concept of latent aspects and extends the hierarchical convolutional attention network: review sentences are fed into a deep learning model, feature vectors are extracted from the sentences, and an attention mechanism assigns each part of the input a different degree of importance, placing higher weights on the aspect descriptions the user pays more attention to. In this way the model extracts the key aspect feature vectors and the corresponding preference levels, and is trained jointly with probabilistic matrix factorization.
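    The attention step described above can be illustrated with a minimal sketch. The code below is not the thesis's HCAN model; it only shows, under simplified assumptions, how a learnable query vector can assign each sentence feature vector a weight and pool the vectors into an aspect-aware representation. The names attention_pool, H, and w are hypothetical.

        import numpy as np

        def softmax(x, axis=-1):
            # numerically stable softmax
            e = np.exp(x - x.max(axis=axis, keepdims=True))
            return e / e.sum(axis=axis, keepdims=True)

        def attention_pool(H, w):
            # H: (num_sentences, feature_dim) sentence feature vectors, e.g. from a CNN encoder
            # w: (feature_dim,) attention query vector (learned in a real model)
            scores = H @ w            # relevance score of each sentence
            alpha = softmax(scores)   # attention weights, sum to 1
            return alpha @ H, alpha   # weighted sum = aspect-aware representation

        H = np.random.randn(4, 8)     # toy example: 4 sentences, 8-dimensional features
        w = np.random.randn(8)
        rep, alpha = attention_pool(H, w)
        print(alpha)                  # larger weights mark the sentences the model attends to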

    In recent years, with the development of e-commerce, it has become more and more popular for users to rely on online reviews when making purchase decisions. However, with the rapid growth of online reviews, we cannot identify the aspects a user cares about in a short time; it takes time to read an entire review to obtain that information. Companies also need to spend a lot of time on consumers' reviews to find out which aspects consumers care about and how much they like them.

    For example, consider the review "The resolution of this mobile phone is high, but the performance is very poor." The entity is a mobile phone, and resolution and performance are aspects; the ratings of different aspects may differ. Therefore, we use the concept of latent aspects to extend the Hierarchical Convolutional Attention Network. Online reviews are the input to our deep learning model, and we extract feature vectors from the sentences. The model then extracts the key aspect feature vectors and combines them with Probabilistic Matrix Factorization for rating prediction.
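    For reference, the rating-prediction side of such a joint model typically builds on the standard probabilistic matrix factorization objective, a regularized squared error over the observed ratings. How the thesis ties the item factors to the aspect feature vectors is not detailed in this record, so the formulation below is only the standard baseline being extended:

        \min_{U,V}\; \frac{1}{2}\sum_{(i,j)\in\Omega}\bigl(R_{ij}-U_i^{\top}V_j\bigr)^2
        + \frac{\lambda_U}{2}\lVert U\rVert_F^2 + \frac{\lambda_V}{2}\lVert V\rVert_F^2

    where R_{ij} is user i's rating of item j, U_i and V_j are the latent user and item factors, \Omega is the set of observed ratings, and \lambda_U, \lambda_V are regularization weights.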

    Table of Contents
    Abstract i
    Extended Abstract ii
    Acknowledgements v
    Table of Contents vi
    List of Tables viii
    List of Figures ix
    1 Introduction 1
    1.1 Background and Motivation 1
    1.2 Research Objectives 3
    1.3 Contributions 4
    1.4 Thesis Organization 4
    2 Literature Review 5
    2.1 Aspect Extraction 6
    2.1.1 Rule-based Methods 6
    2.1.2 Supervised and Unsupervised Learning Methods 7
    2.1.3 Attention-based Methods 8
    2.2 Aspect Sentiment Classification 9
    2.2.1 RecNN-based Aspect Sentiment Classification Models 10
    2.2.2 RNN-based Aspect Sentiment Classification Models 10
    2.2.3 CNN-based Aspect Sentiment Classification Models 11
    2.3 Summary 13
    3 Methodology 14
    3.1 Problem Description 14
    3.2 Model Architecture 15
    3.3 Method Description 16
    3.3.1 Probabilistic Matrix Factorization 17
    3.3.2 Hierarchical Convolutional Attention Network Model 19
    3.3.3 Combining the HCAN Model with Probabilistic Matrix Factorization 27
    4 Experiments and Analysis 30
    4.1 Experimental Framework and Procedure 30
    4.2 Datasets and Data Processing 32
    4.2.1 Datasets 32
    4.2.2 Data Processing 33
    4.3 Experimental Environment and Parameter Settings 35
    4.4 Experimental Results and Analysis 36
    4.4.1 Effect of Parameter Settings on Experimental Results 37
    4.4.2 RMSE Comparison of Rating Prediction with Matrix Factorization Methods 43
    4.4.3 Comparison of Aspect Feature Extraction 46
    5 Conclusions and Future Work 52
    References 53


    Full-text availability: on campus, open access from 2025-07-01; off campus, not available.
    The electronic thesis has not been authorized for public release; please consult the library catalog for the print copy.