| Graduate Student: | Lee, Xian-Long (李憲龍) |
|---|---|
| Thesis Title: | Adaptive Charging Strategy for Electric Vehicle Based on Multiagent Reinforcement Learning (基於多代理人增強學習之自適應電動車充電決策) |
| Advisor: | Yang, Hong-Tzer (楊宏澤) |
| Degree: | Master |
| Department: | Department of Electrical Engineering (on-the-job master's program), College of Electrical Engineering and Computer Science |
| Year of Publication: | 2020 |
| Graduation Academic Year: | 108 |
| Language: | English |
| Pages: | 57 |
| Keywords: | Electric Vehicle, Adaptive Charging Strategy, Reinforcement Learning, Double Deep Q Network |
Electric vehicles (EVs) are innovative and hold immense promise for the automotive industry in terms of reducing greenhouse-gas emissions and replacing internal-combustion vehicles. However, because EVs have charging mechanisms and characteristics that differ from those of conventional vehicles, drivers face several challenges in daily driving: with only partial information about charging stations (CSs), they must choose a charging strategy that minimizes either charging cost or waiting time, and a practical strategy should also minimize the number of charging stops.

This thesis proposes an adaptive charging-strategy model for EVs that copes with uncertainty in CS information, such as price and popularity, which are unknown in advance but can be observed at query time, and that selects the preferred CS on either a cost or a time basis. A Multiagent Reinforcement Learning model is employed to learn from the EV's energy-consumption log and historical CS information in order to formulate the charging strategy. The model further respects battery-charging constraints, to preserve battery service life, and an end-of-day energy requirement, so that the EV arrives at its destination with sufficient remaining energy. It adapts to the unknown information environment on its own and charges only at planned periods. The proposed strategy is validated in simulations using real-world data, for both cost-oriented and time-oriented policies, and four extreme conditions are tested to verify the robustness and adaptivity of the model.
On-campus access: publicly available from 2025-09-01.