| Graduate Student: | Ting, Lo Pang-Yun (丁羅邦芸) |
|---|---|
| Thesis Title: | Reinforced Energy Management in Buildings and Charging Infrastructure: From Single-Agent to Multi-Agent Optimization |
| Advisor: | Chuang, Kun-Ta (莊坤達) |
| Degree: | Doctor |
| Department: | College of Electrical Engineering and Computer Science - Department of Computer Science and Information Engineering |
| Publication Year: | 2025 |
| Graduating Academic Year: | 113 |
| Language: | English |
| Pages: | 134 |
| Keywords (Chinese): | 強化學習、圖嵌入、能源行為分析、稀有事件偵測、充電排程、動態配置、充電功率控制 |
| Keywords (English): | Reinforcement Learning, Graph Representation, Energy Behavior Analysis, Rare Event Detection, Charging Scheduling, Dynamic Placement, Charging Power Control |
| Usage: | Views: 32, Downloads: 1 |
The increasing complexity of energy systems, driven by the widespread deployment of smart meters and the growing adoption of electric vehicles (EVs), has made efficient energy management in buildings and charging infrastructures more critical in recent years. Traditional rule-based or static optimization methods struggle to cope with the dynamic and uncertain nature of real-world energy environments, such as fluctuating energy demand, unpredictable user behavior, and dynamic electricity pricing. Reinforcement learning (RL), with its ability to make adaptive, data-driven decisions, has emerged as a powerful approach for optimizing energy use, detecting anomalies, and coordinating decentralized systems. In particular, both single-agent and multi-agent reinforcement learning frameworks offer effective solutions for energy management, accommodating different levels of system complexity and numbers of optimization objectives across distributed infrastructures.
Therefore, this dissertation explores reinforced energy management from four perspectives:
Rare event detection in large-scale smart meter energy data: To reduce energy waste and maintain usage efficiency in residential and commercial buildings, we first address the detection of rare but critical events in energy consumption data collected from smart meters. This work corresponds to energy behavior analysis based on data from electricity and water meters. A novel explore–exploit strategy within a workload-bounded framework is proposed to efficiently detect such events from massive energy time series. A heuristic-based random walk is derived from partial labels received at each time period to capture the non-stationarity of rare events, functioning similarly to the reward mechanism in single-agent reinforcement learning that guides model training.
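The workload-bounded explore–exploit idea above can be illustrated with a minimal sketch. All names here (`select_probes`, `update_scores`, the 0/1 labels) are hypothetical illustrations, not the dissertation's actual heuristic random-walk framework; the sketch only shows how a fixed inspection budget can be split between exploiting high-scoring meters and exploring others, with labels acting like an RL reward.

```python
import random

def select_probes(scores, budget, epsilon, rng):
    """Choose a workload-bounded set of meters to inspect this period:
    with probability epsilon explore a not-yet-chosen meter at random,
    otherwise exploit the meter with the highest current score."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    chosen = []
    for _ in range(min(budget, len(scores))):
        remaining = [m for m in ranked if m not in chosen]
        if rng.random() < epsilon:
            chosen.append(rng.choice(remaining))
        else:
            chosen.append(remaining[0])
    return chosen

def update_scores(scores, feedback, lr=0.5):
    """Reward-like update: move each probed meter's score toward the
    0/1 label received for it, analogous to an RL reward signal."""
    for meter, label in feedback.items():
        scores[meter] += lr * (label - scores[meter])
```

Under this sketch, meters whose probes keep returning positive labels accumulate higher scores and are revisited more often, while the epsilon term keeps probing the long tail so that newly emerging rare behaviors are not missed.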
Multi-agent control for EV charging infrastructure placement: Beyond user-side scheduling, charging infrastructure deployment must also adapt to dynamic demand. This work focuses on energy management for the charging infrastructure. We present a multi-agent reinforcement learning framework that dynamically coordinates Mobile Charging Stations (MCSs) with Fixed Charging Stations (FCSs) to minimize EV users' charging cost and optimize service coverage.
Incentive-aware EV charging scheduling: As EV usage surges, intelligent scheduling that respects user preferences becomes increasingly important. This work corresponds to energy management in commercial buildings with integrated charging infrastructure. We introduce a spatial-temporal online framework that incentivizes users and adapts in real time to manage distributed charging demand effectively. Similar to the first work, we design a single-agent Q-learning approach to learn the optimal scheduling policy. Unlike the first work, however, it considers the integration of building energy needs with EV charging behaviors.
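The single-agent Q-learning component mentioned above boils down to the standard tabular temporal-difference update. The action set and state encoding below are hypothetical examples for illustration only; they are not the dissertation's actual state or incentive design.

```python
from collections import defaultdict

# Hypothetical action set: charge immediately, or defer with an incentive offer.
ACTIONS = ("charge_now", "defer_offer_incentive")

def q_update(Q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """One tabular Q-learning update: move Q(s, a) toward the observed
    reward plus the discounted value of the best next action."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])

def greedy_action(Q, state):
    """Pick the action with the highest learned value in this state."""
    return max(ACTIONS, key=lambda a: Q[(state, a)])
```

In a scheduling setting of this kind, the reward would fold together electricity cost, building load constraints, and whether the user accepted the incentive, so the learned policy trades off all three rather than optimizing charging in isolation.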
Hierarchical multi-agent EV charging control: This work also addresses the integration of building and charging infrastructure, with a specific focus on office buildings. A hierarchical multi-agent structure is designed, in which each charger is treated as an individual agent. The goal is to minimize electricity costs while considering the uncertainty of EV departures. To enhance the robustness of real-time charging control, a novel critic augmentation mechanism in reinforcement learning is introduced to incorporate departure uncertainties into the critic evaluation process.
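The core idea of the critic augmentation above, folding departure uncertainty into the critic's evaluation, can be sketched as averaging the critic over sampled departure times. This is a simplified illustration under assumed names (`augmented_value`, a `critic(state, departure)` callable); the dissertation's actual mechanism is more elaborate.

```python
def augmented_value(critic, state, departure_samples):
    """Evaluate a per-charger state by averaging the critic over sampled
    EV departure times, approximating an expectation under departure
    uncertainty instead of trusting a single assumed departure."""
    values = [critic(state, dep) for dep in departure_samples]
    return sum(values) / len(values)
```

Compared with evaluating a single declared departure time, an expectation of this kind penalizes charging plans that only pay off if the EV stays late, which is what makes the resulting real-time control more robust to early departures.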
Together, these four lines of work demonstrate the potential of reinforcement learning techniques across different system complexities and diverse energy usage scenarios. In summary, this dissertation focuses on reinforced energy management across buildings and EV charging infrastructures by exploring single-agent and multi-agent optimization frameworks. The experimental results demonstrate that integrating reinforcement learning with energy systems can significantly enhance operational effectiveness and adaptability in dynamic environments.