Open Access
ARTICLE
Real-Time Demand Response Management for Controlling Load Using Deep Reinforcement Learning
1 Department of Computer Engineering, Chonnam National University, Yeosu, 59626, Korea
* Corresponding Author: Chang Gyoon Lim. Email:
Computers, Materials & Continua 2022, 73(3), 5671-5686. https://doi.org/10.32604/cmc.2022.027443
Received 18 January 2022; Accepted 12 April 2022; Issue published 28 July 2022
Abstract
With rapid economic growth and improved living standards, electricity has become an indispensable energy source in our lives. Therefore, the stability of the grid power supply and the conservation of electricity are critical. Two problems are currently faced: 1) Peak power consumption poses a threat to the power grid, and enhancing the power distribution infrastructure requires high maintenance costs. 2) Users' electricity schedules are often unreasonable due to personal behavior, which wastes electricity. Load control, as a vital part of incentive-based demand response (DR), can achieve rapid response and improve demand-side resilience. When load is maintained by manually formulated rules, only selected devices are adjusted during peak power consumption, and such rule-based methods are difficult to optimize. This paper uses Soft Actor-Critic (SAC) as the control algorithm to optimize the control strategy. The results show that coordinated SAC-based load control in CityLearn reduces both peak load demand and operating costs while keeping voltage within safe limits.
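As a rough illustration of the SAC-based control workflow described in the abstract, the sketch below trains a SAC agent with the stable-baselines3 and gymnasium libraries (an assumption for illustration; the paper's actual implementation and hyperparameters are not given here). Pendulum-v1 is used as a stand-in continuous-control task; the CityLearn environment exposes a similar Gym-compatible interface and could be substituted in its place.

```python
# Minimal sketch: training a Soft Actor-Critic (SAC) agent with
# stable-baselines3 on a Gym-style continuous-control environment.
# Pendulum-v1 stands in for a building/load-control environment such
# as CityLearn, which provides a comparable Gym-compatible interface.
import gymnasium as gym
from stable_baselines3 import SAC

# Any continuous-action environment can be substituted here.
env = gym.make("Pendulum-v1")

# SAC with a feed-forward (MLP) policy; hyperparameters are illustrative only.
model = SAC("MlpPolicy", env, learning_rate=3e-4, buffer_size=100_000, verbose=1)
model.learn(total_timesteps=10_000)

# Evaluate the learned control policy for one episode.
obs, _ = env.reset()
done = False
while not done:
    action, _ = model.predict(obs, deterministic=True)
    obs, reward, terminated, truncated, _ = env.step(action)
    done = terminated or truncated
```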
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.