TY - JOUR
T1 - Emergency load shedding strategy for high renewable energy penetrated power systems based on deep reinforcement learning
AU - Chen, Hongwei
AU - Zhuang, Junzhi
AU - Zhou, Gang
AU - Wang, Yuwei
AU - Sun, Zhenglong
AU - Levron, Yoash
N1 - Publisher Copyright:
© 2023 The Author(s)
PY - 2023/4
Y1 - 2023/4
N2 - Traditional event-driven emergency load shedding determines its quantitative strategy by simulating a specific set of anticipated faults, which requires high model accuracy and a matching operating mode. However, because of the modeling complexity of renewable power generators and their fluctuating output, the traditional event-driven load shedding strategy risks mismatching in power systems with high renewable energy penetration. To address these challenges, this paper proposes a data-driven emergency load shedding method based on deep reinforcement learning (RL). First, the causes of possible mismatch of the event-driven load shedding strategy in renewable power systems are analyzed, and a typical mismatch scenario is constructed. Then, the emergency load shedding problem is formulated as a Markov Decision Process (MDP), and the action space, state space, and reward function of the decision process are designed. On this basis, an emergency control strategy platform based on the Gym framework is established to apply deep reinforcement learning to power system emergency control. To enhance the adaptability and efficiency of the RL agent across multi-fault scenarios, Proximal Policy Optimization (PPO) is adopted to solve the constructed MDP. Finally, the proposed RL-based emergency load shedding strategy is trained and verified on a modified IEEE 39-bus system. The results show that the proposed strategy can make correct load shedding decisions to restore system frequency in scenarios where the event-driven strategy mismatches, and that it adapts well to different faults and operating scenarios.
AB - Traditional event-driven emergency load shedding determines its quantitative strategy by simulating a specific set of anticipated faults, which requires high model accuracy and a matching operating mode. However, because of the modeling complexity of renewable power generators and their fluctuating output, the traditional event-driven load shedding strategy risks mismatching in power systems with high renewable energy penetration. To address these challenges, this paper proposes a data-driven emergency load shedding method based on deep reinforcement learning (RL). First, the causes of possible mismatch of the event-driven load shedding strategy in renewable power systems are analyzed, and a typical mismatch scenario is constructed. Then, the emergency load shedding problem is formulated as a Markov Decision Process (MDP), and the action space, state space, and reward function of the decision process are designed. On this basis, an emergency control strategy platform based on the Gym framework is established to apply deep reinforcement learning to power system emergency control. To enhance the adaptability and efficiency of the RL agent across multi-fault scenarios, Proximal Policy Optimization (PPO) is adopted to solve the constructed MDP. Finally, the proposed RL-based emergency load shedding strategy is trained and verified on a modified IEEE 39-bus system. The results show that the proposed strategy can make correct load shedding decisions to restore system frequency in scenarios where the event-driven strategy mismatches, and that it adapts well to different faults and operating scenarios.
KW - Deep reinforcement learning
KW - Design of decision space
KW - Emergency load shedding
KW - Mismatch scenario
UR - http://www.scopus.com/inward/record.url?scp=85150197734&partnerID=8YFLogxK
U2 - 10.1016/j.egyr.2023.03.027
DO - 10.1016/j.egyr.2023.03.027
M3 - Article
AN - SCOPUS:85150197734
SN - 2352-4847
VL - 9
SP - 434
EP - 443
JO - Energy Reports
JF - Energy Reports
ER -
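
Editor's note: the abstract describes formulating load shedding as an MDP in a Gym-based environment and training a PPO agent. The sketch below is only a minimal illustration of that general setup, not the authors' implementation: the IEEE 39-bus dynamic simulation is replaced by a toy single-frequency surrogate, and all names and parameters (LoadSheddingEnv, SHED_STEPS, the reward weights, stable-baselines3 as the PPO library) are assumptions introduced for illustration.

# Minimal sketch of a Gym-style load shedding MDP trained with PPO.
# Hypothetical stand-in for the paper's simulation platform; not from the source.
import numpy as np
import gymnasium as gym
from gymnasium import spaces


class LoadSheddingEnv(gym.Env):
    """Toy emergency load shedding MDP with a surrogate frequency model."""

    # Assumed discrete action set: fraction of remaining load shed per decision step.
    SHED_STEPS = np.array([0.0, 0.05, 0.10, 0.15])

    def __init__(self):
        super().__init__()
        self.action_space = spaces.Discrete(len(self.SHED_STEPS))
        # Simplified state: [frequency deviation (Hz), cumulative shed fraction];
        # a real platform would expose bus-level measurements instead.
        self.observation_space = spaces.Box(
            low=np.array([-5.0, 0.0]), high=np.array([5.0, 1.0]), dtype=np.float32
        )

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        # Random generation shortfall emulates different fault / operating scenarios.
        self.deficit = self.np_random.uniform(0.1, 0.3)   # p.u. power deficit
        self.shed = 0.0
        self.freq_dev = -0.5 - 2.0 * self.deficit          # crude initial frequency dip
        self.t = 0
        return self._obs(), {}

    def _obs(self):
        return np.array([self.freq_dev, self.shed], dtype=np.float32)

    def step(self, action):
        shed_now = float(self.SHED_STEPS[action])
        self.shed = min(self.shed + shed_now, 1.0)
        # Surrogate dynamics: frequency recovers as the remaining imbalance shrinks.
        imbalance = self.deficit - self.shed
        self.freq_dev += -1.5 * imbalance * 0.1 - 0.05 * self.freq_dev
        self.t += 1
        # Reward: restore frequency while shedding as little load as possible.
        reward = -abs(self.freq_dev) - 2.0 * shed_now
        terminated = abs(self.freq_dev) < 0.05 or self.freq_dev < -3.0
        truncated = self.t >= 50
        return self._obs(), reward, terminated, truncated, {}


if __name__ == "__main__":
    # PPO from stable-baselines3 stands in for the paper's PPO agent.
    from stable_baselines3 import PPO

    model = PPO("MlpPolicy", LoadSheddingEnv(), verbose=0)
    model.learn(total_timesteps=20_000)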