Reinforcement Learning-Based Energy Management for Industrial Park with Heterogeneous Batteries under Demand Response
Meng Yuan, Tinghui Yan, Zhezhuang Xu
Abstract
The integration of photovoltaic (PV) systems, stationary energy storage systems (ESSs), and electric vehicles (EVs) alongside demand response (DR) programmes in industrial parks presents opportunities to reduce costs and improve renewable energy utilisation. Coordinating these resources is challenging because office and production zones have distinct operational objectives, and battery ageing costs are often ignored. This paper proposes a DR-based energy management framework that jointly optimises grid interaction costs, thermal comfort, EV departure state-of-charge requirements, carbon emissions, and battery ageing. We model heterogeneous load characteristics using a dynamic energy distribution ratio and incorporate dispatch-level ageing models for both ESS and EV batteries. The problem is formulated as a Markov decision process (MDP) and solved with a deep deterministic policy gradient (DDPG) algorithm. High-fidelity simulations using data from a real-world industrial park in China show that the framework maintains indoor comfort while significantly reducing total operating costs, yielding savings of 44.58\% and 40.68\% compared with a rule-based DR strategy and a conventional time-of-use arbitrage approach, respectively.
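The multi-objective reward implied by the abstract (grid cost, thermal comfort, EV departure state of charge, carbon emissions, and battery ageing) can be sketched as a negative weighted sum of penalty terms. The weights and functional forms below are illustrative assumptions only, not the paper's actual formulation:

```python
# Illustrative per-step reward for the MDP described above.
# All weights and penalty shapes are assumptions, not the paper's model.

def step_reward(grid_cost, comfort_dev, ev_soc_gap, carbon, ageing_cost,
                w=(1.0, 0.5, 2.0, 0.1, 1.0)):
    """Negative weighted sum of five cost terms (higher reward is better).

    grid_cost   -- net cost of grid interaction this step
    comfort_dev -- indoor temperature deviation from the comfort band
    ev_soc_gap  -- required departure SoC minus actual SoC (shortfall)
    carbon      -- carbon emissions attributable to this step
    ageing_cost -- dispatch-level ESS/EV battery ageing cost
    """
    w_grid, w_comfort, w_soc, w_carbon, w_age = w
    return -(w_grid * grid_cost
             + w_comfort * comfort_dev ** 2       # quadratic comfort penalty
             + w_soc * max(0.0, ev_soc_gap)       # penalise only SoC shortfall
             + w_carbon * carbon
             + w_age * ageing_cost)
```

A DDPG agent would maximise the discounted sum of such rewards over a dispatch horizon; in practice the weights trade off cost savings against comfort and battery life.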
