Title :
Structure-aware stochastic load management in smart grids
Author :
Zhang, Yu ; Van der Schaar, Mihaela
Author_Institution :
Dept. of Electr. Eng., Univ. of California, Los Angeles, Los Angeles, CA, USA
Date :
April 27 - May 2, 2014
Abstract :
Load management based on dynamic pricing has been advocated as a key approach for demand-side management in smart grids. By pricing energy appropriately, consumers are given economic incentives to shift their usage away from peak hours, thereby limiting the amount of energy that needs to be produced. However, traditional pricing-based load management methods usually rely on the assumption that the statistics of the system dynamics (e.g., the time-varying electricity price and the arrival distribution of consumers' load demands) are known a priori, which is rarely the case in practice. In this paper, we propose a novel price-dependent load scheduling algorithm which, unlike previous works, can operate optimally in systems where such statistical knowledge is unknown. We consider a power grid in which each consumer is equipped with an energy storage device capable of storing electrical energy for use during peak hours. Specifically, each consumer proactively determines the amount of energy to purchase from the utility companies (or energy producers), taking into account that its load demand and the electricity price vary over time in an a priori unknown manner. We first assume that all the dynamics are known, formulate the real-time load scheduling problem as a Markov decision process, and systematically characterize the structural properties exhibited by the resulting optimal load scheduling policy. By utilizing these structural properties, we then prove that our proposed load scheduling algorithm can learn the system dynamics in an online manner and converge to the optimal solution. A distinctive feature of our algorithm is that it actively exploits partial information about the system dynamics, so that less information needs to be learned than with conventional reinforcement learning methods, which significantly improves the adaptation speed and the runtime performance. Our simulation results demonstrate that the proposed load scheduling algorithm improves efficiency by more than 30% compared to existing state-of-the-art online learning algorithms.
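To make the Markov-decision-process formulation concrete, the following is a minimal, hypothetical sketch (not the paper's algorithm) of a single consumer's storage-aware energy purchase problem solved by value iteration under the simplifying assumption that the price and demand distributions are known and i.i.d.; all grids, distributions, and parameter values below are illustrative assumptions, not those used in the paper.

import numpy as np

# Hypothetical toy model, not the paper's algorithm: a single consumer buys
# energy under random prices and demands, can store surplus in a battery,
# and minimizes discounted expected purchase cost via value iteration.
B_MAX = 5                                # battery capacity (energy units), assumed
A_MAX = 3                                # maximum purchase per time slot, assumed
PRICES = np.array([1.0, 2.0, 4.0])       # possible electricity prices (assumed)
P_PRICE = np.array([0.3, 0.5, 0.2])      # assumed i.i.d. price distribution
DEMANDS = np.array([0, 1, 2])            # possible load demands (assumed)
P_DEMAND = np.array([0.2, 0.5, 0.3])     # assumed i.i.d. demand distribution
GAMMA = 0.95                             # discount factor

n_b = B_MAX + 1
V = np.zeros((n_b, len(PRICES), len(DEMANDS)))   # value over (battery, price, demand)
policy = np.zeros(V.shape, dtype=int)            # optimal purchase per state

for _ in range(1000):
    V_new = np.empty_like(V)
    for b in range(n_b):
        for ip, price in enumerate(PRICES):
            for idm, dem in enumerate(DEMANDS):
                best_cost, best_a = np.inf, 0
                for a in range(A_MAX + 1):
                    if b + a < dem:              # demand must be fully served
                        continue
                    b_next = min(b + a - dem, B_MAX)
                    # expected future cost over next-slot price and demand
                    future = P_PRICE @ V[b_next] @ P_DEMAND
                    cost = price * a + GAMMA * future
                    if cost < best_cost:
                        best_cost, best_a = cost, a
                V_new[b, ip, idm] = best_cost
                policy[b, ip, idm] = best_a
    if np.max(np.abs(V_new - V)) < 1e-6:         # stop once the values converge
        V = V_new
        break
    V = V_new

# Inspect how the purchase varies with the stored energy level at the lowest
# price and a unit demand; structure-aware methods exploit regularities of
# this kind (e.g., monotonicity) rather than learning every state-action
# value from scratch.
print(policy[:, 0, 1])

The paper goes beyond this sketch: the price and demand dynamics are not assumed known, and the proposed algorithm learns them online while exploiting the structural properties of the optimal policy, which is what yields the reported gains over generic reinforcement learning.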
Keywords :
Markov processes; demand side management; energy storage; learning (artificial intelligence); power engineering computing; power system economics; pricing; smart power grids; Markov decision process; adaptation speed improvement; demand-side management; dynamic pricing; economic incentives; energy pricing; energy storage device; optimal load scheduling policy; power grid system; price-dependent load scheduling algorithm; reinforcement learning methods; runtime performance improvement; smart grids; structure-aware stochastic load management; system dynamics; utility companies; Dynamic scheduling; Electricity; Energy storage; Heuristic algorithms; Power system dynamics; Pricing; Smart grids;
Conference_Title :
INFOCOM, 2014 Proceedings IEEE
Conference_Location :
Toronto, ON, Canada
DOI :
10.1109/INFOCOM.2014.6848212