Energy Storage Scheduling for Cost Minimization Using Deep Q-Learning

Research output: Chapter in Book/Report/Conference proceedings › Conference proceeding › peer-review

Abstract

This paper presents a discrete-action Deep Q-Network (DQN) approach for scheduling Energy Storage Systems (ESS) to minimize energy costs for residential households. The proposed model normalizes the state based on the maximum allowable actions defined by the policy, enabling effective state representation and decision-making. Simulation results validate the proposed method, which achieves cost savings of 43% compared to a baseline with no scheduling strategy and 21% compared to the average rule-based approach. Additionally, the proposed discrete-action DQN demonstrates an 11% improvement in cost efficiency over a three-level scheduling DQN method, highlighting the benefits of finer action discretization. These findings underscore the potential of the proposed approach to enhance energy management in smart grids, effectively optimizing storage system contributions for residential households while significantly reducing energy costs.
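The discrete-action formulation described above can be illustrated with a minimal sketch. All specifics here are assumptions for illustration, not the paper's implementation: the household/ESS parameters, the state features (state of charge, price, load), and the linear Q-approximator standing in for the full DQN are hypothetical. The sketch shows the two ideas named in the abstract: a discretized charge/discharge action set, and state normalization bounded by the maximum allowable actions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical household/ESS parameters (illustrative, not from the paper).
CAPACITY_KWH = 10.0   # usable battery capacity
P_MAX_KW = 2.5        # maximum charge/discharge power
N_ACTIONS = 11        # discretized power levels in [-P_MAX, +P_MAX]
ACTIONS = np.linspace(-P_MAX_KW, P_MAX_KW, N_ACTIONS)  # kW (negative = discharge)

def normalize_state(soc_kwh, price, load_kw, price_max=0.5, load_max=5.0):
    """Scale each state feature to [0, 1]; state of charge is scaled by
    capacity, which also bounds the maximum allowable action."""
    return np.array([soc_kwh / CAPACITY_KWH, price / price_max, load_kw / load_max])

# Tiny linear Q-approximator standing in for the DQN: Q(s) = W s + b.
W = rng.normal(scale=0.01, size=(N_ACTIONS, 3))
b = np.zeros(N_ACTIONS)

def q_values(s):
    return W @ s + b

def step(soc_kwh, action_kw, price, load_kw, dt_h=1.0):
    """Apply a charge (+) / discharge (-) action, clipped to what the
    battery can physically accept, and return next SoC and interval cost."""
    feasible = np.clip(action_kw, -soc_kwh / dt_h, (CAPACITY_KWH - soc_kwh) / dt_h)
    next_soc = soc_kwh + feasible * dt_h
    grid_kw = max(load_kw + feasible, 0.0)   # grid import after ESS contribution
    return next_soc, price * grid_kw * dt_h  # energy cost for this interval

# Epsilon-greedy Q-learning updates over a random price/load trace.
alpha, gamma, eps = 0.05, 0.95, 0.1
soc = CAPACITY_KWH / 2
for t in range(200):
    price, load = rng.uniform(0.1, 0.5), rng.uniform(0.5, 5.0)
    s = normalize_state(soc, price, load)
    a = rng.integers(N_ACTIONS) if rng.random() < eps else int(np.argmax(q_values(s)))
    next_soc, cost = step(soc, ACTIONS[a], price, load)
    s2 = normalize_state(next_soc, rng.uniform(0.1, 0.5), rng.uniform(0.5, 5.0))
    td_error = (-cost + gamma * np.max(q_values(s2))) - q_values(s)[a]
    W[a] += alpha * td_error * s   # gradient step for a linear approximator
    b[a] += alpha * td_error
    soc = next_soc
```

The reward is the negative interval cost, so maximizing Q corresponds to minimizing cumulative energy cost; a full DQN would replace the linear approximator with a neural network, an experience-replay buffer, and a target network.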

Original language: English
Title of host publication: 2025 IEEE Kiel PowerTech, PowerTech 2025
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798331543976
Publication status: Published - 2025
Event: 2025 IEEE Kiel PowerTech, PowerTech 2025 - Kiel, Germany
Duration: 29 Jun 2025 - 3 Jul 2025

Publication series

Name: 2025 IEEE Kiel PowerTech, PowerTech 2025

Conference

Conference: 2025 IEEE Kiel PowerTech, PowerTech 2025
Country/Territory: Germany
City: Kiel
Period: 29/06/25 - 3/07/25

Keywords

  • Deep Q-Learning
  • Energy Management
  • Optimization
  • Reinforcement Learning
