LDformer: a parallel neural network model for long-term power forecasting

Ran TIAN, Xinmei LI, Zhongyu MA, Yanxing LIU, Jingxia WANG, Chu WANG

Front. Inform. Technol. Electron. Eng., 2023, Vol. 24, Issue 9: 1287-1301. DOI: 10.1631/FITEE.2200540
Original Article


Abstract

Accurate long-term power forecasting is important for grid operation decision-making and customer power consumption management, helping to ensure the reliable power supply of the power system and the economical operation of the grid. However, most time-series forecasting models perform poorly on long-time-series prediction tasks involving large amounts of data. To address this challenge, we propose a parallel time-series prediction model called LDformer. First, we combine Informer with long short-term memory (LSTM) to obtain deep representations of the time series. Then, we propose a parallel encoder module to improve the robustness of the model, and combine convolutional layers with the attention mechanism to avoid value redundancy in the attention mechanism. Finally, we propose a probabilistic sparse (ProbSparse) self-attention mechanism combined with UniDrop to reduce the computational overhead and mitigate the risk of losing key connections in the sequence. Experimental results on five datasets show that LDformer outperforms state-of-the-art methods in most cases across different long-time-series prediction tasks.
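To make the "LSTM combined with self-attention" idea in the abstract concrete, the following is a minimal, hypothetical PyTorch sketch. It is not the authors' LDformer code: it uses standard multi-head self-attention as a stand-in for the ProbSparse variant, ordinary dropout in place of UniDrop, and an invented module name (`LSTMAttentionBlock`); it only illustrates feeding LSTM representations into an attention layer with a residual connection.

```python
# Minimal illustrative sketch (assumed, not the authors' implementation):
# an LSTM extracts deep sequential representations, and a multi-head
# self-attention layer with dropout then attends over those representations.
import torch
import torch.nn as nn


class LSTMAttentionBlock(nn.Module):
    def __init__(self, d_model: int = 64, n_heads: int = 4, dropout: float = 0.1):
        super().__init__()
        # LSTM over the embedded input series.
        self.lstm = nn.LSTM(input_size=d_model, hidden_size=d_model, batch_first=True)
        # Standard multi-head self-attention stands in for ProbSparse attention.
        self.attn = nn.MultiheadAttention(embed_dim=d_model, num_heads=n_heads,
                                          dropout=dropout, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        h, _ = self.lstm(x)           # deep sequential features
        out, _ = self.attn(h, h, h)   # self-attention over the LSTM outputs
        return self.norm(h + out)     # residual connection + layer norm


if __name__ == "__main__":
    block = LSTMAttentionBlock()
    series = torch.randn(8, 96, 64)   # batch of 96-step embedded sequences
    print(block(series).shape)        # torch.Size([8, 96, 64])
```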

Keywords

Long-term power forecasting / Long short-term memory (LSTM) / UniDrop / Self-attention mechanism

Cite this article

Ran TIAN, Xinmei LI, Zhongyu MA, Yanxing LIU, Jingxia WANG, Chu WANG. LDformer: a parallel neural network model for long-term power forecasting. Front. Inform. Technol. Electron. Eng, 2023, 24(9): 1287‒1301 https://doi.org/10.1631/FITEE.2200540

RIGHTS & PERMISSIONS

2023 Zhejiang University Press