LDformer: a parallel neural network model for long-term power forecasting
Ran TIAN, Xinmei LI, Zhongyu MA, Yanxing LIU, Jingxia WANG, Chu WANG
Accurate long-term power forecasting is important for grid operation decisions and for customers' power consumption management, as it helps ensure a reliable power supply and the economical operation of the power system. However, most time-series forecasting models perform poorly on long-sequence prediction tasks involving large amounts of data. To address this challenge, we propose a parallel time-series prediction model called LDformer. First, we combine Informer with long short-term memory (LSTM) to obtain deep representations of the time series. Then, we propose a parallel encoder module to improve the robustness of the model, and we combine convolutional layers with the attention mechanism to avoid value redundancy in the attention computation. Finally, we propose a probabilistic sparse (ProbSparse) self-attention mechanism combined with UniDrop, which reduces the computational overhead while mitigating the risk of losing key connections in the sequence. Experimental results on five datasets show that LDformer outperforms state-of-the-art methods in most cases across different long-sequence prediction tasks.
Long-term power forecasting / Long short-term memory (LSTM) / UniDrop / Self-attention mechanism
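To make the ProbSparse mechanism concrete, below is a minimal PyTorch sketch of probabilistic sparse self-attention in the style of Informer. Ordinary dropout on the attention weights stands in for UniDrop here; the function name, tensor shapes, and sampling factor are illustrative assumptions, not the paper's implementation.

# Minimal sketch of ProbSparse self-attention (simplified from Informer).
# Plain dropout on attention weights stands in for UniDrop; all names,
# shapes, and the sampling factor are illustrative assumptions.
import math
import torch
import torch.nn.functional as F

def probsparse_attention(q, k, v, factor=5, dropout_p=0.1, training=True):
    """q, k, v: (batch, length, d_model) -> (batch, length, d_model)."""
    B, L, D = q.shape
    scale = D ** -0.5
    u = min(L, factor * max(1, int(math.log(L))))  # queries/keys to keep

    # Score each query against a random key subset to estimate its sparsity.
    idx = torch.randint(0, L, (u,))
    sample = q @ k[:, idx, :].transpose(-2, -1) * scale          # (B, L, u)

    # Sparsity measurement M(q, K): max score minus mean score per query.
    m = sample.max(dim=-1).values - sample.mean(dim=-1)          # (B, L)

    # Full attention is computed only for the top-u "active" queries.
    top = m.topk(u, dim=-1).indices                              # (B, u)
    q_top = torch.gather(q, 1, top.unsqueeze(-1).expand(-1, -1, D))
    scores = q_top @ k.transpose(-2, -1) * scale                 # (B, u, L)
    attn = F.dropout(scores.softmax(dim=-1), p=dropout_p, training=training)

    # "Lazy" queries fall back to the mean of the values.
    out = v.mean(dim=1, keepdim=True).expand(-1, L, -1).clone()
    out.scatter_(1, top.unsqueeze(-1).expand(-1, -1, D), attn @ v)
    return out

Because u grows with log L rather than L, this reduces the quadratic cost of full self-attention toward O(L log L), which is what makes the mechanism attractive for long input sequences.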