Teng Jing, Jiang Yajun, Shi Ruifeng, Jia Limin
Conventional wind power prediction methods struggle to capture the non-stationary and nonlinear features of wind power sequences in the frequency domain and underutilize the spatial interrelations among wind turbines. To address these limitations, we propose DSTNet, a network that integrates signal decomposition technology for frequency enhancement while incorporating both temporal and spatial information, enabling highly accurate ultra-short-term wind power forecasting. For temporal information, the discrete cosine transform (DCT) converts the wind power sequence from the time domain to the frequency domain, and a channel attention mechanism enhances the frequency components. A decoder then extracts intricate temporal features from the enhanced frequency signals to fully capture temporal dependencies. For spatial information, a graph neural network (GNN) is constructed from the geographical layout of the wind turbines within the farm; it captures spatial features by modeling the relationships between each turbine node and its neighbors. Finally, the temporal and spatial features are fused to generate ultra-short-term wind power predictions over horizons of 10 minutes, 1 hour, and 4 hours. Our method is evaluated on the spatial dynamic wind power prediction dataset released by the Baidu KDD Cup, and the results show that DSTNet outperforms all other methods in both prediction accuracy and stability. Specifically, compared with the second-best method, the mean absolute error (MAE) is reduced by 29.75%, 19.11%, and 8.09%, and the mean squared error (MSE) by 28.22%, 13.44%, and 6.96% at the three horizons, respectively.
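The DCT-plus-channel-attention step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the gating weights `w_gate`, the squeeze-by-mean step, and the sigmoid gate are assumptions standing in for the unspecified channel attention mechanism.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix of size (n, n)."""
    k = np.arange(n)[:, None]
    t = np.arange(n)[None, :]
    m = np.cos(np.pi * (2 * t + 1) * k / (2 * n))
    scale = np.full(n, np.sqrt(2.0 / n))
    scale[0] = np.sqrt(1.0 / n)
    return scale[:, None] * m

def frequency_enhance(x, w_gate):
    """Hypothetical DCT-based frequency enhancement with a channel gate.

    x:      (channels, time) wind-power feature sequences.
    w_gate: (channels, channels) assumed learned gating weights.
    """
    m = dct_matrix(x.shape[-1])
    coeffs = x @ m.T                                    # time -> frequency
    squeezed = np.abs(coeffs).mean(axis=-1)             # per-channel summary
    gates = 1.0 / (1.0 + np.exp(-(w_gate @ squeezed)))  # sigmoid attention
    enhanced = coeffs * gates[:, None]                  # re-weight spectrum
    return enhanced @ m                                 # frequency -> time

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 32))    # 4 channels, 32 time steps
w = rng.standard_normal((4, 4)) * 0.1
y = frequency_enhance(x, w)
print(y.shape)  # (4, 32)
```

Because the DCT matrix is orthonormal, the inverse transform is just a multiplication by the same matrix, so the gate is the only lossy operation in the pipeline.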
Furthermore, in terms of prediction stability, the coefficient of determination (R²) increases by 1.78%, 1.68%, and 2.41%, respectively.
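The spatial step, where each turbine node aggregates features from its geographical neighbors, can be sketched as one GCN-style message-passing layer. The abstract does not name the GNN variant, so the symmetric normalization, the weight matrix `w`, and the ReLU are illustrative assumptions.

```python
import numpy as np

def gnn_spatial_layer(h, adj, w):
    """One illustrative message-passing layer over the turbine graph.

    h:   (num_turbines, feat) node features, e.g. recent power readings.
    adj: (num_turbines, num_turbines) adjacency built from turbine geography.
    w:   (feat, feat_out) assumed learned weight matrix.
    """
    # Add self-loops and apply symmetric degree normalization (GCN-style).
    a = adj + np.eye(adj.shape[0])
    d = a.sum(axis=1)
    a_norm = a / np.sqrt(np.outer(d, d))
    # Aggregate neighbor features, transform, and apply ReLU.
    return np.maximum(a_norm @ h @ w, 0.0)

# Three turbines in a line: 0 -- 1 -- 2.
h = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
adj = np.array([[0.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 0.0]])
w = np.full((2, 2), 0.5)
out = gnn_spatial_layer(h, adj, w)
print(out.shape)  # (3, 2)
```

Stacking two such layers lets information flow between turbines two hops apart, which is one plausible way to capture farm-level spatial dependencies before fusing with the temporal branch.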