GMLAN: Grouped-residual and multi-scale large-kernel attention network for seismic image super-resolution
Anxin Zhang, Zhenbo Guo, Shiqi Dong, Zhiqi Wei
Journal of Seismic Exploration, 2025, Vol. 34, Issue 5: 36-52.
The resolution of seismic images significantly affects the accuracy of subsequent seismic interpretation and reservoir localization. However, seismic image resolution often degrades under the influence of multiple factors, making seismic image super-resolution essential. We propose a grouped-residual and multi-scale large-kernel attention network (GMLAN), trained on synthetic seismic images, that achieves excellent super-resolution on field seismic data. GMLAN comprises two modules: the feature extraction module (FEM) and the image reconstruction module (IRM). The FEM consists of two components: shallow feature extraction (SFE) and deep feature extraction (DFE). The SFE component captures the basic information of seismic images, such as large-scale structures and the morphological features of the strata. The DFE component is the cornerstone of the feature extraction process, leveraging residual groups and multi-scale large-kernel attention to distill detailed features from seismic images, such as stratigraphic interfaces, dip angles, and relative amplitudes. Finally, the IRM uses sub-pixel convolution, a learnable upsampling technique, to reconstruct super-resolution seismic images while preserving the continuity of seismic features. The framework demonstrates satisfactory performance on both synthetic and field data.
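To make the DFE description concrete, the following is a minimal PyTorch sketch of a multi-scale large-kernel attention block. It assumes the common large-kernel decomposition (a depth-wise convolution, a depth-wise dilated convolution, and a point-wise convolution) with parallel branches at several kernel sizes; the paper's actual kernel sizes, branch count, and fusion rule are not given in the abstract, so the values below are illustrative placeholders, not the authors' configuration.

```python
# A minimal sketch of a multi-scale large-kernel attention block, assuming a
# decomposition in the style of large-kernel attention (depth-wise conv +
# depth-wise dilated conv + point-wise conv). Kernel sizes, dilation, and the
# three-branch layout are hypothetical choices for illustration.
import torch
import torch.nn as nn


class LargeKernelAttention(nn.Module):
    """Approximates a large-kernel attention map with three cheap convolutions."""

    def __init__(self, channels: int, dw_kernel: int = 5,
                 dw_dilated_kernel: int = 7, dilation: int = 3):
        super().__init__()
        self.dw = nn.Conv2d(channels, channels, dw_kernel,
                            padding=dw_kernel // 2, groups=channels)
        self.dw_dilated = nn.Conv2d(
            channels, channels, dw_dilated_kernel,
            padding=(dw_dilated_kernel // 2) * dilation,
            dilation=dilation, groups=channels)
        self.pw = nn.Conv2d(channels, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attn = self.pw(self.dw_dilated(self.dw(x)))
        return attn * x  # the attention map modulates the input features


class MultiScaleLKA(nn.Module):
    """Runs large-kernel attention at several receptive fields and fuses them."""

    def __init__(self, channels: int):
        super().__init__()
        self.branches = nn.ModuleList([
            LargeKernelAttention(channels, dw_kernel=k, dw_dilated_kernel=k + 2)
            for k in (3, 5, 7)  # hypothetical scales
        ])
        self.fuse = nn.Conv2d(channels * 3, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = torch.cat([b(x) for b in self.branches], dim=1)
        return x + self.fuse(y)  # residual connection stabilizes training
```

Decomposing one large kernel into a depth-wise and a depth-wise dilated convolution keeps the receptive field wide enough to follow laterally continuous strata while staying far cheaper than a dense large kernel.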
Seismic images / Super-resolution / Deep learning / Grouped-residual structures / Multi-scale large-kernel self-attention
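The overall pipeline of SFE, grouped-residual DFE, and sub-pixel-convolution reconstruction can be sketched end to end as below. This builds on the MultiScaleLKA block from the previous sketch and is an assumption about how the pieces fit together; channel width, group count, block count, and the 2x scale are illustrative, not the authors' settings.

```python
# A minimal end-to-end skeleton of a GMLAN-like network: shallow feature
# extraction (SFE), grouped-residual deep feature extraction (DFE) built from
# the MultiScaleLKA block defined above, and an image reconstruction module
# (IRM) that upsamples with sub-pixel convolution (nn.PixelShuffle).
import torch
import torch.nn as nn


class ResidualGroup(nn.Module):
    def __init__(self, channels: int, n_blocks: int = 4):
        super().__init__()
        self.body = nn.Sequential(*[MultiScaleLKA(channels) for _ in range(n_blocks)])
        self.conv = nn.Conv2d(channels, channels, 3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.conv(self.body(x))  # group-level residual


class GMLANSketch(nn.Module):
    def __init__(self, channels: int = 64, n_groups: int = 4, scale: int = 2):
        super().__init__()
        self.sfe = nn.Conv2d(1, channels, 3, padding=1)  # shallow features
        self.dfe = nn.Sequential(*[ResidualGroup(channels) for _ in range(n_groups)])
        self.irm = nn.Sequential(  # sub-pixel convolution upsampling
            nn.Conv2d(channels, channels * scale ** 2, 3, padding=1),
            nn.PixelShuffle(scale),
            nn.Conv2d(channels, 1, 3, padding=1),
        )

    def forward(self, lr: torch.Tensor) -> torch.Tensor:
        # lr: (N, 1, H, W) low-resolution seismic image
        shallow = self.sfe(lr)
        deep = self.dfe(shallow)
        return self.irm(shallow + deep)  # global skip before reconstruction


if __name__ == "__main__":
    x = torch.randn(1, 1, 64, 64)
    print(GMLANSketch()(x).shape)  # torch.Size([1, 1, 128, 128])
```

Sub-pixel convolution learns the upsampling filters rather than interpolating, which is why it can preserve the continuity of reflectors better than bicubic resizing.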