Deep learning-based segmentation and detection of tunnel lining defects and components from GPR images using T-GPRMask
Jiahao Li, Hehua Zhu, Mei Yin
Underground Space, 2025, Vol. 25, Issue 6: 281–294.
Ground-penetrating radar (GPR) has been extensively applied in tunnel engineering for the non-destructive assessment of lining structures. However, interpreting GPR images remains a time-consuming and expertise-dependent task. To address this challenge, this study proposes the tunnel ground-penetrating radar mask region-based convolutional neural network (T-GPRMask), a deep learning-based instance segmentation model for the automated detection of tunnel lining defects and components. By integrating a convolutional block attention module (CBAM) with a feature pyramid network (FPN), T-GPRMask enhances multi-scale feature extraction, enabling the detection of the small, low-contrast defects commonly encountered in GPR images. The model was pretrained on a domain-specific dataset containing a diverse set of GPR images of underground structures and then fine-tuned on a dataset designed specifically for tunnel inspections. It achieved recognition accuracies of 83.18%, 88.24%, 92.84%, and 91.56% for poor compactness, voids, steel arch supports, and initial lining thickness, respectively. A comparative study further demonstrated T-GPRMask’s superior performance over established detection models such as YOLOv7 and RetinaNet. Field experiments on real-world tunnel inspection data validated the model’s high spatial accuracy and its practical applicability in tunnel maintenance.
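The attention mechanism named in the abstract can be illustrated with a minimal NumPy sketch of a CBAM-style block, following the standard formulation (channel attention from average- and max-pooled features passed through a shared MLP, followed by spatial attention). The array shapes, weight matrices, and the simplified spatial reduction below are illustrative assumptions for exposition, not the paper's implementation, which integrates CBAM inside a Mask R-CNN backbone.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat, w1, w2):
    """feat: (C, H, W). Average- and max-pool over spatial dims, feed both
    through a shared two-layer MLP (w1, w2) with ReLU, sum, and sigmoid."""
    avg = feat.mean(axis=(1, 2))                    # (C,)
    mx = feat.max(axis=(1, 2))                      # (C,)
    mlp = lambda v: w2 @ np.maximum(w1 @ v, 0.0)    # shared bottleneck MLP
    scale = sigmoid(mlp(avg) + mlp(mx))             # (C,) per-channel weights
    return feat * scale[:, None, None]

def spatial_attention(feat):
    """Pool across channels; real CBAM concatenates the avg/max maps and
    applies a 7x7 conv -- here reduced to a sum for brevity (assumption)."""
    avg = feat.mean(axis=0, keepdims=True)          # (1, H, W)
    mx = feat.max(axis=0, keepdims=True)            # (1, H, W)
    scale = sigmoid(avg + mx)                       # (1, H, W) attention map
    return feat * scale

def cbam(feat, w1, w2):
    """CBAM applies channel attention first, then spatial attention."""
    return spatial_attention(channel_attention(feat, w1, w2))

# Toy usage: 8 channels, 4x4 spatial map, channel reduction ratio 4.
rng = np.random.default_rng(0)
feat = rng.standard_normal((8, 4, 4))
w1 = rng.standard_normal((2, 8)) * 0.1
w2 = rng.standard_normal((8, 2)) * 0.1
out = cbam(feat, w1, w2)
```

Because both attention maps lie in (0, 1), the block only rescales the input feature map, leaving its shape unchanged, which is why it can be dropped into an FPN backbone without altering downstream layers.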
Ground penetrating radar / Tunnel lining inspection / Instance segmentation / Deep learning / Transfer learning