Optimizing knowledge distillation for efficient breast ultrasound image segmentation: Insights and performance enhancement
Bahareh Behboodi, Rupert Brooks, Hassan Rivaz
Artificial Intelligence in Health, 2025, Vol. 2, Issue 2: 73-86.
Most modern models designed for ultrasound (US) image segmentation have high computational and memory requirements, which limits their practical utility in point-of-care US settings. Consequently, researchers have devised approaches to compress these large models, enabling the training of smaller networks that achieve comparable generalization performance. Among these strategies, knowledge distillation (KD) is particularly suitable for scenarios with small datasets or where significant efficiency improvements are required. While previous KD-based methods have focused on extracting comprehensive information from diverse levels of the teacher's representation, they often overlook identifying which representation level is most effective. Additionally, many existing techniques rely on intricate strategies that are difficult to implement. To address this gap, our study concentrates on selecting optimal teacher representations from various levels. Through an exhaustive analysis of KD pathways, loss functions, and the impact of augmentation, we offer insights into the mechanisms underlying knowledge transfer from the teacher network to the student network. Our proposed methodology significantly enhances student performance, elevating the Dice similarity score from 73% to 80%, while the teacher model achieves 81%. Notably, our student model achieves this improvement with only 0.82 million parameters, compared to the teacher model's 96 million parameters.
Ultrasound / Image segmentation / Model compression / Knowledge distillation
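To make the distillation setup concrete, the sketch below shows a single training step that combines a supervised Dice loss on the student's prediction with a feature-matching loss against a frozen teacher, written in PyTorch. The toy encoder-decoder networks, the choice of the bottleneck feature map as the distilled representation, the MSE feature loss, and the weight of 0.5 are illustrative assumptions for exposition only, not the paper's exact architecture, KD pathway, or loss configuration.

```python
# Minimal, runnable sketch of feature-level knowledge distillation (KD) for
# binary segmentation. All architectural and weighting choices are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def dice_loss(logits, target, eps=1e-6):
    """Soft Dice loss on sigmoid probabilities (single foreground class)."""
    prob = torch.sigmoid(logits)
    inter = (prob * target).sum(dim=(2, 3))
    union = prob.sum(dim=(2, 3)) + target.sum(dim=(2, 3))
    return 1.0 - ((2.0 * inter + eps) / (union + eps)).mean()


class TinySegNet(nn.Module):
    """Toy encoder-decoder that also returns its bottleneck feature map,
    so the same class can stand in for both teacher and student."""
    def __init__(self, width):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(1, width, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(width, width, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.dec = nn.Sequential(
            nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
            nn.Conv2d(width, 1, 3, padding=1),
        )

    def forward(self, x):
        feat = self.enc(x)            # bottleneck representation
        return self.dec(feat), feat


class FeatureKDLoss(nn.Module):
    """Match a student feature map to a teacher feature map: a 1x1 conv
    aligns channel widths, then an L2 penalty compares the two maps."""
    def __init__(self, student_ch, teacher_ch):
        super().__init__()
        self.proj = nn.Conv2d(student_ch, teacher_ch, kernel_size=1)

    def forward(self, f_s, f_t):
        f_s = self.proj(f_s)
        if f_s.shape[-2:] != f_t.shape[-2:]:
            f_s = F.interpolate(f_s, size=f_t.shape[-2:], mode="bilinear",
                                align_corners=False)
        return F.mse_loss(f_s, f_t.detach())   # teacher is not updated


# Usage sketch: one distillation step on a random batch.
teacher, student = TinySegNet(width=64), TinySegNet(width=8)
kd_loss = FeatureKDLoss(student_ch=8, teacher_ch=64)
optim = torch.optim.Adam(
    list(student.parameters()) + list(kd_loss.parameters()), lr=1e-3)

images = torch.randn(2, 1, 128, 128)                 # stand-in for US images
masks = (torch.rand(2, 1, 128, 128) > 0.5).float()   # stand-in for lesion masks

with torch.no_grad():                                # frozen, pretrained teacher
    _, f_t = teacher(images)
logits_s, f_s = student(images)
loss = dice_loss(logits_s, masks) + 0.5 * kd_loss(f_s, f_t)  # assumed weight 0.5
optim.zero_grad()
loss.backward()
loss_value = loss.item()
optim.step()
```

In the actual study, the distilled representation level, KD pathway, and loss function are the quantities under investigation, so this snippet should be read only as the scaffolding on which such choices would be compared.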