F-Net: breast cancerous lesion region segmentation based on improved U-Net

Xiangyu Deng, Lihao Pan, Zhiyan Dang

Optoelectronics Letters, 2025, 21(12): 761-768. DOI: 10.1007/s11801-025-4182-x

Abstract

To address the challenge of breast cancer lesion region segmentation, we improve the U-Net architecture. A convolutional block attention module with prioritized attention (CBAM-PA) and a dilated transformer (Dformer) module are designed to replace the convolutional layers on the encoding side of the base U-Net, the input logic of the network is improved by dynamically adjusting the input size of each layer, and the short (skip) connections in the U-Net are replaced with cross-layer connections to enhance image restoration on the decoding side. On the breast ultrasound images (BUSI) dataset, the proposed network achieves a Dice coefficient of 0.8031 and an intersection-over-union (IoU) value of 0.7362. The experimental results show that the proposed improvements effectively raise the accuracy and quality of breast cancer lesion region segmentation.
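
The first modification above replaces encoder convolutions with a CBAM-PA module. As a reference point only, the following is a minimal PyTorch sketch of a plain CBAM block (channel attention followed by spatial attention), the standard module that CBAM-PA builds on; the class names, reduction ratio, and kernel size are illustrative assumptions, and the prioritized-attention variant and the Dformer module described in the paper are not reproduced here.

# Minimal CBAM-style block (sketch, not the paper's CBAM-PA implementation).
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        self.max_pool = nn.AdaptiveMaxPool2d(1)
        # Shared MLP implemented with 1x1 convolutions.
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )

    def forward(self, x):
        # Aggregate spatial context with average and max pooling,
        # pass both through the shared MLP, and gate the channels.
        attn = torch.sigmoid(self.mlp(self.avg_pool(x)) + self.mlp(self.max_pool(x)))
        return x * attn


class SpatialAttention(nn.Module):
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        # Pool along the channel axis, then learn a 2D attention map.
        avg_map = torch.mean(x, dim=1, keepdim=True)
        max_map, _ = torch.max(x, dim=1, keepdim=True)
        attn = torch.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))
        return x * attn


class CBAMBlock(nn.Module):
    """Channel attention followed by spatial attention, applied sequentially."""

    def __init__(self, channels: int):
        super().__init__()
        self.channel_attn = ChannelAttention(channels)
        self.spatial_attn = SpatialAttention()

    def forward(self, x):
        return self.spatial_attn(self.channel_attn(x))


if __name__ == "__main__":
    x = torch.randn(1, 64, 128, 128)   # dummy ultrasound feature map
    print(CBAMBlock(64)(x).shape)      # torch.Size([1, 64, 128, 128])

In an F-Net-style encoder, such a block would be applied to each feature map before downsampling, letting the network reweight the channels and spatial positions most relevant to the lesion region; the paper's prioritized-attention ordering is an extension of this basic scheme.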


Cite this article

Xiangyu Deng, Lihao Pan, Zhiyan Dang. F-Net: breast cancerous lesion region segmentation based on improved U-Net. Optoelectronics Letters, 2025, 21(12): 761-768. DOI: 10.1007/s11801-025-4182-x



RIGHTS & PERMISSIONS

Tianjin University of Technology
