Unsupervised Deep Learning Meets Chan-Vese Model

Dihan Zheng , Chenglong Bao , Zuoqiang Shi , Haibin Ling , Kaisheng Ma

CSIAM Trans. Appl. Math., 2022, Vol. 3, Issue (4): 662-691. DOI: 10.4208/csiam-am.SO-2021-0049


Abstract

The Chan-Vese (CV) model is a classic region-based method in image segmentation. However, its piecewise constant assumption does not always hold for practical applications. Many improvements have been proposed, but the issue remains far from solved. In this work, we propose an unsupervised image segmentation approach that integrates the CV model with deep neural networks, which significantly improves the original CV model's segmentation accuracy. Our basic idea is to apply a deep neural network that maps the image into a latent space, alleviating the violation of the piecewise constant assumption in image space. We formulate this idea under the classic Bayesian framework by approximating the likelihood with an evidence lower bound (ELBO) term while keeping the prior term in the CV model. Thus, our model needs only the input image itself and does not require pre-training on external datasets. Moreover, we extend the idea to the multi-phase case and to dataset-based unsupervised image segmentation. Extensive experiments validate the effectiveness of our model and show that the proposed method is noticeably better than other unsupervised segmentation approaches.
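For context, the two-phase CV energy referenced in the abstract has the following standard formulation (this is the textbook form, not reproduced from the paper; the paper's notation may differ):

```latex
E_{\mathrm{CV}}(c_1, c_2, C) = \mu\,\mathrm{Length}(C)
  + \lambda_1 \int_{\mathrm{inside}(C)} \lvert u_0(x) - c_1 \rvert^2 \, dx
  + \lambda_2 \int_{\mathrm{outside}(C)} \lvert u_0(x) - c_2 \rvert^2 \, dx
```

Here $u_0$ is the observed image, $C$ the segmenting contour, and $c_1, c_2$ the mean intensities inside and outside $C$; the fidelity terms are what encode the piecewise constant assumption that the proposed latent-space mapping is designed to relax.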

Keywords

Image segmentation / Chan-Vese model / variational inference / unsupervised learning

Cite this article

Dihan Zheng, Chenglong Bao, Zuoqiang Shi, Haibin Ling, Kaisheng Ma. Unsupervised Deep Learning Meets Chan-Vese Model. CSIAM Trans. Appl. Math., 2022, 3(4): 662-691. DOI: 10.4208/csiam-am.SO-2021-0049
