An improved parallel thinning algorithm with two subiterations

Fen Zhang, Yun-shan Wang, Cheng-yong Gao, Shu-chun Si, Jian-qiang Xu

Optoelectronics Letters ›› 2008, Vol. 4 ›› Issue (1): 69-71. DOI: 10.1007/s11801-008-7108-5


Abstract

This paper improves the parallel thinning algorithm with two subiterations. By analyzing the notions of connected components and passes, it is shown that the number of passes equals the number of eight-connected components. An expression for the number of eight-connected components is then derived and replaces the original expression in the algorithm. In addition, a reserving condition, established experimentally, is proposed to alleviate the excessive deletion that occurs where a diagonal line intersects a straight line. Experimental results demonstrate that the thinned curve is connected, has single-pixel width, lies almost exactly along the middle of the original curve, and is produced at high speed.
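The two-subiteration parallel scheme the paper builds on is, per the reference list, the one of Guo and Hall (Commun. ACM, 1989). For orientation, below is a minimal Python sketch of that base algorithm only; it does not include the authors' improved component-count expression or the proposed reserving condition, which require the full paper. The neighbour labelling p2..p9 is an assumed convention (north of the centre pixel, then clockwise).

```python
# Sketch of the base two-subiteration parallel thinning scheme
# (Guo-Hall style), NOT the improved algorithm of this paper.
# Neighbours p2..p9 start north of the centre pixel, clockwise.

def _subiteration(img, odd):
    """Run one subiteration in parallel; return True if any pixel was deleted."""
    rows, cols = len(img), len(img[0])
    to_delete = []
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            if not img[r][c]:
                continue
            p2 = img[r-1][c];   p3 = img[r-1][c+1]; p4 = img[r][c+1]
            p5 = img[r+1][c+1]; p6 = img[r+1][c];   p7 = img[r+1][c-1]
            p8 = img[r][c-1];   p9 = img[r-1][c-1]
            # C(p): count of 8-connected foreground components in the
            # neighbourhood, as counted by the Guo-Hall formula
            C = ((not p2 and (p3 or p4)) + (not p4 and (p5 or p6)) +
                 (not p6 and (p7 or p8)) + (not p8 and (p9 or p2)))
            # N(p): min of the two half-neighbourhood occupancy counts
            n1 = (p9 or p2) + (p3 or p4) + (p5 or p6) + (p7 or p8)
            n2 = (p2 or p3) + (p4 or p5) + (p6 or p7) + (p8 or p9)
            N = min(n1, n2)
            # Direction condition alternates between the two subiterations
            if odd:
                m = (p6 or p7 or not p9) and p8
            else:
                m = (p2 or p3 or not p5) and p4
            if C == 1 and 2 <= N <= 3 and not m:
                to_delete.append((r, c))
    for r, c in to_delete:  # delete only after the full scan: parallel semantics
        img[r][c] = 0
    return bool(to_delete)

def guo_hall_thin(img):
    """Thin a binary image (list of lists of 0/1) in place to unit width."""
    changed = True
    while changed:
        changed = _subiteration(img, True)
        changed |= _subiteration(img, False)
    return img
```

Run on a 3-pixel-thick horizontal bar, this reduces the bar to a single-pixel-wide line along its middle row, which is the behaviour the improved algorithm refines at diagonal/straight-line intersections.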

Cite this article

Fen Zhang, Yun-shan Wang, Cheng-yong Gao, Shu-chun Si, Jian-qiang Xu. An improved parallel thinning algorithm with two subiterations. Optoelectronics Letters, 2008, 4(1): 69-71. DOI: 10.1007/s11801-008-7108-5


