Two-way Markov random walk transductive learning algorithm

Hong Li , Xiao-yan Lu , Wei-wen Liu , Clement K. Kirui

Journal of Central South University ›› 2014, Vol. 21 ›› Issue (3): 970-977.




Abstract

Researchers face many class-prediction challenges stemming from a small amount of training data relative to a large number of unlabeled samples to be predicted. Transductive learning addresses this condition by exploiting information about the unlabeled data to estimate their labels. This work presents a new transductive learning method, the two-way Markov random walk (TMRW) algorithm. Viewing data points as nodes of a graph, the algorithm uses information about both labeled and unlabeled data to predict the labels of the unlabeled data by taking random walks between the labeled and unlabeled points: labeled points propagate label information to unlabeled points, and vice versa, according to a transition probability matrix, and the predicted labels of the unlabeled samples are obtained by combining the results of the two-way walks. Finally, ensemble learning is combined with transductive learning: AdaBoost.MH is taken as the study framework, with TMRW as the base learner, to improve its performance. Experiments show that the algorithm predicts the labels of unlabeled data well.
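The two-way walk described above can be sketched in plain Python. This is a minimal illustration, not the authors' exact method: the Gaussian-kernel affinity, the number of walk steps, and the additive combination of the two walk directions are all assumptions made for the example.

```python
import math

def tmrw_predict(X_lab, y_lab, X_unl, sigma=1.0, steps=3):
    """Sketch of a two-way Markov random walk label prediction.

    Kernel choice, step count, and the way the two walk directions
    are combined are illustrative assumptions, not the paper's spec.
    """
    X = X_lab + X_unl          # all points become nodes of one graph
    n_lab, n = len(X_lab), len(X_lab) + len(X_unl)

    # Gaussian-kernel edge weights between every pair of nodes
    W = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j:
                d2 = sum((a - b) ** 2 for a, b in zip(X[i], X[j]))
                W[i][j] = math.exp(-d2 / (2 * sigma ** 2))

    # Row-normalize to a transition probability matrix
    P = [[w / sum(row) for w in row] for row in W]

    # t-step transition probabilities: P^steps
    def matmul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(n))
                 for j in range(n)] for i in range(n)]
    Pt = P
    for _ in range(steps - 1):
        Pt = matmul(Pt, P)

    # Combine the two walk directions for each unlabeled node u:
    # labeled -> u (forward mass) plus u -> labeled (backward mass)
    classes = sorted(set(y_lab))
    preds = []
    for u in range(n_lab, n):
        score = {c: 0.0 for c in classes}
        for l in range(n_lab):
            score[y_lab[l]] += Pt[u][l] + Pt[l][u]
        preds.append(max(classes, key=lambda c: score[c]))
    return preds
```

On a toy two-cluster problem, `tmrw_predict([[0.0, 0.0], [5.0, 5.0]], [0, 1], [[0.2, 0.1], [4.9, 5.2]])` assigns each unlabeled point the label of its nearby cluster, since nearly all transition probability stays within a cluster.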

Keywords

classification / transductive learning / two-way Markov random walk (TMRW) / AdaBoost.MH

Cite this article

Hong Li, Xiao-yan Lu, Wei-wen Liu, Clement K. Kirui. Two-way Markov random walk transductive learning algorithm. Journal of Central South University, 2014, 21(3): 970-977 DOI:10.1007/s11771-014-2026-0


