Query by diverse committee in transfer active learning

Hao SHAO

Front. Comput. Sci. ›› 2019, Vol. 13 ›› Issue (2) : 280-291. DOI: 10.1007/s11704-017-6117-6
RESEARCH ARTICLE


Abstract

Transfer active learning, an emerging learning paradigm, aims to actively select informative instances with the aid of knowledge transferred from related tasks. Several recent studies have addressed this problem; however, how to handle the distributional differences between the source and target domains remains open. In this paper, a novel transfer active learning algorithm is proposed, inspired by the classical query-by-committee algorithm. Diverse committee members from both domains are maintained to improve classification accuracy, and a mechanism is included to evaluate each member during the iterations. Extensive experiments on both synthetic and real datasets show that our algorithm outperforms, and is more robust than, state-of-the-art methods.
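For readers unfamiliar with the baseline the abstract refers to, the classical query-by-committee (QBC) selection step can be sketched as follows. This is an illustrative reconstruction, not the authors' algorithm: a committee of classifiers (e.g., some trained on source data, some on target data) votes on each unlabeled instance, and the instance with the highest vote entropy (greatest disagreement) is queried. All function names here are hypothetical.

```python
# Minimal sketch of classical query-by-committee with vote-entropy sampling.
import math
from collections import Counter

def vote_entropy(votes):
    """Entropy of the committee's label votes for one instance."""
    counts = Counter(votes)
    total = len(votes)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def select_query(committee, pool):
    """Return the index of the pool instance the committee disagrees on most.

    committee: list of callables mapping an instance to a predicted label
    pool:      list of unlabeled instances
    """
    scores = [vote_entropy([member(x) for member in committee]) for x in pool]
    return max(range(len(pool)), key=lambda i: scores[i])

# Toy usage: three stub "classifiers" standing in for models trained on
# source and target data.
committee = [lambda x: x > 0.5, lambda x: x > 0.3, lambda x: x > 0.7]
pool = [0.1, 0.4, 0.9]
# The instance 0.4 splits the committee 2-vs-1, so it is selected.
print(select_query(committee, pool))  # -> 1
```

The paper's contribution, per the abstract, goes beyond this baseline by maintaining a *diverse* committee drawn from both domains and re-evaluating each member across iterations to cope with source/target distribution shift.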

Keywords

transfer learning / active learning / machine learning

Cite this article

Hao SHAO. Query by diverse committee in transfer active learning. Front. Comput. Sci., 2019, 13(2): 280‒291 https://doi.org/10.1007/s11704-017-6117-6


RIGHTS & PERMISSIONS

© 2018 Higher Education Press and Springer-Verlag GmbH Germany, part of Springer Nature