Active improvement of hierarchical object features under budget constraints

Nicolas CEBRON

Frontiers of Computer Science, 2012, 6(2): 143-153. DOI: 10.1007/s11704-012-2857-5
RESEARCH ARTICLE


Abstract

When we think of an object in a supervised learning setting, we usually perceive it as a collection of fixed attribute values. Although this setting may be well suited to many classification tasks, we propose a new object representation, and with it a new challenge in data mining: an object is no longer described by one set of attributes but is represented by a hierarchy of attribute sets at different levels of quality. Obtaining a more detailed representation of an object comes at a cost. This raises the interesting question of which objects to enhance under a given budget and cost model. This new setting is useful whenever resources such as computing power, memory, or time are limited. We propose a new active adaptive algorithm (AAA) to improve objects in an iterative fashion. We demonstrate how to create a hierarchical object representation and show the effectiveness of our new selection algorithm on the resulting datasets.
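The core loop described in the abstract — iteratively spending a limited budget to refine the representations of the objects that most need it — can be sketched in a few lines. The following is a minimal illustration under assumed details, not the paper's actual AAA: the feature hierarchy is modeled as increasingly precise noisy views of a hidden value, the cost model is flat (one unit per refinement step), and "need" is measured by distance to a decision boundary at zero. All names and parameters here are hypothetical.

```python
import random

LEVELS = 3          # depth of the (hypothetical) feature hierarchy
BUDGET = 5          # total refinement operations we can afford
COST_PER_LEVEL = 1  # assumed flat cost model

random.seed(0)

# Toy objects: a "hierarchy" of increasingly precise noisy views of a value.
def make_object(true_value):
    views = [true_value + random.gauss(0, 1.0 / (level + 1))
             for level in range(LEVELS)]
    return {"views": views, "level": 0, "label": true_value > 0}

objects = [make_object(random.uniform(-1, 1)) for _ in range(10)]

def uncertainty(obj):
    # Distance of the current view from a decision boundary at 0;
    # closer to the boundary means more uncertain.
    return -abs(obj["views"][obj["level"]])

budget = BUDGET
while budget >= COST_PER_LEVEL:
    # Pick the most uncertain object that can still be refined.
    candidates = [o for o in objects if o["level"] < LEVELS - 1]
    if not candidates:
        break
    chosen = max(candidates, key=uncertainty)
    chosen["level"] += 1        # enhance: move to a finer feature level
    budget -= COST_PER_LEVEL

print("levels after spending budget:", [o["level"] for o in objects])
```

The key design point this sketch captures is that the budget is spent unevenly: ambiguous objects near the decision boundary receive finer representations, while confidently classified objects stay at the cheap, coarse level.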

Keywords

object hierarchy / machine learning / active learning

Cite this article

Nicolas CEBRON. Active improvement of hierarchical object features under budget constraints. Front Comput Sci, 2012, 6(2): 143-153. https://doi.org/10.1007/s11704-012-2857-5


RIGHTS & PERMISSIONS

© 2014 Higher Education Press and Springer-Verlag Berlin Heidelberg