Title :
Batch Mode Sparse Active Learning
Author :
Shi, Lixin ; Zhao, Yuhang
Author_Institution :
Inst. for Theor. Comput. Sci., Tsinghua Univ., Beijing, China
Abstract :
Sparse representation, owing to the clear insight it offers into the structure of data, has seen a recent surge of interest in the classification community, and a family of reliable classification methods has been built on it. Meanwhile, obtaining sufficient labeled training data has long been a challenge, so considerable research has addressed the active selection of instances to be labeled. In this work, we present a novel unified framework, BMSAL (Batch Mode Sparse Active Learning). Building on the existing family of sparse classifiers, we rigorously define the corresponding BMSAL family and explore its shared properties, most importantly (approximate) submodularity. We focus on the feasibility and reliability of the BMSAL family: feasibility motivates us to optimize the algorithms and run experiments against state-of-the-art methods; for reliability, we give error-bounded algorithms, together with detailed logical deductions and empirical tests for applying sparse methods to non-linear data sets.
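The (approximate) submodularity property highlighted in the abstract is what makes greedy batch selection attractive: for a monotone submodular utility, greedily adding the instance with the largest marginal gain yields a (1 - 1/e) approximation to the optimal batch. The sketch below illustrates that generic greedy scheme with a toy coverage-style utility; the utility function and instance names are illustrative assumptions, not the paper's actual selection criterion.

```python
# Hedged sketch of greedy batch selection under a monotone submodular
# utility -- the generic mechanism that submodularity enables, not the
# specific BMSAL objective from the paper.

def greedy_batch(candidates, utility, batch_size):
    """Greedily pick batch_size items by marginal gain.

    For a monotone submodular utility this achieves the classic
    (1 - 1/e) approximation guarantee (Nemhauser et al.).
    """
    batch = []
    for _ in range(batch_size):
        # Choose the remaining candidate with the largest marginal gain.
        best = max((c for c in candidates if c not in batch),
                   key=lambda c: utility(batch + [c]) - utility(batch))
        batch.append(best)
    return batch

# Toy submodular utility: number of distinct "features" covered by the
# chosen instances (set cover is a standard submodular example).
instances = {
    "x1": {"a", "b"},
    "x2": {"b", "c"},
    "x3": {"c"},
    "x4": {"a", "b", "c"},
}

def coverage(batch):
    covered = set()
    for name in batch:
        covered |= instances[name]
    return len(covered)

picked = greedy_batch(list(instances), coverage, 2)
# "x4" is picked first, since it alone covers all three features.
```

Diminishing marginal gains (adding "x4" first makes every later instance worth less) is exactly the submodularity that the error-bounded algorithms in the paper rely on.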
Keywords :
learning (artificial intelligence); active learning; batch mode; batch mode sparse active learning; classification methods; sparse active learning; sparse classification; sparse representation; submodularity
Conference_Titel :
Data Mining Workshops (ICDMW), 2010 IEEE International Conference on
Conference_Location :
Sydney, NSW
Print_ISBN :
978-1-4244-9244-2
Electronic_ISBN :
978-0-7695-4257-7
DOI :
10.1109/ICDMW.2010.175