Title :
Latent Task Adaptation with Large-Scale Hierarchies
Author :
Jia, Yangqing ; Darrell, Trevor
Abstract :
Recent years have witnessed the success of large-scale image classification systems that are able to identify objects among thousands of possible labels. However, it remains unclear how general classifiers, such as those trained on ImageNet, can be optimally adapted to specific tasks, each of which covers only a semantically related subset of all the objects in the world. Retraining classifiers whenever a new task is given is inefficient and suboptimal, and is inapplicable when tasks are not given explicitly but are instead implicitly specified as a set of image queries. In this paper we propose a novel probabilistic model that, given a set of query images from a latent task, jointly identifies the underlying task and performs prediction with a linear-time probabilistic inference algorithm. We present efficient ways to estimate the model's parameters, and an open-source toolbox to train classifiers in a distributed fashion at large scale. Empirical results on the ImageNet data show significant performance increases over several baseline algorithms.
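To make the idea concrete, the following is a minimal, hypothetical sketch of the kind of joint latent-task inference the abstract describes: given per-image classifier scores for a set of query images, score each candidate task (a subset of labels), pick the most probable task, and restrict prediction to it. The task definitions, the uniform task prior, and the naive-Bayes-style factorization are illustrative assumptions, not the authors' exact model or inference algorithm.

```python
# Hypothetical sketch of latent-task inference over label subsets.
# Assumes per-image class probabilities from a general classifier are given.
import numpy as np

def infer_task_and_predict(scores, tasks, task_prior=None):
    """
    scores:     (n_images, n_labels) per-image class probabilities
                (e.g., softmax outputs of a general ImageNet classifier).
    tasks:      list of label-index lists; each entry is one candidate task
                (a semantically related subset of labels).
    task_prior: optional (n_tasks,) prior over tasks; uniform if None.
    Returns (index of the most probable task, per-image predictions within it).
    """
    n_tasks = len(tasks)
    if task_prior is None:
        task_prior = np.full(n_tasks, 1.0 / n_tasks)

    # log p(h | X) ∝ log p(h) + sum_i log sum_{y in h} p(y | h) * score_i[y],
    # taking p(y | h) to be uniform over the labels in task h (an assumption).
    log_post = np.log(task_prior)
    for t, labels in enumerate(tasks):
        per_image = scores[:, labels].mean(axis=1)        # sum_y (1/|h|) score[y]
        log_post[t] += np.sum(np.log(per_image + 1e-12))  # accumulate over queries

    best = int(np.argmax(log_post))
    labels = np.asarray(tasks[best])
    preds = labels[np.argmax(scores[:, labels], axis=1)]  # argmax restricted to task
    return best, preds

# Toy usage: 3 query images, 5 labels, two candidate tasks.
rng = np.random.default_rng(0)
scores = rng.dirichlet(np.ones(5), size=3)
tasks = [[0, 1], [2, 3, 4]]
print(infer_task_and_predict(scores, tasks))
```

Note that this naive version scores every candidate task independently; the paper's contribution includes exploiting a large-scale label hierarchy so that inference remains linear-time, which this toy loop does not attempt to reproduce.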
Keywords :
image classification; parameter estimation; probability; ImageNet; baseline algorithms; general classifiers; large-scale hierarchies; large-scale image classification systems; latent task adaptation; linear-time probabilistic inference algorithm; open-source toolbox; probabilistic model; query images; semantically related subset; Accuracy; Adaptation models; Context; Probabilistic logic; Psychology; Testing; Training; large scale; object recognition; optimization
Conference_Titel :
2013 IEEE International Conference on Computer Vision (ICCV)
Conference_Location :
Sydney, NSW
DOI :
10.1109/ICCV.2013.260