Title :
Cartesian K-Means
Author :
Norouzi, Mohammad ; Fleet, David J.
Author_Institution :
Dept. of Comput. Sci., Univ. of Toronto, Toronto, ON, Canada
Abstract :
A fundamental limitation of quantization techniques like the k-means clustering algorithm is the storage and run-time cost associated with the large numbers of clusters required to keep quantization errors small and model fidelity high. We develop new models with a compositional parameterization of cluster centers, so representational capacity increases super-linearly in the number of parameters. This allows one to effectively quantize data using billions or trillions of centers. We formulate two such models, Orthogonal k-means and Cartesian k-means. They are closely related to one another, to k-means, to methods for binary hash function optimization like ITQ (Gong and Lazebnik, 2011), and to Product Quantization for vector quantization (Jegou et al., 2011). The models are tested on large-scale ANN retrieval tasks (1M GIST, 1B SIFT features), and on codebook learning for object recognition (CIFAR-10).
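The compositional parameterization described in the abstract can be illustrated in the style of Product Quantization, one of the related methods it cites: a vector is split into m subvectors, each quantized against its own small codebook of k sub-centers, so the implied number of full-dimensional centers is k^m while only m·k sub-center vectors are stored. This is a minimal pure-Python sketch under that assumption, not the paper's exact Cartesian k-means algorithm; all names and the toy codebooks are illustrative.

```python
def quantize(x, codebooks):
    """Assign each subvector of x to its nearest sub-center.

    x: flat list of floats; codebooks: list of m codebooks, each a list
    of sub-centers (lists of floats of the subvector length).
    Returns a tuple of m sub-center indices (the compact code).
    """
    d_sub = len(x) // len(codebooks)
    code = []
    for j, cb in enumerate(codebooks):
        sub = x[j * d_sub:(j + 1) * d_sub]
        # nearest sub-center by squared Euclidean distance
        idx = min(range(len(cb)),
                  key=lambda i: sum((a - b) ** 2 for a, b in zip(sub, cb[i])))
        code.append(idx)
    return tuple(code)


def reconstruct(code, codebooks):
    """Concatenate the chosen sub-centers to form the implied full center."""
    out = []
    for j, idx in enumerate(code):
        out.extend(codebooks[j][idx])
    return out


# Toy example: m = 2 subspaces with k = 2 sub-centers each imply
# k**m = 4 full centers, stored as only m * k = 4 short sub-center vectors.
codebooks = [
    [[0.0, 0.0], [1.0, 1.0]],   # sub-centers for the first two dimensions
    [[0.0, 0.0], [1.0, 1.0]],   # sub-centers for the last two dimensions
]
code = quantize([0.9, 1.1, 0.1, -0.2], codebooks)
center = reconstruct(code, codebooks)
```

With realistic settings (e.g. m = 8 subspaces of 256 sub-centers), the same storage yields 256^8 ≈ 1.8 × 10^19 implied centers, which is the super-linear capacity growth the abstract refers to.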
Keywords :
feature extraction; image representation; learning (artificial intelligence); neural nets; object recognition; pattern clustering; transforms; vector quantisation; 1B SIFT features; 1M GIST features; CIFAR-10; Cartesian k-means; ITQ; binary hash function optimization; cluster center; codebook learning; compositional parameterization; data quantization; k-means clustering algorithm; large-scale ANN retrieval task; object recognition; orthogonal k-means; product quantization; quantization error; quantization techniques; representational capacity; vector quantization; Artificial neural networks; Encoding; Hamming distance; Indexes; Optimization; Quantization (signal); Vectors; approximate nearest neighbor search; bag of words; cartesian; codebook; euclidean nearest neighbor search; hashing; large-scale; learning; nearest neighbor search; product quantization; quantization; retrieval
Conference_Title :
Computer Vision and Pattern Recognition (CVPR), 2013 IEEE Conference on
Conference_Location :
Portland, OR
DOI :
10.1109/CVPR.2013.388