DocumentCode :
1117849
Title :
Minimum Cross-Entropy Pattern Classification and Cluster Analysis
Author :
Shore, John E. ; Gray, Robert M.
Author_Institution :
Information Technology Division, Naval Research Laboratory, Washington, DC 20375
Issue :
1
fYear :
1982
Firstpage :
11
Lastpage :
17
Abstract :
This paper considers the problem of classifying an input vector of measurements by a nearest neighbor rule applied to a fixed set of vectors. The fixed vectors are sometimes called characteristic feature vectors, codewords, cluster centers, models, reproductions, etc. The nearest neighbor rule considered uses a non-Euclidean information-theoretic distortion measure that is not a metric, but that nevertheless leads to a classification method that is optimal in a well-defined sense and is also computationally attractive. Furthermore, the distortion measure results in a simple method of computing cluster centroids. Our approach is based on the minimization of cross-entropy (also called discrimination information, directed divergence, K-L number), and can be viewed as a refinement of a general classification method due to Kullback. The refinement exploits special properties of cross-entropy that hold when the probability densities involved happen to be minimum cross-entropy densities. The approach is a generalization of a recently developed speech coding technique called speech coding by vector quantization.
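The classification rule described in the abstract — assign an input vector to the nearest of a fixed set of cluster centers, with "nearest" measured by cross-entropy (directed divergence) rather than Euclidean distance — can be sketched as follows. This is a minimal illustration under simplifying assumptions (inputs and centers are discrete probability vectors with matching support), not a reproduction of the authors' algorithm; the function names are hypothetical.

```python
import numpy as np

def directed_divergence(p, q):
    """Cross-entropy / K-L number D(p || q) = sum_i p_i * log(p_i / q_i).
    Terms with p_i = 0 contribute zero by convention."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def classify(x, centers):
    """Nearest neighbor rule: index of the center minimizing D(x || center).
    Note the divergence is not a metric (not symmetric, no triangle
    inequality), but it still yields a well-defined minimum-distortion rule."""
    return min(range(len(centers)), key=lambda k: directed_divergence(x, centers[k]))

def centroid(cluster):
    """Illustrative centroid: the normalized arithmetic mean of the cluster
    members, a simple closed-form center for divergences of this type."""
    m = np.mean(np.asarray(cluster, dtype=float), axis=0)
    return m / m.sum()

# Toy example: two cluster centers over a 3-symbol alphabet.
centers = [np.array([0.7, 0.2, 0.1]), np.array([0.1, 0.3, 0.6])]
x = np.array([0.6, 0.3, 0.1])
print(classify(x, centers))  # -> 0 (x diverges far less from the first center)
```

The closed-form centroid is what makes the approach computationally attractive for clustering: each iteration of a k-means-style loop alternates the minimum-divergence assignment above with a cheap centroid update, with no iterative optimization inside either step.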
Keywords :
Distortion measurement; Information analysis; Minimization methods; Nearest neighbor searches; Pattern analysis; Pattern classification; Probability density function; Speech analysis; Speech coding; Vector quantization; Cluster analysis; Cross-entropy; Discrimination information
fLanguage :
English
Journal_Title :
IEEE Transactions on Pattern Analysis and Machine Intelligence
Publisher :
IEEE
ISSN :
0162-8828
Type :
jour
DOI :
10.1109/TPAMI.1982.4767189
Filename :
4767189