Title :
Global convergence and empirical consistency of the generalized Lloyd algorithm
Author :
Sabin, Michael J. ; Gray, Robert M.
Date :
3/1/1986
Abstract :
The generalized Lloyd algorithm for vector quantizer design is analyzed as a descent algorithm for nonlinear programming. A broad class of convex distortion functions is considered, and any input distribution that has no singular-continuous part is allowed. A well-known convergence theorem is applied to show that iterative applications of the algorithm produce a sequence of quantizers that approaches the set of fixed-point quantizers. The methods of the theorem are extended to sequences of algorithms, yielding results on the behavior of the algorithm when an unknown distribution is approximated by a training sequence of observations. It is shown that, as the length of the training sequence grows large, 1) fixed-point quantizers for the training sequence approach the set of fixed-point quantizers for the true distribution, and 2) limiting quantizers produced by the algorithm with the training-sequence distribution perform no worse than limiting quantizers produced by the algorithm with the true distribution.
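The iteration the abstract refers to can be sketched as follows. This is a minimal illustrative implementation, not the paper's formulation: it assumes scalar data, squared-error distortion (for which the generalized centroid is the mean), and a simple exact-fixed-point stopping rule; the function and variable names are hypothetical.

```python
def lloyd(train, codebook, iters=50):
    """Generalized Lloyd iteration on a training sequence (squared error).

    Alternates two descent steps until a fixed-point codebook is reached:
    1) partition the training data into nearest-neighbor cells, and
    2) replace each codeword by the centroid of its cell.
    """
    codebook = list(codebook)
    for _ in range(iters):
        # Step 1: nearest-neighbor partition of the training sequence.
        cells = [[] for _ in codebook]
        for x in train:
            j = min(range(len(codebook)), key=lambda i: (x - codebook[i]) ** 2)
            cells[j].append(x)
        # Step 2: centroid update; an empty cell keeps its old codeword.
        new = [sum(c) / len(c) if c else codebook[i]
               for i, c in enumerate(cells)]
        if new == codebook:  # fixed-point quantizer reached
            break
        codebook = new
    return codebook

# Toy training sequence with two visible clusters.
print(sorted(lloyd([0.1, 0.2, 0.15, 0.9, 1.0, 0.95], [0.0, 1.0])))
```

Each iteration can only decrease the average distortion over the training sequence, which is the descent property the paper's convergence analysis builds on.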
Keywords :
Nonlinear programming; Quantization; Algorithm design and analysis; Books; Convergence; Cost function; Distribution functions; Iterative algorithms; Iterative methods; Nearest neighbor searches; Nonlinear distortion; Partitioning algorithms;
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.1986.1057168