Abstract:
Bayesian methods are used to analyse the problem of training a model to make predictions about the distribution of data that has yet to be received. Mixture distributions emerge naturally from this framework, but they are not well-matched to high-dimensional problems such as arise in image processing applications. An extension to the partitioned mixture distribution (PMD) is presented, which is essentially a set of overlapping mixture distributions, and an expectation-maximisation (EM) training algorithm is derived. Finally, the results of some numerical simulations are presented, which demonstrate that lateral inhibition arises naturally in PMDs, and that the nodes in a PMD co-operate in such a way that each constituent mixture distribution receives the full complement of information it needs.
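For concreteness, the sketch below shows the standard EM update for an ordinary one-dimensional Gaussian mixture. It is a generic illustration only, not the PMD training algorithm derived in the paper (which extends EM to a set of overlapping mixture distributions); all function and variable names are illustrative assumptions.

```python
# Minimal EM sketch for a 1-D Gaussian mixture (generic illustration only;
# the paper's PMD algorithm extends this idea to overlapping partitions).
import numpy as np

def em_gaussian_mixture(x, n_components=3, n_iters=50, seed=0):
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    # Initialise mixing weights, means, and variances.
    pi = np.full(n_components, 1.0 / n_components)
    mu = rng.choice(x, size=n_components, replace=False)
    var = np.full(n_components, x.var())
    for _ in range(n_iters):
        # E-step: posterior responsibility of each component for each sample.
        log_p = (-0.5 * (x[:, None] - mu) ** 2 / var
                 - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
        log_p -= log_p.max(axis=1, keepdims=True)   # numerical stability
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities.
        nk = resp.sum(axis=0)
        pi = nk / n
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

if __name__ == "__main__":
    # Fit a 2-component mixture to synthetic bimodal data.
    rng = np.random.default_rng(1)
    data = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(3, 1.0, 500)])
    print(em_gaussian_mixture(data, n_components=2))
```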