DocumentCode :
3251890
Title :
Feature discovery under contextual supervision using mutual information
Author :
Kay, Jim
Author_Institution :
Dept. of Math. & Stat., Stirling Univ., UK
Volume :
4
fYear :
1992
fDate :
7-11 Jun 1992
Firstpage :
79
Abstract :
The author considers a neural network in which the inputs may be divided into two groups, termed primary inputs and contextual inputs. The goal of the network is to discover those linear functions of the primary inputs that are maximally related to the information contained in the contextual units. The strength of the relationship between the two sets of inputs is measured by their average mutual information. When the inputs follow a multivariate, elliptically symmetric probability model, this is equivalent to performing a canonical correlation analysis. A stochastic algorithm is introduced to perform this analysis. Some theoretical details, including a convergence result, are presented. Some possible nonlinear extensions are discussed.
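As a minimal illustration of the link stated in the abstract between average mutual information and canonical correlation analysis, the sketch below builds two synthetic input groups that share a latent signal, computes their canonical correlations, and evaluates the mutual information they imply under a Gaussian model (a special case of the elliptically symmetric family). The data shapes, the latent construction, and the batch NumPy implementation are illustrative assumptions, not the author's stochastic algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: primary inputs X and contextual inputs Y sharing a latent signal.
n, p, q = 5000, 6, 4
z = rng.normal(size=(n, 2))                                  # shared latent component
X = z @ rng.normal(size=(2, p)) + rng.normal(size=(n, p))    # primary inputs
Y = z @ rng.normal(size=(2, q)) + rng.normal(size=(n, q))    # contextual inputs
X -= X.mean(axis=0)
Y -= Y.mean(axis=0)

# Sample covariances and the whitened cross-covariance matrix.
Sxx = X.T @ X / n
Syy = Y.T @ Y / n
Sxy = X.T @ Y / n
Lx = np.linalg.cholesky(Sxx)
Ly = np.linalg.cholesky(Syy)
M = np.linalg.solve(Lx, Sxy) @ np.linalg.inv(Ly).T

# Singular values of the whitened cross-covariance are the canonical correlations.
rho = np.linalg.svd(M, compute_uv=False)
rho = np.clip(rho, 0.0, 1.0 - 1e-12)

# Under a Gaussian model, I(X; Y) = -1/2 * sum(log(1 - rho_i^2)),
# so maximising average mutual information over linear features of the
# primary inputs reduces to canonical correlation analysis.
mi = -0.5 * np.sum(np.log(1.0 - rho**2))
print("canonical correlations:", np.round(rho, 3))
print("implied Gaussian mutual information (nats):", round(mi, 3))
```

The paper's stochastic algorithm would approach the same canonical pairs through incremental, sample-by-sample updates rather than the batch decomposition used here.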
Keywords :
learning (artificial intelligence); neural nets; contextual supervision; feature discovery; neural network; stochastic algorithm; symmetric probability model; Biological neural networks; Convergence; Information analysis; Mathematics; Mutual information; Performance analysis; Performance evaluation; Statistics; Stochastic processes; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 1992. IJCNN., International Joint Conference on
Conference_Location :
Baltimore, MD
Print_ISBN :
0-7803-0559-0
Type :
conf
DOI :
10.1109/IJCNN.1992.227286
Filename :
227286