Title :
Information Rate Maximization over a Resistive Grid
Author_Institution :
Univ. of California at Berkeley, Berkeley
Abstract :
This work presents the first results of the authors' research on adaptive cellular neural networks (CNNs) based on a global information-theoretic cost function. It considers the simplest case: optimizing a resistive grid such that the Shannon information rate across the input-output boundaries of the grid is maximized. Besides its importance in information theory, the information rate has proven to be a useful concept for principal as well as independent component analysis (PCA, ICA). In contrast to linear fully connected neural networks, resistive grids, owing to their local coupling, can resemble models of physical media and are feasible for VLSI implementation. Results for the spatially invariant as well as the spatially variant case are presented, and their relation to principal subspace analysis (PSA) is outlined. Simulation results confirm the validity of the proposed approach.
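As a minimal, generic sketch of the infomax/PSA connection mentioned in the abstract (not the paper's resistive-grid network), the following Python snippet performs gradient ascent on the Shannon information rate of a linear Gaussian map y = Wx + n under a norm constraint; for such a map the optimal row space of W coincides with the principal subspace of the input covariance. All names, dimensions, and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, sigma2 = 6, 2, 0.1

# Input covariance with known eigen-structure: two dominant directions.
eigvals = np.array([5.0, 3.0, 0.5, 0.3, 0.2, 0.1])
Q, _ = np.linalg.qr(rng.normal(size=(d_in, d_in)))
Sigma_x = Q @ np.diag(eigvals) @ Q.T

def info_rate(W):
    """Shannon rate I(x; Wx+n) = 0.5 * log det(I + W Sigma_x W^T / sigma2)."""
    M = np.eye(d_out) + W @ Sigma_x @ W.T / sigma2
    return 0.5 * np.log(np.linalg.det(M))

# Projected gradient ascent on the rate, keeping ||W||_F = 1 (power constraint).
W = rng.normal(size=(d_out, d_in))
W /= np.linalg.norm(W)
lr = 0.05
for _ in range(2000):
    M = np.eye(d_out) + W @ Sigma_x @ W.T / sigma2
    grad = np.linalg.solve(M, W @ Sigma_x) / sigma2   # gradient of the rate w.r.t. W
    W += lr * grad
    W /= np.linalg.norm(W)                            # re-impose the norm constraint

# The learned row space should match the top-2 eigen-subspace of Sigma_x.
P_learned = W.T @ np.linalg.pinv(W @ W.T) @ W         # projector onto row space of W
P_true = Q[:, :2] @ Q[:, :2].T                        # projector onto top-2 eigenvectors
print("information rate:", info_rate(W))
print("subspace mismatch:", np.linalg.norm(P_learned - P_true))
```

The resistive grid studied in the paper replaces the fully connected map W above with a locally coupled network, but the cost function being maximized is the same information-rate quantity.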
Keywords :
cellular neural nets; independent component analysis; information theory; optimisation; principal component analysis; Shannon information rate; VLSI implementation; adaptive cellular neural networks; information rate maximization; information theoretic cost-function; input-output boundary; local coupling; principal subspace analysis; resistive grid; Artificial intelligence; Artificial neural networks; Cellular neural networks; Independent component analysis; Information rates; Information theory; Lattices; Neural networks; Principal component analysis; Very large scale integration;
Conference_Title :
2006 International Joint Conference on Neural Networks (IJCNN '06)
Conference_Location :
Vancouver, BC
Print_ISBN :
0-7803-9490-9
DOI :
10.1109/IJCNN.2006.247272