DocumentCode :
1254230
Title :
Capacity of two-layer feedforward neural networks with binary weights
Author :
Ji, Chuanyi ; Psaltis, Demetri
Author_Institution :
Dept. of Electr. Comput. & Syst. Eng., Rensselaer Polytech. Inst., Troy, NY, USA
Volume :
44
Issue :
1
fYear :
1998
fDate :
1/1/1998 12:00:00 AM
Firstpage :
256
Lastpage :
268
Abstract :
Lower and upper bounds for the information capacity of two-layer feedforward neural networks with binary interconnections, integer thresholds for the hidden units, and zero threshold for the output unit are obtained in two steps. First, through a constructive approach based on statistical analysis, it is shown that a specifically constructed (N-2L-1) network with N input units, 2L hidden units, and one output unit is capable of implementing, with probability almost one, any dichotomy of O(W/ln W) random samples drawn from certain continuous distributions, where W is the total number of weights of the network. This quantity is then used as a lower bound on the information capacity C of all (N-2L-1) networks with binary weights. Second, an upper bound of O(W) is obtained by a simple counting argument. Therefore, Ω(W/ln W) ⩽ C ⩽ O(W).
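The architecture the abstract studies can be illustrated with a minimal forward-pass sketch. This is not the paper's construction, only an assumed reading of the stated setup: a (N-2L-1) network whose hidden-layer and output weights are binary (±1), whose hidden units have integer thresholds, and whose output unit has a zero threshold; all function names and the example dimensions are illustrative.

```python
import numpy as np

def two_layer_binary_net(x, W_hidden, thresholds, w_out):
    """Forward pass of a (N-2L-1) network with binary weights.

    x          : input vector of shape (N,)
    W_hidden   : hidden-layer weights, shape (2L, N), entries in {-1, +1}
    thresholds : integer thresholds for the hidden units, shape (2L,)
    w_out      : output weights, shape (2L,), entries in {-1, +1}

    Returns the dichotomy label in {-1, +1}.
    """
    # Hidden units: hard threshold at their integer thresholds
    hidden = np.where(W_hidden @ x - thresholds >= 0, 1, -1)
    # Output unit: hard threshold at zero
    return 1 if w_out @ hidden >= 0 else -1

# Illustrative dimensions: N = 4 inputs, 2L = 2 hidden units,
# so the total weight count is W = 2L * N + 2L = 10.
rng = np.random.default_rng(0)
W_h = rng.choice([-1, 1], size=(2, 4))   # binary hidden-layer weights
t = np.array([0, 1])                     # integer hidden thresholds
w_o = np.array([1, -1])                  # binary output weights
x = rng.standard_normal(4)               # sample from a continuous distribution
label = two_layer_binary_net(x, W_h, t, w_o)
```

The counting argument for the upper bound follows the same picture: with W binary weights there are at most 2^W distinct networks, hence at most 2^W dichotomies of any sample set, which caps the capacity at O(W).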
Keywords :
channel capacity; feedforward neural nets; multilayer perceptrons; probability; random processes; signal sampling; statistical analysis; binary interconnections; binary weights; continuous distributions; counting; hidden units; information capacity; integer thresholds; lower bound; neural network capacity; output unit; random samples; two-layer feedforward neural networks; upper bound; zero threshold; Capacity planning; Feedforward neural networks; Hardware; Multi-layer neural network; Neural networks; Nonhomogeneous media; Upper bound;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
ieee
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.651033
Filename :
651033