Title :
Solving parity-N problems with feedforward neural networks
Author :
Wilamowski, Bogdan M. ; Hunter, David ; Malinowski, Aleksander
Author_Institution :
Boise Graduate Center, University of Idaho, Moscow, ID, USA
Abstract :
Several neural network architectures for solving parity-N problems are described. Feedforward networks with one hidden layer require N neurons in the hidden layer. If fully connected feedforward networks are considered, the number of hidden neurons drops to N/2. For fully connected networks with neurons connected in cascade, the minimum number of hidden neurons lies between log2(N+1)-1 and log2(N+1). The paper also describes hybrid neuron architectures with linear and threshold-like activation functions; these hybrid architectures require the fewest weights. The described architectures are well suited to hardware implementation because most weights equal +1 and no weight multiplication is required. The simplest network structures are pipeline architectures, in which all neurons and their weights are identical. All presented architectures and equations were verified with MATLAB code for parity-N problems as large as N = 100.
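To illustrate the single-hidden-layer construction the abstract refers to, the sketch below builds a parity-N network with N threshold neurons in the hidden layer: all input weights are +1, hidden neuron k fires when at least k inputs are 1, and the output neuron combines them with alternating +1/-1 weights. This is a minimal sketch in Python rather than the authors' MATLAB code; the function name parity_net and the exhaustive check at N = 8 are illustrative assumptions, not taken from the paper.

import itertools

def parity_net(x):
    # One hidden layer with N threshold neurons; every input weight is +1,
    # so each hidden neuron only sees the count of active inputs.
    n = len(x)
    s = sum(x)
    # Hidden neuron k (k = 1..N) has bias -(k - 0.5): it fires when
    # at least k of the N inputs equal 1.
    hidden = [1 if s > k - 0.5 else 0 for k in range(1, n + 1)]
    # Output neuron: weights alternate +1, -1, +1, ... with threshold 0.5,
    # so the output is 1 exactly when an odd number of inputs is 1.
    out = sum(((-1) ** k) * h for k, h in enumerate(hidden))
    return 1 if out > 0.5 else 0

# Exhaustive check for a small N (the paper reports MATLAB verification up to N = 100).
N = 8
assert all(parity_net(x) == sum(x) % 2
           for x in itertools.product([0, 1], repeat=N))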
Keywords :
feedforward neural nets; neural net architecture; MATLAB code; fully connected feedforward networks; hardware implementation; hidden layer; hybrid neuron architectures; linear activation functions; neural network architectures; parity-N problems; pipeline architectures; threshold-like activation functions; Circuits; Computer architecture; Computer networks; Digital systems; Equations; Feedforward neural networks; Hardware; Neural networks; Neurons; Pipelines
Conference_Title :
Proceedings of the International Joint Conference on Neural Networks, 2003
Print_ISBN :
0-7803-7898-9
DOI :
10.1109/IJCNN.2003.1223966