Title :
Neural networks: binary monotonic and multiple-valued
Author :
Zurada, Jacek M.
Author_Institution :
Dept. of Electr. & Comput. Eng., Louisville Univ., KY, USA
Abstract :
This paper demonstrates how conventional neural networks can be modified, extended, or generalized by introducing basic notions of multiple-valued logic into the definition of the neurons. It is shown that multilevel neurons produce useful attractor-type neural networks and lead to multistable memory cells. This opens up the possibility of storing a multiplicity of logic levels in a "generalized" Hopfield memory. Another interesting attractor-type network encodes information in the complex output values of the neurons, specifically in their phase angles. Working as a memory, this network is able to recognize many stored grey-level values at the output of a single neuron, and as such it represents an extension of bivalent information processors. Multilevel neurons can also be employed in perceptron-type classifiers trained with the error backpropagation algorithm; the resulting networks are smaller, requiring fewer weights and neurons for typical classification tasks. This improvement comes at the cost of considerably more complex neuron activation functions.
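The phase-angle encoding described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the function names are invented here, and it assumes the common multiple-valued-neuron discretization in which the weighted sum of complex inputs is mapped to the nearest of the k-th roots of unity, so a single neuron's output can carry one of k distinct (e.g. grey-level) values in its phase.

```python
import cmath
import math


def mvn_activation(z: complex, k: int) -> complex:
    """k-valued activation for a complex-valued neuron (sketch).

    Maps the weighted sum z onto the k-th root of unity whose angular
    sector contains arg(z), so all information is carried by the phase.
    """
    sector = 2 * math.pi / k
    # Index of the angular sector containing arg(z), taken in [0, 2*pi).
    j = int(math.floor((cmath.phase(z) % (2 * math.pi)) / sector))
    return cmath.exp(1j * j * sector)


def mvn_output(weights, inputs, k: int) -> complex:
    """Complex weighted sum followed by the k-valued activation."""
    z = sum(w * x for w, x in zip(weights, inputs))
    return mvn_activation(z, k)
```

For example, with k = 4 the output is one of the four values 1, i, -1, -i, i.e. two bits of information per neuron rather than one, which is the sense in which such a network extends a bivalent information processor.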
Keywords :
multivalued logic; neural nets; attractor-type network; attractor-type neural networks; error backpropagation; multiple-valued logic; multistable memory cells; neural networks; Artificial neural networks; Biological neural networks; Biological system modeling; Computer architecture; Computer networks; Mathematical model; Neural networks; Neurons; Parallel processing; Physics computing;
Conference_Title :
Proceedings of the 30th IEEE International Symposium on Multiple-Valued Logic (ISMVL 2000)
Conference_Location :
Portland, OR, USA
Print_ISBN :
0-7695-0692-5
DOI :
10.1109/ISMVL.2000.848602