Title :
Granular weights in a neural network
Author :
Dick, Scott; Kandel, Abraham
Author_Institution :
Dept. of Comput. Sci. & Eng., Univ. of South Florida, Tampa, FL, USA
Abstract :
We investigate a mechanism for storing granular information in a neural network. The classic backpropagation network, which uses numeric connection weights, is known to be a universal approximator. However, training times for a backpropagation network can be very long, and the knowledge stored in these networks is exceptionally difficult to interpret. Using granular weights can speed up network training and increase clarity, at the cost of some loss in accuracy. We describe our network architecture, called the granular neural network (GNN), which uses linguistic weights, along with the rules of linguistic arithmetic developed for this network. In an initial experiment on the Iris data set, the GNN with a coarse granulation completed training in, on average, less than one tenth the number of epochs required by a backpropagation network.
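The abstract does not reproduce the paper's actual rules of linguistic arithmetic. As a minimal sketch of the general idea — weights drawn from a small linguistic term set, combined by fuzzy arithmetic, then re-granulated back to a term — the following assumes a triangular-fuzzy-number encoding and a nearest-term rounding step, both of which are illustrative choices and not the GNN's published rules:

```python
# Hypothetical sketch: linguistic weights as triangular fuzzy numbers.
# The term set, the triangular encoding, and the nearest-term rounding
# are illustrative assumptions, not the paper's actual rules.

# Coarse granulation: a small linguistic term set over [-1, 1].
TERMS = {
    "negative": (-1.0, -1.0, 0.0),   # (left, peak, right)
    "zero":     (-1.0, 0.0, 1.0),
    "positive": (0.0, 1.0, 1.0),
}

def add(a, b):
    """Standard fuzzy addition of triangular numbers (componentwise)."""
    return tuple(x + y for x, y in zip(a, b))

def multiply(a, b):
    """Approximate fuzzy multiplication via endpoint products."""
    products = [x * y for x in (a[0], a[2]) for y in (b[0], b[2])]
    return (min(products), a[1] * b[1], max(products))

def nearest_term(t):
    """Re-granulate: snap a computed triangle back to the closest term."""
    def dist(u, v):
        return sum((x - y) ** 2 for x, y in zip(u, v))
    return min(TERMS, key=lambda name: dist(TERMS[name], t))

# A "positive" weight times a "negative" input re-granulates to "negative".
result = nearest_term(multiply(TERMS["positive"], TERMS["negative"]))
```

Keeping every intermediate result inside the same small term set is what gives a coarse granulation its speed and readability: the search space of weight values collapses from the reals to a handful of labels.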
Keywords :
computational linguistics; fuzzy neural nets; learning (artificial intelligence); neural net architecture; fuzzy neural networks; granular computing; granular weights; learning; linguistic arithmetic; neural network; Arithmetic; Backpropagation; Computer networks; Fuzzy neural networks; Fuzzy set theory; Fuzzy sets; Intelligent networks; Iris; Knowledge representation; Neural networks;
Conference_Titel :
Joint 9th IFSA World Congress and 20th NAFIPS International Conference, 2001
Conference_Location :
Vancouver, BC
Print_ISBN :
0-7803-7078-3
DOI :
10.1109/NAFIPS.2001.943809