DocumentCode :
324542
Title :
Contour preserving classification for maximal reliability
Author :
Tanprasert, Thitipong ; Tanprasert, Chularat ; Lursinsap, Chidchanok
Author_Institution :
Assumption Univ., Thailand
Volume :
2
fYear :
1998
fDate :
4-9 May 1998
Firstpage :
1125
Abstract :
This paper demonstrates that the robustness and weight fault tolerance of a neural network trained on a linearly separable problem can be enhanced if the network classifies the problem nonlinearly. However, a multilayer perceptron trained with an existing learning algorithm normally exploits the linear separability of the problem, resulting in a suboptimal solution in terms of robustness and fault tolerance. This paper presents a technique for forcing the network to optimize its internal representation toward robustness and fault tolerance. The technique introduces the concept of “outpost” vectors, which hide the unwanted linearly separable characteristics of the problem. Since this task is rather specific, the “outpost” vectors are determined deterministically rather than at random.
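Illustrative sketch (not taken from the paper): the abstract describes deterministically generated “outpost” vectors that mask the linear separability of the training set so that a multilayer perceptron learns a nonlinear, contour-hugging internal representation. The Python sketch below shows one plausible way such auxiliary vectors could be generated for a two-class problem; the function name outpost_vectors, the step parameter, and the placement rule (stepping each sample a fixed fraction of the way toward its nearest opposite-class neighbour while keeping its original label) are assumptions for illustration only, not the authors' published procedure.

    # Hypothetical illustration (not the authors' exact algorithm): for a
    # two-class, linearly separable training set, generate auxiliary
    # "outpost" vectors deterministically by stepping each sample a small,
    # fixed fraction of the way toward its nearest opposite-class neighbour,
    # keeping the original label. Training on the augmented set discourages
    # the network from settling on a single maximally separating hyperplane.
    import numpy as np

    def outpost_vectors(X, y, step=0.25):
        """Return the training set augmented with deterministic outpost vectors.

        X    : (n, d) array of training vectors
        y    : (n,) array of binary class labels
        step : fraction of the distance toward the nearest opposite-class
               neighbour at which each outpost vector is placed (assumed value)
        """
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        outposts, labels = [], []
        for x_i, y_i in zip(X, y):
            rivals = X[y != y_i]                      # samples of the other class
            nearest = rivals[np.argmin(np.linalg.norm(rivals - x_i, axis=1))]
            outposts.append(x_i + step * (nearest - x_i))  # move toward the boundary
            labels.append(y_i)                        # keep the original label
        return np.vstack([X, outposts]), np.concatenate([y, labels])

    # Example: augment a toy linearly separable set before MLP training.
    X = np.array([[0.0, 0.0], [0.0, 1.0], [3.0, 0.0], [3.0, 1.0]])
    y = np.array([0, 0, 1, 1])
    X_aug, y_aug = outpost_vectors(X, y)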
Keywords :
backpropagation; edge detection; fault tolerant computing; multilayer perceptrons; pattern classification; contour preserving classification; fault tolerance; learning algorithm; multilayer perceptron; Circuits; Fault tolerance; Multilayer perceptrons; Neural network hardware; Neural networks; Neurons; Redundancy; Robustness; Vectors; Very large scale integration;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the 1998 IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Anchorage, AK
ISSN :
1098-7576
Print_ISBN :
0-7803-4859-1
Type :
conf
DOI :
10.1109/IJCNN.1998.685930
Filename :
685930