DocumentCode :
1318532
Title :
Sample complexity for learning recurrent perceptron mappings
Author :
DasGupta, Bhaskar ; Sontag, Eduardo D.
Author_Institution :
Dept. of Comput. Sci., Waterloo Univ., Ont., Canada
Volume :
42
Issue :
5
fYear :
1996
fDate :
9/1/1996
Firstpage :
1479
Lastpage :
1487
Abstract :
Recurrent perceptron classifiers generalize the usual perceptron model. They correspond to linear transformations of input vectors obtained by means of “autoregressive moving-average schemes”, or infinite impulse response filters, and take into account those correlations and dependences among input coordinates which arise from linear digital filtering. This paper provides tight bounds on the sample complexity associated with the fitting of such models to experimental data. The results are expressed in the context of the theory of probably approximately correct (PAC) learning.
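The abstract describes a recurrent perceptron as a classifier that thresholds the output of a linear ARMA (IIR) filter applied to the input sequence. A minimal illustrative sketch of that idea follows; the function name, the specific recursion, and the choice of thresholding the final filter output are assumptions for illustration, not the paper's exact definition.

```python
import numpy as np

def recurrent_perceptron(u, a, b):
    """Illustrative recurrent perceptron: run a linear ARMA/IIR filter
    over the input sequence u and classify by the sign of the final output.

    Assumed recursion (hypothetical, for illustration):
        y[t] = sum_i a[i] * y[t-1-i] + sum_j b[j] * u[t-j]
    """
    y = np.zeros(len(u))
    for t in range(len(u)):
        # Autoregressive part: feedback from past filter outputs.
        ar = sum(a[i] * y[t - 1 - i] for i in range(len(a)) if t - 1 - i >= 0)
        # Moving-average part: weighted current and past inputs.
        ma = sum(b[j] * u[t - j] for j in range(len(b)) if t - j >= 0)
        y[t] = ar + ma
    # Perceptron-style decision: sign of the last filter output.
    return 1 if y[-1] >= 0 else -1
```

With `a = [0.0]` the feedback vanishes and the classifier reduces to the sign of `b[0] * u[-1]`, recovering an ordinary (memoryless) perceptron on the last coordinate; nonzero `a` makes the decision depend on correlations across the whole input sequence, which is the dependence structure the paper's sample-complexity bounds address.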
Keywords :
IIR filters; autoregressive moving average processes; correlation methods; digital filters; filtering theory; learning (artificial intelligence); multilayer perceptrons; pattern classification; recurrent neural nets; signal sampling; autoregressive moving-average; correlations; experimental data; infinite impulse response filters; input coordinates; input vectors; linear digital filtering; linear transformations; perceptron model; probably approximately correct learning; recurrent perceptron classifiers; recurrent perceptron mappings; sample complexity; tight bounds; Digital filters; Filtering; IIR filters; Information processing; Input variables; Linear programming; Neural networks; Nonlinear filters; Recurrent neural networks; Vectors;
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.532888
Filename :
532888