DocumentCode :
3493310
Title :
VC dimension bounds for higher-order neurons
Author :
Schmitt, Michael
Author_Institution :
Lehrstuhl Mathematik & Informatik, Ruhr-Universität Bochum, Germany
Volume :
2
fYear :
1999
fDate :
1999
Firstpage :
563
Abstract :
We investigate the sample complexity of learning with higher-order neurons. We calculate upper and lower bounds on the Vapnik-Chervonenkis (VC) dimension and the pseudo-dimension of higher-order neurons that allow unrestricted interactions among the input variables. In particular, we show that the degree of interaction is irrelevant for the VC dimension and that the individual degrees of the variables play only a minor role. Further, our results reveal that the crucial parameters affecting the VC dimension of higher-order neurons are the input dimension and the maximum number of occurrences of each variable. The lower bounds we establish are asymptotically almost tight; in particular, they show that the VC dimension is superlinear in the input dimension. Bounds for higher-order neurons with a sigmoidal activation function are also derived.
Keywords :
neural nets; VC dimension bounds; Vapnik-Chervonenkis dimension; asymptotically almost tight lower bounds; higher-order neurons; learning sample complexity; sigmoidal activation function; superlinear VC dimension; upper bounds;
fLanguage :
English
Publisher :
IET
Conference_Title :
Artificial Neural Networks, 1999. ICANN 99. Ninth International Conference on (Conf. Publ. No. 470)
Conference_Location :
Edinburgh
ISSN :
0537-9989
Print_ISBN :
0-85296-721-7
Type :
conf
DOI :
10.1049/cp:19991169
Filename :
817989