Title :
VC dimension bounds for higher-order neurons
Author :
Schmitt, Michael
Author_Institution :
Lehrstuhl Mathematik und Informatik, Ruhr-Universität Bochum, Germany
Abstract :
We investigate the sample complexity of learning with higher-order neurons. We derive upper and lower bounds on the Vapnik-Chervonenkis (VC) dimension and the pseudo-dimension of higher-order neurons that allow unrestricted interactions among the input variables. In particular, we show that the degree of interaction is irrelevant for the VC dimension and that the individual degree of the variables plays only a minor role. Our results reveal that the crucial parameters affecting the VC dimension of higher-order neurons are the input dimension and the maximum number of occurrences of each variable. The lower bounds we establish are asymptotically almost tight; in particular, they show that the VC dimension is superlinear in the input dimension. Bounds for higher-order neurons with sigmoidal activation function are also derived.
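For context, a higher-order (sigma-pi) neuron applies an activation function to a polynomial in the inputs, i.e. a weighted sum of product terms (monomials) over the input variables. The following is a minimal sketch of such a unit; the specific monomials, weights, and tanh activation are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def higher_order_neuron(x, monomials, weights, bias, sigma=np.tanh):
    # Weighted sum of product terms (monomials) plus a bias,
    # passed through the activation function sigma.
    s = bias + sum(w * np.prod(x[list(m)]) for w, m in zip(weights, monomials))
    return sigma(s)

x = np.array([0.5, -1.0, 2.0])
# One degree-3 interaction term x0*x1*x2 and one linear term x1;
# each variable occurs at most once in each monomial here.
print(higher_order_neuron(x, monomials=[(0, 1, 2), (1,)],
                          weights=[1.5, -0.3], bias=0.1))
```

In this parameterization, the input dimension is the length of x and the number of occurrences of a variable is how many monomials it appears in; these are the two quantities the abstract identifies as decisive for the VC dimension.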
Keywords :
neural nets; VC dimension bounds; Vapnik-Chervonenkis dimension; asymptotically almost tight lower bounds; higher-order neurons; learning sample complexity; sigmoidal activation function; superlinear VC dimension; upper bounds
Conference_Title :
Ninth International Conference on Artificial Neural Networks (ICANN 99), 1999 (Conf. Publ. No. 470)
Conference_Location :
Edinburgh, UK
Print_ISBN :
0-85296-721-7
DOI :
10.1049/cp:19991169