Title :
Enhanced lower entropy bounds with application to constructive learning
Author_Institution :
Div. NIS-1, Los Alamos Nat. Lab., NM, USA
Abstract :
We prove two new lower bounds on the number of bits required by neural networks for classification problems defined by m examples from IR^n. They are obtained in a constructive way and can be used for designing constructive learning algorithms. The results rely on upper bounding the space with an n-dimensional ball (V. Beiu, 1996). Recently, a better upper bound was presented (V. Beiu and T. DePauw, 1997) by showing that the volume of the ball can always be replaced by the volume of the intersection of two balls. A lower bound for the case of integer weights in the range [-p,p] was detailed by S. Draghici and I.K. Sethi (1997); it is based on computing the logarithm of the quotient between the volume of the ball containing all the examples and the maximum volume of a polyhedron. A first improvement comes from a tighter upper bound on the maximum volume of the polyhedron, obtained using two n-dimensional cones (instead of the ball). A second, even better, bound is obtained by upper bounding the space by the intersection of two balls.
Keywords :
entropy; learning (artificial intelligence); neural nets; pattern classification; IR^n; classification problems; constructive learning algorithms; enhanced lower entropy bounds; integer weights; logarithm; maximum volume; n dimensional ball; n dimensional cones; neural networks; polyhedron; quotient; upper bounding; Computer networks; Electronic mail; Entropy; Feedforward neural networks; Laboratories; Multi-layer neural network; Neural networks; Neurons; Postal services; Upper bound;
Conference_Titel :
Proceedings of the 23rd EUROMICRO Conference: New Frontiers of Information Technology (EUROMICRO 97)
Conference_Location :
Budapest, Hungary
Print_ISBN :
0-8186-8129-2
DOI :
10.1109/EURMIC.1997.617371
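The abstract's bound rests on a volume-ratio argument: the number of bits needed is at least the logarithm of the quotient between the volume of the ball containing all the examples and the maximum volume of a single cell (polyhedron) the network can carve out. A minimal Python sketch of that recipe follows; the function names and the example radii are illustrative assumptions, and the paper's sharper cone and intersection-of-balls bounds are not reproduced here.

```python
import math

def ball_volume(n, r):
    """Volume of an n-dimensional ball of radius r:
    V_n(r) = pi^(n/2) / Gamma(n/2 + 1) * r^n."""
    return math.pi ** (n / 2) / math.gamma(n / 2 + 1) * r ** n

def entropy_lower_bound_bits(container_volume, max_cell_volume):
    """Generic volume-ratio entropy bound: if every region the network
    can separate has volume at most max_cell_volume, then covering a
    container of volume container_volume needs at least
    log2(container / cell) bits."""
    return math.log2(container_volume / max_cell_volume)

# Hypothetical numbers: examples contained in a unit ball in IR^10,
# each polyhedral cell no larger than a ball of radius 0.1.
n = 10
bits = entropy_lower_bound_bits(ball_volume(n, 1.0), ball_volume(n, 0.1))
# bits = log2(10^10) = 10 * log2(10), roughly 33.2 bits
```

The paper's improvements tighten exactly the two volumes in this quotient: replacing the containing ball by the (smaller) intersection of two balls, and replacing the ball bound on the polyhedron by two n-dimensional cones, both of which enlarge the logarithm and hence the lower bound.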