Title :
Learning with prior information
Author :
Campi, M.C. ; Vidyasagar, M.
Author_Institution :
Dept. of Electr. Eng. & Autom., Brescia Univ., Italy
Date :
11/1/2001
Abstract :
A notion of learnability is introduced, referred to as learnability with prior information (w.p.i.). This notion is weaker than the standard notion of probably approximately correct (PAC) learnability. A property called "dispersability" is introduced, and it is shown that dispersability plays a key role in the study of learnability w.p.i. Specifically, dispersability of a function class is always a sufficient condition for the function class to be learnable w.p.i.; moreover, in the case of concept classes, dispersability is also a necessary condition for learnability w.p.i. Thus, in the case of learnability w.p.i., the dispersability property plays a role similar to that of the finite metric entropy condition in the case of PAC learnability with a fixed distribution. Next, the notion of learnability w.p.i. is extended to the distribution-free (d.f.) situation, and it is shown that a property called d.f. dispersability is always a sufficient condition for d.f. learnability w.p.i., and is also a necessary condition for d.f. learnability in the case of concept classes. The approach to learning introduced in the paper is believed to be significant in all problems where a nonlinear system has to be designed based on data. This includes direct inverse control and system identification.
Keywords :
learning (artificial intelligence); nonlinear systems; probability; concept classes; direct inverse control; dispersability; distribution-free learnability; learning with prior information; necessary condition; nonlinear system; sufficient condition; system identification; Architecture; Artificial intelligence; Control systems; Entropy; Extraterrestrial measurements; Nonlinear systems; Robotics and automation; Service robots; Sufficient conditions; System identification;
Journal_Title :
IEEE Transactions on Automatic Control