• DocumentCode
    2495474
  • Title
    Exploration vs. exploitation in active learning: A Bayesian approach
  • Author
    Bondu, A.; Lemaire, V.; Boullé, M.
  • Author_Institution
    EDF R&D, Clamart, France
  • fYear
    2010
  • fDate
    18-23 July 2010
  • Firstpage
    1
  • Lastpage
    7
  • Abstract
    The labeling of training examples can be a costly task in many supervised learning settings. Active learning strategies address this problem by selecting the unlabeled examples considered most useful for training a predictive model. The choice of examples to be labeled can be viewed as a trade-off between exploration and exploitation of the input data space. In this article, a new active learning strategy that manages this trade-off is proposed. The strategy is based on a Bayesian formalism that minimizes assumptions on the data. An experimental validation is conducted on a unidimensional dataset, where the objective is to estimate the position of a step function from noisy examples. Our approach compares favorably with an ad hoc strategy: the probabilistic dichotomy.
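    A minimal sketch of the experimental setting described above: locating the position of a step function from noisy labels with a Bayesian active-learning loop. This is only an illustration under assumed choices, a discretised posterior over the step position and a generic entropy-reduction query rule, and is not the strategy proposed in the paper; all names (grid, posterior, likelihood) and the label-noise model are hypothetical.

      import numpy as np

      # Hypothetical illustration (not the paper's method): Bayesian localisation
      # of a step threshold on [0, 1] from noisy binary labels, querying the point
      # that minimises the expected posterior entropy.
      rng = np.random.default_rng(0)
      grid = np.linspace(0.0, 1.0, 201)              # candidate step positions
      posterior = np.ones_like(grid) / grid.size     # uniform prior over the position
      true_theta, noise = 0.37, 0.2                  # hidden step, label-flip probability

      def likelihood(x, y):
          """P(y | x, theta) for every theta on the grid (noisy step function)."""
          clean = (x >= grid).astype(float)          # label 1 to the right of the step
          p_one = clean * (1 - noise) + (1 - clean) * noise
          return p_one if y == 1 else 1.0 - p_one

      def entropy(p):
          p = p[p > 0]
          return -np.sum(p * np.log(p))

      for _ in range(25):
          # Query selection: candidate x minimising the expected posterior entropy
          # (a generic information-based criterion, assumed for this sketch).
          best_x, best_score = None, np.inf
          for x in np.linspace(0.0, 1.0, 51):
              score = 0.0
              for y in (0, 1):
                  joint = posterior * likelihood(x, y)
                  p_y = joint.sum()
                  if p_y > 0:
                      score += p_y * entropy(joint / p_y)
              if score < best_score:
                  best_x, best_score = x, score

          # Simulate a noisy oracle label and update the posterior with Bayes' rule.
          clean_label = int(best_x >= true_theta)
          y = clean_label if rng.random() > noise else 1 - clean_label
          posterior *= likelihood(best_x, y)
          posterior /= posterior.sum()

      print("estimated step position:", grid[np.argmax(posterior)])

    Running the sketch concentrates the posterior around true_theta after a few dozen noisy queries; the paper's contribution is the query strategy itself, which this sketch does not reproduce.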
  • Keywords
    belief networks; formal logic; learning (artificial intelligence); probability; Bayesian formalism; active learning; probabilistic dichotomy; supervised learning; Data models; Labeling; Noise; Noise measurement; Predictive models; Probabilistic logic; Training
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    The 2010 International Joint Conference on Neural Networks (IJCNN)
  • Conference_Location
    Barcelona
  • ISSN
    1098-7576
  • Print_ISBN
    978-1-4244-6916-1
  • Type
    conf
  • DOI
    10.1109/IJCNN.2010.5596815
  • Filename
    5596815