DocumentCode
3166164
Title
Bandit-Based Algorithms for Budgeted Learning
Author
Deng, Kun; Bourke, Chris; Scott, Stephen; Sunderman, Julie; Zheng, Yaling
Author_Institution
Univ. of Nebraska-Lincoln, Lincoln
fYear
2007
fDate
28-31 Oct. 2007
Firstpage
463
Lastpage
468
Abstract
We explore the problem of budgeted machine learning, in which the learning algorithm has free access to the training examples' labels but must pay for each attribute value that it requests. This learning model is appropriate in many areas, including medical applications. We present new algorithms, based on algorithms for the multi-armed bandit problem, for choosing which attributes of which examples to purchase in the budgeted learning model. All of our approaches outperformed the current state of the art. Furthermore, we present a new means of selecting an example whose attribute to purchase once an attribute has been chosen, rather than selecting an example uniformly at random, as is typically done. Our new example-selection method improved the performance of all the algorithms we tested, both ours and those from the literature.
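The abstract does not spell out the algorithms' details, so the sketch below is only a rough illustration of the general idea: treat each attribute as a bandit arm and spend the budget by repeatedly choosing an arm with a UCB1-style index. The purchase_value callback, the [0, 1] reward it returns, and the budget accounting are hypothetical assumptions for this sketch, not the paper's method; the paper's example-selection rule is likewise not modeled here.

import math
import random

def budgeted_attribute_purchase(num_attributes, budget, purchase_value):
    """Sketch of a UCB1-style loop for budgeted attribute purchasing.

    Each attribute is treated as a bandit arm. purchase_value(attr) is a
    hypothetical callback that buys one example's value for that attribute
    and returns a reward in [0, 1] (e.g., an estimate of how much the
    purchase helps the classifier). Assumption-based illustration only.
    """
    counts = [0] * num_attributes     # purchases made per attribute
    rewards = [0.0] * num_attributes  # cumulative reward per attribute

    # Initialization: purchase each attribute once so every arm has an estimate.
    for a in range(num_attributes):
        rewards[a] += purchase_value(a)
        counts[a] += 1

    spent = num_attributes
    while spent < budget:
        # UCB1 index: empirical mean plus an exploration bonus.
        ucb = [
            rewards[a] / counts[a] + math.sqrt(2.0 * math.log(spent) / counts[a])
            for a in range(num_attributes)
        ]
        best = max(range(num_attributes), key=lambda a: ucb[a])
        rewards[best] += purchase_value(best)
        counts[best] += 1
        spent += 1

    return counts  # how the budget was allocated across attributes

if __name__ == "__main__":
    # Toy demo: per-attribute reward probabilities are made up for illustration.
    means = [0.2, 0.5, 0.8]
    alloc = budgeted_attribute_purchase(
        num_attributes=3,
        budget=60,
        purchase_value=lambda a: float(random.random() < means[a]),
    )
    print("budget allocation per attribute:", alloc)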
Keywords
learning (artificial intelligence); bandit-based algorithm; budgeted machine learning problem; multi-armed bandit problem; Biomedical equipment; Computer science; Costs; Data engineering; Data mining; Machine learning; Machine learning algorithms; Medical services; Testing; USA Councils;
fLanguage
English
Publisher
ieee
Conference_Title
Seventh IEEE International Conference on Data Mining (ICDM 2007)
Conference_Location
Omaha, NE
ISSN
1550-4786
Print_ISBN
978-0-7695-3018-5
Type
conf
DOI
10.1109/ICDM.2007.91
Filename
4470274