DocumentCode :
3059368
Title :
Bias-variance tradeoff in hybrid generative-discriminative models
Author :
Bouchard, Guillaume
Author_Institution :
Xerox Research Centre Europe, Meylan, France
fYear :
2007
fDate :
13-15 Dec. 2007
Firstpage :
124
Lastpage :
129
Abstract :
Given any generative classifier based on an inexact density model, we can define a discriminative counterpart that reduces its asymptotic error rate while increasing the estimation variance. An optimal bias-variance balance might be found using hybrid generative-discriminative (HGD) approaches. In this paper, these methods are defined in a unified framework, which allows us to find sufficient conditions under which an improvement in generalization performance is guaranteed. Numerical experiments illustrate the soundness of our statements.
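The sketch below is not the paper's implementation, only a minimal illustration of the hybrid generative-discriminative idea the abstract describes: a single hypothetical hyperparameter `alpha` interpolates between a generative objective (alpha = 1, joint log-likelihood) and a discriminative one (alpha = 0, conditional log-likelihood) for a simple two-class Gaussian class-conditional model.

```python
# Minimal sketch (assumed formulation, not the paper's code): hybrid
# generative-discriminative training of a two-class isotropic Gaussian model.
import numpy as np
from scipy.optimize import minimize

def unpack(theta, d):
    # parameters: class-prior logit, two class means, log of shared isotropic variance
    prior_logit = theta[0]
    mu0, mu1 = theta[1:1 + d], theta[1 + d:1 + 2 * d]
    log_var = theta[1 + 2 * d]
    return prior_logit, mu0, mu1, log_var

def log_joint(theta, X, y):
    # per-example log p(x, y; theta) and log p(x; theta)
    d = X.shape[1]
    prior_logit, mu0, mu1, log_var = unpack(theta, d)
    log_p1 = -np.logaddexp(0.0, -prior_logit)   # log sigmoid(prior_logit)
    log_p0 = -np.logaddexp(0.0, prior_logit)
    var = np.exp(log_var)
    def log_gauss(X, mu):
        return -0.5 * (np.sum((X - mu) ** 2, axis=1) / var
                       + d * (np.log(2 * np.pi) + log_var))
    lj0 = log_p0 + log_gauss(X, mu0)
    lj1 = log_p1 + log_gauss(X, mu1)
    return np.where(y == 1, lj1, lj0), np.logaddexp(lj0, lj1)

def hybrid_neg_loglik(theta, X, y, alpha):
    joint, marginal = log_joint(theta, X, y)
    generative = joint.sum()                     # sum_i log p(x_i, y_i)
    discriminative = (joint - marginal).sum()    # sum_i log p(y_i | x_i)
    return -(alpha * generative + (1 - alpha) * discriminative)

# toy usage on synthetic data
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
theta0 = np.zeros(1 + 2 * 2 + 1)
res = minimize(hybrid_neg_loglik, theta0, args=(X, y, 0.5), method="L-BFGS-B")
```

Intermediate values of `alpha` trade the lower asymptotic error of the discriminative fit against the lower estimation variance of the generative fit, which is the balance the abstract refers to.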
Keywords :
learning (artificial intelligence); pattern classification; asymptotic error rate; bias-variance tradeoff; generative classifier; hybrid generative-discriminative models; Error analysis; Europe; Hybrid power systems; Machine learning; Maximum a posteriori estimation; Parameter estimation; Predictive models; Robustness; Statistics; Sufficient conditions;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Machine Learning and Applications, 2007. ICMLA 2007. Sixth International Conference on
Conference_Location :
Cincinnati, OH
Print_ISBN :
978-0-7695-3069-7
Type :
conf
DOI :
10.1109/ICMLA.2007.85
Filename :
4457219