DocumentCode :
1203532
Title :
A Neyman-Pearson approach to statistical learning
Author :
Scott, Clayton ; Nowak, Robert
Author_Institution :
Dept. of Stat., Rice Univ., Houston, TX, USA
Volume :
51
Issue :
11
fYear :
2005
Firstpage :
3806
Lastpage :
3819
Abstract :
The Neyman-Pearson (NP) approach to hypothesis testing is useful in situations where different types of error have different consequences or a priori probabilities are unknown. For any α>0, the NP lemma specifies the most powerful test of size α, but assumes the distributions for each hypothesis are known or (in some cases) the likelihood ratio is monotonic in an unknown parameter. This paper investigates an extension of NP theory to situations in which one has no knowledge of the underlying distributions except for a collection of independent and identically distributed (i.i.d.) training examples from each hypothesis. Building on a "fundamental lemma" of Cannon et al., we demonstrate that several concepts from statistical learning theory have counterparts in the NP context. Specifically, we consider constrained versions of empirical risk minimization (NP-ERM) and structural risk minimization (NP-SRM), and prove performance guarantees for both. General conditions are given under which NP-SRM leads to strong universal consistency. We also apply NP-SRM to (dyadic) decision trees to derive rates of convergence. Finally, we present explicit algorithms to implement NP-SRM for histograms and dyadic decision trees.
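The constrained empirical risk minimization (NP-ERM) described in the abstract can be sketched for a toy class of one-dimensional threshold rules. This is an illustrative choice only, not the paper's histogram or dyadic-decision-tree classes, and the function name and tolerance parameter `eps` are hypothetical:

```python
import numpy as np

def np_erm_thresholds(x0, x1, alpha, eps=0.0):
    """Illustrative NP-ERM over threshold rules h_t(x) = 1{x > t}.

    x0: i.i.d. samples from hypothesis 0 (null), x1: from hypothesis 1.
    Among thresholds whose empirical false-alarm rate on x0 is at most
    alpha + eps, return the one minimizing the empirical miss rate on x1.
    """
    # Candidate thresholds: the pooled sample points, plus -inf so the
    # "always declare hypothesis 1" rule is also in the class.
    candidates = np.concatenate(([-np.inf],
                                 np.sort(np.concatenate([x0, x1]))))
    best_t, best_miss = None, np.inf
    for t in candidates:
        false_alarm = np.mean(x0 > t)   # empirical P_0(h_t = 1)
        miss = np.mean(x1 <= t)         # empirical P_1(h_t = 0)
        # Constraint: empirical false alarm within the relaxed budget.
        if false_alarm <= alpha + eps and miss < best_miss:
            best_t, best_miss = t, miss
    return best_t, best_miss
```

For example, with `x0 = [0, 1, 2, 3]`, `x1 = [2, 3, 4, 5]`, and `alpha = 0.25`, only thresholds `t >= 2` satisfy the false-alarm constraint, and `t = 2` attains the smallest empirical miss rate (0.25) among them. The slack `eps` mirrors the relaxation used in the paper's analysis to keep the good classifiers feasible with high probability.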
Keywords :
convergence of numerical methods; decision trees; error analysis; learning (artificial intelligence); minimisation; probability; signal classification; ERM; NP theory; Neyman-Pearson approach; SRM; a priori probability; convergence rate; dyadic decision tree; empirical risk minimization; explicit algorithm; generalization error bound; histogram; hypothesis testing; i.i.d.; independent-identically distributed training; monotonic likelihood ratio; statistical learning; structural risk minimization; Buildings; Condition monitoring; Convergence; Decision trees; Filtering; Histograms; Probability; Risk management; Statistical learning; Testing; Generalization error bounds; Neyman–Pearson (NP) classification; statistical learning theory;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2005.856955
Filename :
1522642