The $L_\alpha$-distance between posterior density functions (PDF's) is proposed as a separability measure to replace the probability of error as a criterion for feature extraction in pattern recognition. Upper and lower bounds on the Bayes error are derived for $\alpha > 0$. If $\alpha = 1$, the lower and upper bounds coincide; an increase (or decrease) in $\alpha$ loosens these bounds. For $\alpha = 2$, the upper bound equals the best commonly used bound and is equal to the asymptotic probability of error of the first nearest neighbor classifier. The case when $\alpha = 1$
is used for estimation of the probability of error in different problem situations, and a comparison is made with other methods. It is shown how unclassified samples may also be used to improve the variance of the estimated error. For the family of exponential probability density functions (pdf\´s), the relation between the distance of a sample from the decision boundary and its contribution to the error is derived. In the nonparametric case, a consistent estimator is discussed which is computationally more efficient than estimators based on Parzen\´s estimation. A set of computer simulation experiments are reported to demonstrate the statistical advantages of the separability measure with

when used in an error estimation scheme.
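For orientation, the two-class form of the measure consistent with these statements (an assumed reconstruction, not a quotation from the paper) is
$$J_\alpha = \int \bigl| P(\omega_1 \mid x) - P(\omega_2 \mid x) \bigr|^{\alpha}\, p(x)\, dx,$$
and since $P(\omega_1 \mid x) + P(\omega_2 \mid x) = 1$ and $\min(a, b) = \tfrac{1}{2}\bigl[(a + b) - |a - b|\bigr]$, the Bayes error $P_e = \int \min\{P(\omega_1 \mid x), P(\omega_2 \mid x)\}\, p(x)\, dx$ reduces exactly to $P_e = \tfrac{1}{2}(1 - J_1)$, which is why the upper and lower bounds coincide at $\alpha = 1$.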
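A minimal Monte Carlo sketch of how the $\alpha = 1$ identity can drive an error-estimation scheme is given below, assuming a synthetic two-Gaussian problem with known posteriors; the setup, names, and parameters are illustrative and not the paper's procedure. Note that the class labels enter only the data generation, never the estimate itself, consistent with the remark that unclassified samples can be used.

```python
import numpy as np
from math import erf, sqrt

# Sketch (assumed setup, not the paper's): estimate the two-class Bayes
# error via the alpha = 1 identity  P_e = (1 - J_1) / 2,  where
#   J_1 = E_x[ |P(w1|x) - P(w2|x)| ]  over the mixture density p(x).

def posterior_gap(x, mu1=0.0, mu2=2.0, sigma=1.0):
    """|P(w1|x) - P(w2|x)| for equal priors and equal-variance Gaussians."""
    g1 = np.exp(-0.5 * ((x - mu1) / sigma) ** 2)
    g2 = np.exp(-0.5 * ((x - mu2) / sigma) ** 2)
    return np.abs(g1 - g2) / (g1 + g2)

rng = np.random.default_rng(0)
n = 100_000
labels = rng.integers(0, 2, size=n)   # used only to sample x from the mixture
x = rng.normal(np.where(labels == 0, 0.0, 2.0), 1.0)

j1_hat = posterior_gap(x).mean()      # Monte Carlo estimate of J_1
pe_hat = 0.5 * (1.0 - j1_hat)         # alpha = 1 error estimate

# Closed form for this Gaussian example: P_e = Phi(-|mu2 - mu1| / (2 * sigma)).
pe_true = 0.5 * (1.0 + erf(-1.0 / sqrt(2.0)))
print(f"estimated P_e = {pe_hat:.4f}, true P_e = {pe_true:.4f}")
```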