Title :
Bringing diverse classifiers to common grounds: dtransform
Author :
Parikh, Devi ; Chen, Tsuhan
Author_Institution :
Dept. of Electr. & Computer Engineering, Carnegie Mellon Univ., Pittsburgh, PA
fDate :
March 31 - April 4, 2008
Abstract :
Many classification scenarios employ multiple independently trained classifiers whose outputs must be combined. However, since each trained classifier exhibits different statistical characteristics, it is inappropriate to combine their outputs with techniques that are blind to these differences. We propose a transform, dtransform, that maps classifier outputs to approximate posterior probabilities while accounting for the statistical behavior of each classifier. The transformed outputs are directly comparable and can be combined using any of the classical combination rules. Our results demonstrate that the proposed transform provides better estimates of the posterior probabilities than standard transformations, as evidenced by lower KL distance from the true distribution, higher classification accuracies, and greater effectiveness of the standard classifier combination rules.
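The abstract does not detail the form of the dtransform itself, so the Python sketch below only illustrates the overall pipeline it describes: calibrating each classifier's raw scores into approximate posterior probabilities on held-out data, combining the calibrated posteriors with classical sum and product rules, and scoring the result by classification accuracy and KL distance to the true distribution. A generic per-classifier sigmoid calibration (in the spirit of Platt scaling) stands in for the proposed transform; the synthetic data and helper names (fit_sigmoid, posterior) are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of the pipeline described in the abstract: per-classifier
# calibration of raw scores into approximate posteriors, then classical combination.
# The actual dtransform is not specified here; a sigmoid (Platt-style) calibration
# is used as a stand-in. Data and parameters are synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-class problem: two classifiers whose raw scores live on very
# different scales (e.g., SVM margins vs. MLP logits), so averaging raw scores
# directly would be dominated by one of them.
n = 2000
y = rng.integers(0, 2, size=n)                       # true labels
score_a = 0.8 * (2 * y - 1) + rng.normal(0, 1.0, n)  # classifier A: small-scale scores
score_b = 6.0 * (2 * y - 1) + rng.normal(0, 4.0, n)  # classifier B: large-scale, noisier

train, test = np.arange(n) < n // 2, np.arange(n) >= n // 2


def fit_sigmoid(scores, labels, steps=3000, lr=0.05):
    """Fit p(y=1|s) = sigmoid(a*s + b) by gradient descent on the NLL (stand-in calibration)."""
    a, b = 1.0, 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(a * scores + b)))
        grad = p - labels                            # dNLL/dlogit
        a -= lr * np.mean(grad * scores)
        b -= lr * np.mean(grad)
    return a, b


def posterior(scores, params):
    """Map raw scores to a two-column array of approximate posteriors [p(y=0|s), p(y=1|s)]."""
    a, b = params
    p1 = 1.0 / (1.0 + np.exp(-(a * scores + b)))
    return np.stack([1 - p1, p1], axis=1)


# Calibrate each classifier on held-out data, then combine on the test split.
params_a = fit_sigmoid(score_a[train], y[train])
params_b = fit_sigmoid(score_b[train], y[train])
post_a = posterior(score_a[test], params_a)
post_b = posterior(score_b[test], params_b)

sum_rule = (post_a + post_b) / 2                     # classical sum rule
prod_rule = post_a * post_b                          # classical product rule
prod_rule /= prod_rule.sum(axis=1, keepdims=True)    # renormalise product rule

true_dist = np.eye(2)[y[test]]                       # one-hot "true" distribution
for name, p in [("sum rule", sum_rule), ("product rule", prod_rule)]:
    acc = np.mean(p.argmax(axis=1) == y[test])
    kl = np.mean(np.sum(true_dist * np.log((true_dist + 1e-12) / (p + 1e-12)), axis=1))
    print(f"{name}: accuracy={acc:.3f}, mean KL to true distribution={kl:.3f}")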
Keywords :
pattern classification; probability; approximate posterior probability; classifier combination; diverse classifiers; dtransform; intrusion detection; parametric transformation; Cost function; Detectors; Intrusion detection; Multi-layer neural network; Multilayer perceptrons; Neural networks; Probability; Support vector machine classification; Support vector machines; Testing; combining classifiers; dtransform; estimating posterior probabilities; intrusion detection; parametric transformation of classifier outputs;
Conference_Titel :
2008 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2008)
Conference_Location :
Las Vegas, NV
Print_ISBN :
978-1-4244-1483-3
ISSN :
1520-6149
DOI :
10.1109/ICASSP.2008.4518368