Title :
Sparse posterior probability support vector machines
Author :
Dongli Wang ; Yan Zhou
Author_Institution :
Coll. of Inf. Eng., Xiangtan Univ., Xiangtan, China
Date :
June 29 – July 2, 2014
Abstract :
Posterior probability support vector machines (PPSVMs) have been shown to offer good generalization performance and robustness against outliers. However, a PPSVM lacks sparseness in its solution, i.e., the number of support vectors remains too large, which results in a high computational burden and long decision time. In this paper, we present two approaches to obtaining sparse PPSVMs, which are expected to combine the benefits of both PPSVMs and sparse classifiers. The first approach sparsifies the PPSVM by adding an l1-norm penalty to the dual cost function of the soft-margin PPSVM. The second solves a mixed l1-l2 multi-objective optimization problem with an interior-point algorithm. Simulation results show that both approaches achieve good generalization performance, good robustness against outliers, and high efficiency in decision evaluation.
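The abstract's first approach rests on a general mechanism: an l1-norm penalty on the dual (kernel-expansion) coefficients drives many of them exactly to zero, so only a few training points remain as support vectors. The paper's actual PPSVM dual is not given in this record, so the sketch below is only an illustration of that mechanism on a simpler stand-in objective: a kernel least-squares fit with an l1 penalty, solved by ISTA (iterative soft thresholding). The function names, the RBF kernel, the toy data, and all parameter values here are assumptions for demonstration, not the paper's formulation.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||X_i - Y_j||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def ista_l1_kernel(K, y, lam=1.0, iters=500):
    """Illustrative solver (not the paper's): minimize over a
    0.5 * ||K a - y||^2 + lam * ||a||_1 via ISTA.
    The soft-threshold step sets small coefficients exactly to zero,
    which is what yields a sparse set of 'support vectors'."""
    lr = 1.0 / np.linalg.norm(K, 2) ** 2  # step from the Lipschitz constant of the gradient
    a = np.zeros(K.shape[0])
    for _ in range(iters):
        g = K.T @ (K @ a - y)                                 # gradient of the smooth term
        a = a - lr * g
        a = np.sign(a) * np.maximum(np.abs(a) - lr * lam, 0)  # soft thresholding
    return a

# Toy two-class data (assumed for illustration): two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.5, (30, 2)), rng.normal(1.0, 0.5, (30, 2))])
y = np.hstack([-np.ones(30), np.ones(30)])

K = rbf_kernel(X, X)
alpha = ista_l1_kernel(K, y, lam=1.0)

n_sv = int((np.abs(alpha) > 1e-6).sum())        # surviving "support vectors"
acc = float((np.sign(K @ alpha) == y).mean())   # training accuracy of sign(K @ alpha)
```

On this toy problem the l1 penalty typically retains only a fraction of the 60 training points as nonzero coefficients while keeping the decision rule accurate, which is the sparsity/performance trade-off the abstract claims for the PPSVM case.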
Keywords :
optimisation; pattern classification; probability; support vector machines; decision evaluation; dual cost function; interior-point algorithm; l1 norm penalties; mixed l1-l2 multiobjective optimization; outliers; soft margin PPSVMs; sparse classifiers; sparse posterior probability support vector machines; Accuracy; Kernel; Optimization; Probability; Robustness; Support vector machines; Training; Support vector machine; compressed sensing; posterior probability; sparse representation
Conference_Titel :
Statistical Signal Processing (SSP), 2014 IEEE Workshop on
Conference_Location :
Gold Coast, VIC
DOI :
10.1109/SSP.2014.6884659