Title :
Normalization of Linear Support Vector Machines
Author :
Feng, Yiyong; Palomar, Daniel P.
Author_Institution :
Dept. of Electron. & Comput. Eng., Hong Kong Univ. of Sci. & Technol., Kowloon, China
Abstract :
In this paper, we start from the standard support vector machine (SVM) formulation and extend it to a general SVM formulation with a normalized margin. This yields a unified convex framework that admits many variations of the formulation with very diverse numerical performance. The proposed framework captures existing methods, namely the standard soft-margin SVM, the l1-SVM, and SVMs with standardization, feature selection, feature scaling, and many others, as special cases. Furthermore, the framework not only provides more insight into different SVMs from the “energy” and “penalty” points of view, helping us understand their connections and differences in a unified way, but also enables us to propose new SVMs that outperform the existing ones in some scenarios.
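For context, the standard soft-margin SVM that the abstract takes as its starting point is the convex program written below in LaTeX; the l1-SVM mentioned alongside it is the well-known variant that replaces the squared l2-norm “energy” term with the l1-norm penalty. The normalized-margin framework itself is not spelled out in this record, so only the textbook formulation is reproduced here, not the authors' generalization.

% Standard soft-margin SVM over training pairs (x_i, y_i) with labels y_i in {-1, +1}:
\begin{equation*}
\begin{aligned}
\min_{\mathbf{w},\, b,\, \boldsymbol{\xi}} \quad
  & \frac{1}{2}\,\|\mathbf{w}\|_2^2 \;+\; C \sum_{i=1}^{n} \xi_i \\
\text{subject to} \quad
  & y_i\left(\mathbf{w}^{\mathsf T}\mathbf{x}_i + b\right) \;\ge\; 1 - \xi_i,
  \qquad \xi_i \ge 0, \qquad i = 1, \dots, n.
\end{aligned}
\end{equation*}
% The l1-SVM cited in the abstract swaps the "energy" term (1/2)||w||_2^2 for the
% sparsity-promoting penalty ||w||_1 while keeping the same margin constraints.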
Keywords :
convex programming; feature extraction; support vector machines; feature scaling; feature selection; l1-SVM; linear support vector machine normalization; normalized margin; standard soft-margin SVM; unified convex framework; Convex functions; Linear matrix inequalities; Standards; Support vector machines; Training; Training data; Convex optimization; normalizations; support vector machines; unified framework
Journal_Title :
Signal Processing, IEEE Transactions on
DOI :
10.1109/TSP.2015.2443730