DocumentCode :
2769509
Title :
Vector-Valued Support Vector Regression
Author :
Brudnak, Mark
Author_Institution :
Member, IEEE, researcher, U.S. Army RDECOM-TARDEC, 6501 E. 11 Mile Road, Warren, MI 48397-5000, USA. phone: 586-574-7355; email: brudnakm@tacom.army.mil
fYear :
2006
fDate :
0-0 0
Firstpage :
1562
Lastpage :
1569
Abstract :
A vector-valued extension of the support vector regression problem is presented. The vector-valued variant is developed by extending the notions of the estimator, the loss function, and the regularization functional from the scalar-valued case. Particular emphasis is placed on the chosen class of loss functions, which apply the ε-insensitive loss function to the p-norm of the error. The primal and dual optimization problems are derived, and the KKT conditions are developed. The general p-norm case is specialized for the 1-, 2- and p-norms. It is shown that the vector-valued variant is a true extension of the scalar-valued case, and that the vector-valued approach yields sparser representations in terms of support vectors than aggregated scalar-valued learning.
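As a sketch of the loss class named in the abstract (the notation below is assumed for illustration, not drawn from this record): applying the ε-insensitive loss to the p-norm of the vector error e = y − f(x) gives

```latex
L_{\varepsilon}^{(p)}(\mathbf{e}) \;=\; \max\bigl(0,\; \|\mathbf{e}\|_{p} - \varepsilon\bigr),
\qquad
\|\mathbf{e}\|_{p} \;=\; \Bigl(\textstyle\sum_{i=1}^{d} |e_{i}|^{p}\Bigr)^{1/p},
```

so the loss vanishes whenever the entire error vector lies inside an ε-ball in the p-norm. A training point incurring zero loss contributes no support vector, which is the mechanism behind the sparser representations the abstract reports relative to aggregated scalar-valued learning.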
Keywords :
learning (artificial intelligence); support vector machines; loss function; regularization functional; scalar-valued learning; sparse representations; vector-valued support vector regression; vector-valued variant; Costs; Hilbert space; Kernel; Kinematics; Learning systems; Optimization methods; Quaternions; Support vector machines;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 2006. IJCNN '06. International Joint Conference on
Conference_Location :
Vancouver, BC
Print_ISBN :
0-7803-9490-9
Type :
conf
DOI :
10.1109/IJCNN.2006.246619
Filename :
1716292