DocumentCode
671763
Title
Optimizing F-measure with non-convex loss and sparse linear classifiers
Author
Chinta, Punya Murthy ; Balamurugan, P. ; Shevade, Shirish ; Murty, M. Narasimha
Author_Institution
Amazon Dev. Center, Hyderabad, India
fYear
2013
fDate
4-9 Aug. 2013
Firstpage
1
Lastpage
8
Abstract
The F-measure is a popular performance metric for classification on imbalanced datasets. Optimizing it directly is challenging, since no closed-form solution exists. Current algorithms approximate the F-measure and design classifiers within a maximum-margin or logistic regression framework; these algorithms are not scalable, and the resulting classifiers are not robust to outliers. In this work, we propose a general framework for approximate F-measure maximization, together with a non-convex loss function that is robust to outliers. The use of an elastic net regularizer in the problem formulation enables simultaneous classifier design and feature selection. We present an efficient algorithm to solve the proposed formulation; it is simple and easy to implement. Numerical experiments on real-world benchmark datasets demonstrate that the proposed algorithm is fast and achieves better generalization performance than some existing approaches, making it a powerful alternative for optimizing the F-measure and designing a sparse classifier.
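The two standard ingredients named in the abstract can be made concrete with textbook definitions. The sketch below is illustrative only, not the paper's algorithm: `f_measure` computes the metric being (approximately) maximized, and `elastic_net_penalty` is the regularizer that yields simultaneous classifier design and feature selection (both function names are assumptions for this example).

```python
# Illustrative sketch only: standard definitions the paper builds on,
# not the authors' proposed optimization algorithm.

def f_measure(y_true, y_pred):
    """F1 score: harmonic mean of precision and recall,
    F1 = 2*TP / (2*TP + FP + FN)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 0.0

def elastic_net_penalty(w, lam=1.0, alpha=0.5):
    """Elastic net regularizer: lam * (alpha*||w||_1 + (1-alpha)/2*||w||_2^2).
    The L1 term drives weights to exactly zero (feature selection);
    the L2 term stabilizes the solution when features are correlated."""
    l1 = sum(abs(wi) for wi in w)
    l2 = sum(wi * wi for wi in w)
    return lam * (alpha * l1 + (1 - alpha) / 2 * l2)
```

Because the F-measure is a non-decomposable function of the whole prediction vector (it does not split into a sum of per-example losses), it cannot be plugged directly into standard empirical-risk minimization, which is why surrogate/approximation schemes like the one in the paper are needed.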
Keywords
optimisation; pattern classification; regression analysis; F-measure maximization; classifier design; elastic net regularizer; feature selection; generalization performance; logistic regression framework; maximum margin; nonconvex loss function; performance metric; real world benchmark datasets; sparse classifier; sparse linear classifiers; Algorithm design and analysis; Approximation algorithms; Approximation methods; Fasteners; Optimization; Training; Vectors;
fLanguage
English
Publisher
ieee
Conference_Title
Neural Networks (IJCNN), The 2013 International Joint Conference on
Conference_Location
Dallas, TX
ISSN
2161-4393
Print_ISBN
978-1-4673-6128-6
Type
conf
DOI
10.1109/IJCNN.2013.6707105
Filename
6707105
Link To Document