DocumentCode
2515225
Title
Fast Training of Object Detection Using Stochastic Gradient Descent
Author
Wijnhoven, Rob G. J.; De With, Peter H. N.
Author_Institution
ViNotion BV, Eindhoven, Netherlands
fYear
2010
fDate
23-26 Aug. 2010
Firstpage
424
Lastpage
427
Abstract
Training datasets for object detection are typically very large, and Support Vector Machine (SVM) implementations are computationally complex. As an alternative to these complex techniques, we use Stochastic Gradient Descent (SGD) algorithms, which use only a single new training sample per iteration and process samples in a stream-like fashion. We have incorporated SGD optimization into an object detection framework. The object detection problem is typically highly asymmetric, because object appearance varies far less than the background. Incorporating SGD speeds up the optimization significantly: a single iteration over the training set suffices to obtain results comparable to state-of-the-art SVM techniques. SGD optimization scales linearly in time, and the obtained speedup in computation time is two to three orders of magnitude. We show that by considering only part of the total training set, SGD converges quickly to the overall optimum.
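Since no code accompanies this record, the following is a minimal sketch of the training scheme the abstract describes: stream-like SGD on a regularized hinge loss (a linear SVM objective), processing one sample per iteration and making a single pass over the data. The function name sgd_linear_svm, the learning-rate schedule, the hyperparameters, and the toy data are assumptions for illustration only, not the authors' implementation; in the paper's framework the feature vectors would be HOG descriptors.

```python
# Minimal sketch (not the authors' code): single-pass SGD on the regularized
# hinge loss, i.e. a linear SVM trained in a stream-like fashion, one sample
# per iteration. Random toy data stands in for HOG descriptors.
import numpy as np

def sgd_linear_svm(X, y, lam=1e-4, epochs=1):
    """A few SGD passes over (X, y), labels in {-1, +1}; returns (w, b)."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    t = 0
    for _ in range(epochs):
        for i in np.random.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)            # decaying learning rate
            margin = y[i] * (X[i] @ w + b)
            # subgradient step on lam/2*||w||^2 + max(0, 1 - margin)
            if margin < 1.0:
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:
                w = (1 - eta * lam) * w
    return w, b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy asymmetric set: few "object" positives, many "background" negatives.
    pos = rng.normal(loc=+1.0, scale=0.5, size=(200, 36))
    neg = rng.normal(loc=-1.0, scale=1.0, size=(2000, 36))
    X = np.vstack([pos, neg])
    y = np.hstack([np.ones(len(pos)), -np.ones(len(neg))])
    w, b = sgd_linear_svm(X, y, epochs=1)    # single pass, as in the abstract
    acc = np.mean(np.sign(X @ w + b) == y)
    print(f"training accuracy after one pass: {acc:.3f}")
```

Each sample is touched once and then discarded, so the cost grows linearly with the number of samples, which is the scalability property the abstract emphasizes.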
Keywords
gradient methods; object detection; stochastic processes; support vector machines; SGD; SVM; fast training; optimization process; stochastic gradient descent; training datasets; computer vision; feature extraction; optimization; pattern recognition; training; HOG; classification; detection; histogram of oriented gradients; object recognition
fLanguage
English
Publisher
IEEE
Conference_Title
2010 20th International Conference on Pattern Recognition (ICPR)
Conference_Location
Istanbul, Turkey
ISSN
1051-4651
Print_ISBN
978-1-4244-7542-1
Type
conf
DOI
10.1109/ICPR.2010.112
Filename
5597822