DocumentCode
3752192
Title
A nonexpansive operator for computationally efficient hierarchical convex optimization
Author
Masao Yamagishi;Isao Yamada
Author_Institution
Dept. of Communications and Computer Engineering, Tokyo Institute of Technology
fYear
2015
Firstpage
1100
Lastpage
1106
Abstract
The hybrid steepest descent method is an algorithmic solution to hierarchical convex optimization, a class of two-stage optimization problems: the first-stage problem is a convex optimization problem, and the second-stage problem is the minimization of a differentiable convex function over the solution set of the first-stage problem. To apply this method, the solution set of the first-stage problem must be expressed as the fixed point set of a nonexpansive operator. In this paper, we propose a nonexpansive operator that yields a computationally efficient update when plugged into the hybrid steepest descent method. The proposed operator can characterize the solution sets of recent sophisticated convex optimization problems in which multiple proximable convex functions involving linear operators must be minimized. To the best of our knowledge, for such problems, no nonexpansive operator has been reported that yields an update free from inversions of the linear operators when used in the hybrid steepest descent method. Unlike conventional operators, the proposed operator yields an inversion-free update.
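The following is a minimal LaTeX sketch of the generic hybrid steepest descent iteration outlined in the abstract, not the specific operator proposed in the paper; the symbols T (a nonexpansive operator whose fixed point set is the first-stage solution set), Theta (the differentiable convex second-stage objective), and the stepsize sequence (lambda_n) are assumed notation for illustration.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Two-stage (hierarchical) convex optimization:
%   Stage 1: a convex problem with solution set S.
%   Stage 2: minimize a differentiable convex Theta over S.
% Hybrid steepest descent iteration, assuming S = Fix(T) for a
% nonexpansive operator T and a vanishing stepsize sequence
% (lambda_n -> 0 with divergent sum):
\[
  x_{n+1} \;=\; T(x_n) \;-\; \lambda_{n+1}\,\nabla\Theta\bigl(T(x_n)\bigr),
  \qquad n = 0, 1, 2, \dots
\]
\end{document}

The contribution described in the abstract concerns the choice of T: the proposed operator lets each application of T avoid inverting the linear operators appearing in the first-stage problem.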
Publisher
ieee
Conference_Titel
Signal and Information Processing Association Annual Summit and Conference (APSIPA), 2015 Asia-Pacific
Type
conf
DOI
10.1109/APSIPA.2015.7415442
Filename
7415442
Link To Document