Title :
Optimal bounds on approximation of submodular and XOS functions by juntas
Author :
Feldman, Vitaly ; Vondrak, Jan
Author_Institution :
IBM Res. - Almaden, San Jose, CA, USA
Abstract :
We investigate the approximability of several classes of real-valued functions by functions of a small number of variables (juntas). Our main results are tight bounds on the number of variables required to approximate a function f : {0, 1}^n → [0, 1] within ℓ2-error ε over the uniform distribution: If f is submodular, then it is ε-close to a function of O(1/ε²) variables. This is an exponential improvement over previously known results [1]. We note that Ω(1/ε²) variables are necessary even for linear functions. If f is fractionally subadditive (XOS), it is ε-close to a function of 2^{O(1/ε²)} variables. This result holds for all functions with low total ℓ1-influence and is a real-valued analogue of Friedgut's theorem for Boolean functions. We show that 2^{Ω(1/ε)} variables are necessary even for XOS functions. As applications of these results, we provide learning algorithms over the uniform distribution. For XOS functions, we give a PAC learning algorithm that runs in time 2^{1/poly(ε)} · poly(n). For submodular functions, we give an algorithm in the more demanding PMAC learning model [2], which requires a multiplicative (1 + γ)-factor approximation with probability at least 1 - ε over the target distribution. Our uniform-distribution algorithm runs in time 2^{1/poly(γε)} · poly(n). This is the first algorithm in the PMAC model that achieves a constant approximation factor arbitrarily close to 1 for all submodular functions (even over the uniform distribution); it relies crucially on our approximation-by-junta result. As follows from the lower bounds in [1], both of these algorithms are close to optimal. We also give applications to proper learning, testing, and agnostic learning with value queries for these classes.
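The two notions at the heart of the abstract — submodularity (diminishing returns) and approximation by a junta (a function of few variables) — can be illustrated on a toy example. The sketch below is purely illustrative and is not the paper's algorithm: the coverage function, the brute-force submodularity check, and the influence-based choice of junta coordinates are all assumptions made for this small demonstration over the uniform distribution on {0, 1}^n.

```python
# Hypothetical toy illustration (not the paper's algorithm): verify the
# diminishing-returns property of a small coverage function, then measure
# how well it is approximated in l2 by a junta on its most influential
# coordinates, over the uniform distribution on {0,1}^n.
from itertools import product

n = 6
# Example submodular function: normalized weighted coverage.
# Each coordinate "covers" a set of universe items (illustrative choice).
cover = [{0, 1}, {1, 2}, {2, 3}, {3, 4}, {4, 5}, {0, 5}]
total = 6.0

def f(x):
    """f : {0,1}^n -> [0,1], fraction of the universe covered by set x."""
    covered = set().union(*(cover[i] for i in range(n) if x[i]))
    return len(covered) / total

cube = list(product((0, 1), repeat=n))

def is_submodular():
    """Brute-force check of f(S+i)-f(S) >= f(T+i)-f(T) for all S <= T, i not in T."""
    sets = [frozenset(i for i in range(n) if x[i]) for x in cube]
    def val(s):
        return f(tuple(1 if i in s else 0 for i in range(n)))
    for S in sets:
        for T in sets:
            if not S <= T:
                continue
            for i in range(n):
                if i in T:
                    continue
                if val(S | {i}) - val(S) < val(T | {i}) - val(T) - 1e-12:
                    return False
    return True

def influence(i):
    """A simple influence proxy: E[(f(x) - f(x with bit i flipped))^2]."""
    acc = 0.0
    for x in cube:
        y = list(x); y[i] ^= 1
        acc += (f(x) - f(tuple(y))) ** 2
    return acc / len(cube)

def junta_l2_error(k):
    """Keep the k most influential coordinates, average f over the rest,
    and return the l2 distance between f and the resulting k-junta."""
    keep = sorted(range(n), key=influence, reverse=True)[:k]
    buckets = {}
    for x in cube:
        key = tuple(x[i] for i in keep)
        buckets.setdefault(key, []).append(f(x))
    g = {key: sum(vals) / len(vals) for key, vals in buckets.items()}
    err2 = sum((f(x) - g[tuple(x[i] for i in keep)]) ** 2
               for x in cube) / len(cube)
    return err2 ** 0.5
```

Averaging f over the dropped coordinates is the ℓ2-optimal way to complete a junta on a fixed set of variables (it is a conditional expectation), so the error can only decrease as more coordinates are kept; the paper's contribution is showing how few coordinates suffice for submodular and XOS functions.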
Keywords :
Boolean functions; approximation theory; learning (artificial intelligence); probability; Boolean function; Friedgut's theorem; PAC learning algorithm; PMAC learning model; PMAC model; XOS function; constant approximation factor; exponential improvement; juntas; learning algorithms; linear functions; multiplicative factor approximation; optimal bounds; probability; real-valued analogue; real-valued function; subadditive function; submodular function; uniform distribution algorithm; Additives; Approximation algorithms; Approximation methods; Boolean functions; Cost accounting; Game theory; Testing; PAC learning; approximation; fractionally-subadditive; junta; submodular; testing
Conference_Titel :
Information Theory and Applications Workshop (ITA), 2014
Conference_Location :
San Diego, CA
DOI :
10.1109/ITA.2014.6804263