  • DocumentCode
    180810
  • Title
    Private Empirical Risk Minimization: Efficient Algorithms and Tight Error Bounds
  • Author
    Bassily, Raef; Smith, A.; Thakurta, Abhradeep
  • fYear
    2014
  • fDate
    18-21 Oct. 2014
  • Firstpage
    464
  • Lastpage
    473
  • Abstract
    Convex empirical risk minimization is a basic tool in machine learning and statistics. We provide new algorithms and matching lower bounds for differentially private convex empirical risk minimization assuming only that each data point's contribution to the loss function is Lipschitz and that the domain of optimization is bounded. We provide a separate set of algorithms and matching lower bounds for the setting in which the loss functions are known to also be strongly convex. Our algorithms run in polynomial time, and in some cases even match the optimal nonprivate running time (as measured by oracle complexity). We give separate algorithms (and lower bounds) for (ε, 0)- and (ε, δ)-differential privacy; perhaps surprisingly, the techniques used for designing optimal algorithms in the two cases are completely different. Our lower bounds apply even to very simple, smooth function families, such as linear and quadratic functions. This implies that algorithms from previous work can be used to obtain optimal error rates, under the additional assumption that the contribution of each data point to the loss function is smooth. We show that simple approaches to smoothing arbitrary loss functions (in order to apply previous techniques) do not yield optimal error rates. In particular, optimal algorithms were not previously known for problems such as training support vector machines and the high-dimensional median.
    (An illustrative sketch of the noisy gradient descent approach appears after this record.)
  • Keywords
    computational complexity; convex programming; learning (artificial intelligence); minimisation; (ε, δ)-differential privacy; (ε, 0)-differential privacy; Lipschitz loss function; arbitrary loss function smoothing; machine learning; optimal nonprivate running time; oracle complexity; polynomial time; private convex empirical risk minimization; smooth function families; statistics; Algorithm design and analysis; Convex functions; Noise measurement; Optimization; Privacy; Risk management; Support vector machines
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    2014 IEEE 55th Annual Symposium on Foundations of Computer Science (FOCS)
  • Conference_Location
    Philadelphia, PA
  • ISSN
    0272-5428
  • Type
    conf
  • DOI
    10.1109/FOCS.2014.56
  • Filename
    6979031
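
The abstract above concerns differentially private convex ERM with Lipschitz losses over a bounded domain; for the (ε, δ) regime, noisy gradient methods are a standard approach in this setting. The Python sketch below is illustrative only: the names dp_gd_erm and hinge_grad, the step-size schedule, and the per-iteration Gaussian noise calibration are simplifying assumptions, not the paper's exact algorithm or constants.

```python
import numpy as np

def dp_gd_erm(data, grad_fn, dim, radius, L, epsilon, delta, T=100, seed=0):
    """Sketch of (epsilon, delta)-DP ERM via noisy projected gradient descent.

    Simplifying assumptions (not the paper's exact construction):
      - grad_fn(theta, example) returns the per-example loss gradient,
        with l2 norm at most L (the loss is L-Lipschitz in theta);
      - the feasible set is the l2 ball of radius `radius`;
      - Gaussian noise uses a sqrt(T) composition-style scaling.
    """
    rng = np.random.default_rng(seed)
    n = len(data)
    theta = np.zeros(dim)
    # Replacing one example changes the average gradient by at most
    # 2L/n in l2 norm, so that is the per-iteration sensitivity.
    sigma = (2.0 * L / n) * np.sqrt(2.0 * T * np.log(1.25 / delta)) / epsilon
    for t in range(1, T + 1):
        grad = np.mean([grad_fn(theta, ex) for ex in data], axis=0)
        noisy = grad + rng.normal(0.0, sigma, size=dim)
        theta = theta - (radius / (L * np.sqrt(t))) * noisy  # decaying step
        nrm = np.linalg.norm(theta)
        if nrm > radius:  # project back onto the feasible l2 ball
            theta *= radius / nrm
    return theta

# Toy usage with a hinge-loss gradient (the SVM case the abstract
# mentions); features are normalized so the 1-Lipschitz bound holds.
def hinge_grad(theta, ex):
    x, y = ex
    return -y * x if y * np.dot(theta, x) < 1.0 else np.zeros_like(theta)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 5))
    X /= np.linalg.norm(X, axis=1, keepdims=True)  # ||x|| = 1, so L = 1
    y = np.sign(X @ (np.ones(5) / np.sqrt(5)))
    theta = dp_gd_erm(list(zip(X, y)), hinge_grad, dim=5, radius=1.0,
                      L=1.0, epsilon=1.0, delta=1e-5)
    print("private ERM solution:", theta)
```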