• DocumentCode
    358905
  • Title
    Stochastic approximation for global random optimization
  • Author
    Maryak, John L.; Chin, Daniel C.

  • Author_Institution
    Appl. Phys. Lab., Johns Hopkins Univ., Laurel, MD, USA
  • Volume
    5
  • fYear
    2000
  • fDate
    2000
  • Firstpage
    3294
  • Abstract
    A desire with iterative optimization techniques is that the algorithm reach the global optimum rather than get stranded at a local optimum. One method used to try to ensure global convergence is the injection of extra noise terms into the recursion, which may allow the algorithm to escape local optima. The amplitude of the injected noise is decreased over time (a process called “annealing”) so that the algorithm can finally converge once it reaches the global optimum. In this context, we examine a “gradient-free” stochastic approximation algorithm called SPSA, which has performed well in complex optimization problems. We discuss conditions under which SPSA will converge globally using injected noise. In a separate section, we show that, under different conditions, “basic” SPSA (i.e., without injected noise) can achieve a standard type of convergence to a global optimum. The discussion is supported by a numerical study.
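    The recursion described in the abstract can be sketched as follows. This is a minimal illustration, not code from the paper: the function name `spsa_injected_noise`, the gain sequences, and the noise-annealing schedule `b/(k+1)` are all assumptions chosen for demonstration.

    ```python
    import numpy as np

    def spsa_injected_noise(loss, theta0, n_iter=2000, a=0.1, c=0.1, b=0.05,
                            alpha=0.602, gamma=0.101, seed=0):
        """Sketch of SPSA with annealed injected noise.

        Gains a_k, c_k and the annealing schedule b_k are illustrative
        assumptions, not the conditions analyzed in the paper.
        """
        rng = np.random.default_rng(seed)
        theta = np.asarray(theta0, dtype=float)
        for k in range(n_iter):
            ak = a / (k + 1) ** alpha        # step-size gain, decreasing
            ck = c / (k + 1) ** gamma        # perturbation gain, decreasing
            bk = b / (k + 1)                 # injected-noise amplitude, annealed to zero
            # Bernoulli +/-1 simultaneous perturbation of all components
            delta = rng.choice([-1.0, 1.0], size=theta.shape)
            # "Gradient-free" estimate: only two loss evaluations per iteration
            g_hat = (loss(theta + ck * delta) - loss(theta - ck * delta)) / (2.0 * ck * delta)
            # Basic SPSA step plus injected noise for escaping local optima
            theta = theta - ak * g_hat + bk * rng.standard_normal(theta.shape)
        return theta
    ```

    Setting `b=0` reduces this to “basic” SPSA without injected noise, the variant treated in the paper's separate section on standard global convergence.
    
    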
  • Keywords
    approximation theory; convergence; iterative methods; optimisation; SPSA; annealing; complex optimization problems; global optimum; global random optimization; injected noise; iterative optimization techniques; stochastic approximation; Annealing; Approximation algorithms; Convergence; Iterative algorithms; Laboratories; Loss measurement; Noise level; Physics; Stochastic processes; Stochastic resonance
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the 2000 American Control Conference
  • Conference_Location
    Chicago, IL
  • ISSN
    0743-1619
  • Print_ISBN
    0-7803-5519-9
  • Type
    conf

  • DOI
    10.1109/ACC.2000.879174
  • Filename
    879174