Title of article
Asymptotic formulas for the derivatives of probability functions and their Monte Carlo estimations
Author/Authors
Josselin Garnier, author; Abdennebi Omrane, author; Youssef Rouchdy, author
Issue Information
Journal issue, serial year 2009
Pages
11
From page
848
To page
858
Abstract
One of the key problems in chance constrained programming for nonlinear optimization problems is the evaluation of derivatives of joint probability functions of the form $P(x) = \mathbb{P}\left(g_p(x,\Lambda) \leq c_p,\ p=1,\dots,N_c\right)$. Here $x \in \mathbb{R}^{N_x}$ is the vector of physical parameters, $\Lambda \in \mathbb{R}^{N_\Lambda}$ is a random vector describing the uncertainty of the model, $g : \mathbb{R}^{N_x} \times \mathbb{R}^{N_\Lambda} \to \mathbb{R}^{N_c}$ is the constraints mapping, and $c \in \mathbb{R}^{N_c}$ is the vector of constraint levels. In this paper, specific Monte Carlo tools for estimating the gradient and Hessian of $P(x)$ are proposed when the input random vector $\Lambda$ has a multivariate normal distribution and small variances. Using the small-variance hypothesis, approximate expressions for the first- and second-order derivatives are obtained, whose Monte Carlo estimations have low computational costs. The number of calls of the constraints mapping $g$ for the proposed estimators of the gradient and Hessian of $P(x)$ is only $1 + 2N_x + 2N_\Lambda$.
These tools are implemented in penalized optimization routines adapted to stochastic optimization, and are shown to reduce the computational cost of chance constrained programming substantially.
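Illustrative sketch (not from the paper): the abstract does not give the asymptotic formulas themselves, so the Python sketch below only illustrates the quantities being estimated. It computes a crude Monte Carlo estimate of $P(x)$ for a user-supplied constraints mapping g (assumed vectorized over samples) and a baseline finite-difference gradient with common random numbers; the paper's proposed estimators are far cheaper, requiring only $1 + 2N_x + 2N_\Lambda$ calls of g.

```python
import numpy as np

# Minimal sketch of the quantities in the abstract. The mapping g, the levels c,
# and the Gaussian parameters (mu, Sigma) below are illustrative placeholders;
# the gradient uses crude central finite differences, NOT the paper's estimator.

def probability_mc(x, g, c, mu, Sigma, n_samples=10_000, rng=None):
    """Crude Monte Carlo estimate of P(x) = P(g_p(x, Lambda) <= c_p, p = 1..Nc)."""
    rng = np.random.default_rng(rng)
    lam = rng.multivariate_normal(mu, Sigma, size=n_samples)   # Lambda ~ N(mu, Sigma)
    # g is assumed vectorized: g(x, lam) has shape (n_samples, Nc)
    feasible = np.all(g(x, lam) <= c, axis=1)
    return feasible.mean()

def gradient_fd(x, g, c, mu, Sigma, h=1e-2, n_samples=10_000, seed=0):
    """Baseline finite-difference gradient of the Monte Carlo estimate of P(x),
    reusing the same seed (common random numbers) to reduce variance."""
    grad = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        p_plus = probability_mc(x + e, g, c, mu, Sigma, n_samples, rng=seed)
        p_minus = probability_mc(x - e, g, c, mu, Sigma, n_samples, rng=seed)
        grad[i] = (p_plus - p_minus) / (2 * h)
    return grad

if __name__ == "__main__":
    # Toy example: a single constraint g(x, Lambda) = Lambda - x <= 0 in one dimension,
    # with a small-variance normal Lambda, as in the paper's setting.
    g = lambda x, lam: lam - x[0]                      # shape (n_samples, 1)
    c = np.array([0.0])
    mu, Sigma = np.array([0.0]), np.array([[0.01]])    # small variance
    x = np.array([0.1])
    print(probability_mc(x, g, c, mu, Sigma, n_samples=50_000, rng=1))
    print(gradient_fd(x, g, c, mu, Sigma))
```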
Keywords
Monte Carlo methods , Optimization with constraints , Random constraints , Stochastic programming , Applied probability
Journal title
European Journal of Operational Research
Serial Year
2009
Record number
1313954
Link To Document