Sample Complexity of Sample Average Approximation for Conditional Stochastic Optimization

05/28/2019
by   Yifan Hu, et al.

In this paper, we study a class of stochastic optimization problems, referred to as Conditional Stochastic Optimization (CSO), in the form min_{x ∈ X} E_ξ[f_ξ(E_{η|ξ}[g_η(x, ξ)])]. CSO finds a wide spectrum of applications, including portfolio selection, reinforcement learning, and robust and invariant learning. We establish the sample complexity of the sample average approximation (SAA) for CSO under a variety of structural assumptions, such as Lipschitz continuity, smoothness, and error bound conditions. We show that the total sample complexity improves from O(d/ϵ^4) to O(d/ϵ^3) when assuming smoothness of the outer function, and further to O(1/ϵ^2) when the empirical function satisfies the quadratic growth condition. We also establish the sample complexity of a modified SAA when ξ and η are independent. Our numerical results from several experiments further support our theoretical findings.

Keywords: stochastic optimization, sample average approximation, large deviations theory
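The nested SAA estimator described in the abstract replaces the outer expectation over ξ with an average over n samples ξ_i, and each conditional inner expectation with an average over m samples η_{ij} drawn from the conditional distribution given ξ_i. The sketch below illustrates this on a hypothetical toy instance (the functions f, g and the distributions are chosen for illustration only and do not come from the paper): f_ξ(y) = y², g_η(x, ξ) = x − ξ + η, with ξ ~ N(0, 1) and η | ξ ~ N(0, 0.5), whose population minimizer is x* = 0.

```python
import random

random.seed(0)

# Toy CSO instance (hypothetical, for illustration only):
#   outer function  f_xi(y)      = y ** 2
#   inner function  g_eta(x, xi) = x - xi + eta
#   xi ~ N(0, 1),  eta | xi ~ N(0, 0.5)
n, m = 200, 50                                   # outer / inner sample sizes
xi = [random.gauss(0.0, 1.0) for _ in range(n)]
eta = [[random.gauss(0.0, 0.5) for _ in range(m)] for _ in range(n)]

def saa_objective(x):
    """Nested SAA: (1/n) sum_i f( (1/m) sum_j g(x, xi_i; eta_ij) )."""
    total = 0.0
    for i in range(n):
        inner = sum(x - xi[i] + e for e in eta[i]) / m   # inner conditional average
        total += inner ** 2                              # outer function applied to it
    return total / n

# Minimize the empirical objective over a grid; in practice one would use
# a convex solver, since the SAA objective here is a smooth quadratic in x.
grid = [k / 100.0 for k in range(-200, 201)]
x_hat = min(grid, key=saa_objective)
```

Note that the inner average sits inside the (nonlinear) outer function, which is what makes the plain SAA estimator biased for finite m and drives the m-dependence in the sample complexity bounds the paper studies.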
