Published online by Cambridge University Press: 20 March 2018
In this paper we investigate gradient estimation for a class of contracting stochastic systems on a continuous state space. We find conditions on the one-step transitions, namely differentiability and contraction in a Wasserstein distance, that guarantee differentiability of stationary costs. Then we show how to estimate the derivatives, deriving an estimator that can be seen as a generalization of the forward sensitivity analysis method used in deterministic systems. We apply the results to examples, including a neural network model.
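The abstract does not spell out the estimator, but the forward sensitivity idea it refers to can be sketched for a simple contracting recursion. Everything concrete below is an illustrative assumption rather than the paper's construction: the contracting map `f`, the quadratic cost, the scalar parameter `theta`, and the sample sizes are all hypothetical choices used only to show the pattern of propagating a sensitivity alongside the state.

```python
# A minimal sketch, assuming a scalar recursion X_{k+1} = f(X_k, theta, xi_{k+1})
# that is contracting in x, with a differentiable cost c.  The stationary-cost
# derivative is estimated by running the sensitivity recursion
#   D_{k+1} = f_x(X_k) * D_k + f_theta(X_k)
# in parallel with the state (forward sensitivity / IPA-style estimation).
import numpy as np

rng = np.random.default_rng(0)

def f(x, theta, xi):
    """One-step transition; contracting in x since |a| < 1 and |tanh'| <= 1."""
    a = 0.5
    return a * np.tanh(x) + theta + 0.1 * xi

def df_dx(x, theta, xi):
    a = 0.5
    return a * (1.0 - np.tanh(x) ** 2)

def df_dtheta(x, theta, xi):
    return 1.0

def cost(x):
    return x ** 2

def dcost_dx(x):
    return 2.0 * x

def forward_sensitivity_gradient(theta, n_burn=1_000, n_samples=100_000):
    """Estimate the stationary expected cost and its derivative in theta."""
    x, d = 0.0, 0.0          # state X_k and sensitivity D_k = dX_k/dtheta
    value, grad = 0.0, 0.0
    for k in range(n_burn + n_samples):
        xi = rng.standard_normal()
        # Chain rule: update the sensitivity using the pre-update state X_k.
        d = df_dx(x, theta, xi) * d + df_dtheta(x, theta, xi)
        x = f(x, theta, xi)
        if k >= n_burn:       # discard burn-in before averaging
            value += cost(x)
            grad += dcost_dx(x) * d
    return value / n_samples, grad / n_samples

val, grad = forward_sensitivity_gradient(theta=0.2)
print(f"stationary cost ~ {val:.4f}, derivative ~ {grad:.4f}")
```

Under the contraction assumption the sensitivity `d` stays stochastically bounded, which is what makes the long-run average of `dcost_dx(x) * d` a sensible estimate of the derivative of the stationary cost; without contraction the recursion for `d` could blow up.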