In this paper we investigate the stability of non-stationary stochastic processes that arise typically in control applications. The setting is that of stochastic recursive sequences, which allows us to construct, on a single probability space, stochastic processes corresponding to different initial states and even to different control policies; it requires no Markovian assumptions. A natural stability criterion for such processes is that the influence of the initial state disappear after some finite time; in other words, starting from different initial states, the process couples after some finite time with the same limiting (not necessarily stationary or ergodic) stochastic process. We investigate this as well as other types of coupling, and present conditions under which they occur uniformly over some class of control policies. We then use the coupling results to establish new theoretical aspects of the theory of non-Markovian control.
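For concreteness, a minimal sketch of the setting in standard notation (the recursion map $f$, the driving sequence $(\xi_n)$, and the coupling time $\tau$ below are illustrative notation, not taken from the paper): two copies of the process are driven by the same randomness on one probability space, differing only in their initial states $x$ and $y$,
\[
  X_{n+1}^{x} = f\bigl(X_{n}^{x}, \xi_{n}\bigr), \quad X_{0}^{x} = x,
  \qquad
  X_{n+1}^{y} = f\bigl(X_{n}^{y}, \xi_{n}\bigr), \quad X_{0}^{y} = y,
\]
and the stability criterion described above corresponds to the almost-sure finiteness of the coupling time
\[
  \tau = \inf\bigl\{ n \ge 0 : X_{m}^{x} = X_{m}^{y} \ \text{for all } m \ge n \bigr\},
  \qquad \mathbb{P}(\tau < \infty) = 1 .
\]
After $\tau$, the two trajectories coincide, so the influence of the initial state has disappeared; no Markov property of $(X_n)$ is assumed.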