Published online by Cambridge University Press: 02 August 2011
The classical nonstationary autoregressive models are both linear and Markov. They include unit root and cointegration models. A possible nonlinear extension is to relax linearity while keeping general properties such as nonstationarity and the Markov property. A null recurrent Markov chain is nonstationary, and β-null recurrence is of vital importance for statistical inference in nonstationary Markov models, for example in nonparametric estimation in nonlinear cointegration within the Markov framework. The standard random walk is an example of a null recurrent Markov chain.
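As a rough numerical illustration of the last point (not taken from the paper), the following Python sketch simulates a symmetric simple random walk and records the times between successive returns to the origin. The walk keeps coming back to zero, yet the sample mean of the return times grows with the simulation horizon, in line with the expected return time being infinite, which is the defining feature of null recurrence. The function name and parameters are hypothetical choices for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def return_times_simple_walk(n_steps=1_000_000):
    """Simulate a symmetric simple random walk on the integers and
    return the waiting times between successive visits to the origin."""
    steps = rng.choice([-1, 1], size=n_steps)
    path = np.cumsum(steps)
    # time points at which the walk is back at 0
    return_idx = np.flatnonzero(path == 0) + 1
    return np.diff(np.concatenate(([0], return_idx)))

times = return_times_simple_walk()
# The walk returns to 0 again and again (recurrence), but the sample mean
# of the return times keeps growing as n_steps increases, consistent with
# the expected return time being infinite (null recurrence).
print(len(times), times.mean())
```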
In this paper we suggest that the concept of null recurrence is an appropriate nonlinear generalization of the linear unit root concept and, as such, may serve as a starting point for a nonlinear cointegration concept within the Markov framework. In fact, we establish the link between null recurrent processes and autoregressive unit root models. It turns out that null recurrence is closely related to the location of the roots of the characteristic polynomial of the state space matrix and the associated eigenvectors. Roughly speaking, the process is β-null recurrent if one root lies on the unit circle and null recurrent if two distinct roots lie on the unit circle, with the remaining roots inside the unit circle in both cases; it is transient if more than two roots lie on the unit circle. These results are closely connected to the random walk being null recurrent in one and two dimensions but transient in three dimensions. We also give an example of a process that, by appropriate adjustments, can be made β-null recurrent for any β ∈ (0, 1) and can also be made null recurrent without being β-null recurrent.
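The following Python sketch (an illustration under our own assumptions, not code from the paper) makes the root condition concrete: it forms the companion (state space) matrix of an AR(p) model, computes its eigenvalues, and prints their moduli for three toy examples matching the three cases described above. The function name `companion_roots` and the particular AR coefficients are hypothetical and chosen only for illustration.

```python
import numpy as np

def companion_roots(ar_coeffs):
    """Eigenvalues of the companion (state-space) matrix of the AR(p) model
    x_t = a1*x_{t-1} + ... + ap*x_{t-p} + e_t, i.e. the roots of the
    characteristic polynomial z^p - a1*z^{p-1} - ... - ap."""
    p = len(ar_coeffs)
    C = np.zeros((p, p))
    C[0, :] = ar_coeffs          # first row holds the AR coefficients
    C[1:, :-1] = np.eye(p - 1)   # subdiagonal identity shifts the state
    return np.linalg.eigvals(C)

# One root on the unit circle (random walk x_t = x_{t-1} + e_t):
# the beta-null recurrent case described in the paper.
print(np.abs(companion_roots([1.0])))
# Two distinct unit roots (+1 and -1) from x_t = x_{t-2} + e_t:
# the null recurrent case.
print(np.abs(companion_roots([0.0, 1.0])))
# Three roots on the unit circle from x_t = x_{t-3} + e_t:
# more than two unit roots, the transient case.
print(np.abs(companion_roots([0.0, 0.0, 1.0])))
```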
We are very grateful to an anonymous referee for two extremely careful and detailed reviews of earlier versions of the paper. These reports have led to very substantial improvements. We also thank the chief editor, Peter Phillips, for a number of comments that have improved our presentation by putting our results in better perspective.