Book contents
- Frontmatter
- Contents
- List of contributors
- Preface
- 1 Automatic code generation for real-time convex optimization
- 2 Gradient-based algorithms with applications to signal-recovery problems
- 3 Graphical models of autoregressive processes
- 4 SDP relaxation of homogeneous quadratic optimization: approximation bounds and applications
- 5 Probabilistic analysis of semidefinite relaxation detectors for multiple-input, multiple-output systems
- 6 Semidefinite programming, matrix decomposition, and radar code design
- 7 Convex analysis for non-negative blind source separation with application in imaging
- 8 Optimization techniques in modern sampling theory
- 9 Robust broadband adaptive beamforming using convex optimization
- 10 Cooperative distributed multi-agent optimization
- 11 Competitive optimization of cognitive radio MIMO systems via game theory
- 12 Nash equilibria: the variational approach
- Afterword
- Index
3 - Graphical models of autoregressive processes
Published online by Cambridge University Press: 23 February 2011
Summary
We consider the problem of fitting a Gaussian autoregressive model to a time series, subject to conditional independence constraints. This is an extension of the classical covariance selection problem to time series. The conditional independence constraints impose a sparsity pattern on the inverse of the spectral density matrix, and result in nonconvex quadratic equality constraints in the maximum-likelihood formulation of the model estimation problem. We present a semidefinite relaxation, and prove that the relaxation is exact when the sample covariance matrix is block-Toeplitz. We also give experimental results suggesting that the relaxation is often exact even when the sample covariance matrix is not block-Toeplitz. In combination with model selection criteria, the estimation method can be used for topology selection. Experiments with randomly generated data and several real data sets are also included.
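As a rough numerical illustration of the sparsity pattern mentioned above (a minimal sketch with hypothetical numbers, not taken from the chapter): for a vector autoregressive model x(t) + A_1 x(t-1) + … + A_p x(t-p) = w(t) with w(t) ~ N(0, Σ), the inverse spectral density is, up to a constant factor, S(ω)^{-1} = A(e^{iω})^H Σ^{-1} A(e^{iω}) with A(z) = I + A_1 z^{-1} + … + A_p z^{-p}, and conditional independence of two components of the process corresponds to the matching entry of S(ω)^{-1} vanishing at every frequency. The AR(1) coefficients below are chosen only so that entry (1,3) vanishes.

```python
import numpy as np

def inverse_spectral_density(A_coefs, Sigma, omegas):
    """Inverse spectral density (up to a constant factor) of the AR model
    x(t) + A_1 x(t-1) + ... + A_p x(t-p) = w(t), w(t) ~ N(0, Sigma):
    S(w)^{-1} = A(e^{iw})^H Sigma^{-1} A(e^{iw}), A(z) = I + sum_l A_l z^{-l}."""
    n = Sigma.shape[0]
    Sigma_inv = np.linalg.inv(Sigma)
    out = np.empty((len(omegas), n, n), dtype=complex)
    for k, w in enumerate(omegas):
        Aw = np.eye(n, dtype=complex)
        for ell, A in enumerate(A_coefs, start=1):
            Aw += A * np.exp(-1j * w * ell)
        out[k] = Aw.conj().T @ Sigma_inv @ Aw
    return out

# Hypothetical 3-dimensional AR(1) example in which components 1 and 3 are
# conditionally independent, so entry (1,3) of S(w)^{-1} is zero at all w.
A1 = np.array([[0.4, 0.2, 0.0],
               [0.1, 0.3, 0.0],
               [0.0, 0.2, 0.5]])
Sigma = np.diag([1.0, 0.5, 2.0])
omegas = np.linspace(0.0, np.pi, 64)
S_inv = inverse_spectral_density([A1], Sigma, omegas)
print(np.max(np.abs(S_inv[:, 0, 2])))  # ~0: no edge between nodes 1 and 3
```

The estimation problem treated in the chapter goes in the opposite direction: given data, fit the model parameters subject to prescribed zeros of this kind, which is what leads to the nonconvex quadratic equality constraints and the semidefinite relaxation.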
Introduction
Graphical models give a graph representation of relations between random variables. The simplest example is a Gaussian graphical model, in which an undirected graph with n nodes is used to describe conditional independence relations between the components of an n-dimensional random variable x ~ N(0, Σ). The absence of an edge between two nodes of the graph indicates that the corresponding components of x are independent, conditional on the other components. Other common examples of graphical models include contingency tables, which describe conditional independence relations in multinomial distributions, and Bayesian networks, which use directed acyclic graphs to represent causal or temporal relations.
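To make the Gaussian case concrete, here is a minimal sketch (hypothetical numbers, not from the chapter): components i and j of x ~ N(0, Σ) are conditionally independent given the remaining components exactly when (Σ^{-1})_{ij} = 0, so the missing edges of the graph are the zeros of the precision matrix.

```python
import numpy as np

# Chain graph 1 - 2 - 3: nodes 1 and 3 are not adjacent, so the (1,3)
# entry of the precision matrix K = Sigma^{-1} is zero.
K = np.array([[ 2.0, -0.8,  0.0],
              [-0.8,  2.0, -0.8],
              [ 0.0, -0.8,  2.0]])
Sigma = np.linalg.inv(K)                  # the covariance itself is dense ...
print(np.round(Sigma, 3))
print(np.round(np.linalg.inv(Sigma), 3))  # ... but its inverse recovers the zero pattern
```

Covariance selection, and the time-series extension studied in this chapter, amounts to estimating Σ (or the spectral density) under such prescribed zeros in its inverse.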
- Type: Chapter
- In: Convex Optimization in Signal Processing and Communications, pp. 89-116
- Publisher: Cambridge University Press
- Print publication year: 2009