This chapter surveys recent work on applying ideas from graphical models and message-passing algorithms to large-scale regularized regression problems. In particular, the focus is on compressed sensing reconstruction via ℓ1-penalized least squares (known as the LASSO or BPDN). We discuss how to derive fast approximate message passing algorithms to solve this problem. Surprisingly, the analysis of such algorithms allows one to prove exact high-dimensional limit results for the LASSO risk.
Introduction
The problem of reconstructing a high-dimensional vector x ∈ ℝn from a collection of observations y ∈ ℝm arises in a number of contexts, ranging from statistical learning to signal processing. It is often assumed that the measurement process is approximately linear, i.e. that

y = A x + w,

where A ∈ ℝm×n is a known measurement matrix and w is a noise vector.
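As a minimal numerical sketch of this measurement model (the Gaussian matrix, the sparsity level, and all dimensions below are illustrative choices, not prescribed by the text):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 200, 80, 10            # signal dimension, number of measurements, sparsity

# A k-sparse signal x in R^n: k nonzero entries on a random support
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)

# Measurement matrix A in R^{m x n} and a small noise vector w in R^m
A = rng.standard_normal((m, n)) / np.sqrt(m)
w = 0.01 * rng.standard_normal(m)

# Linear observations: y = A x + w
y = A @ x + w
```

Note that m < n here: the system is underdetermined, which is precisely the regime where prior information such as sparsity becomes essential for reconstruction.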
The graphical models approach to such a reconstruction problem postulates a joint probability distribution on (x, y) which takes, without loss of generality, the form

p(dx, dy) = p(dy | x) p(dx).
The conditional distribution p(dy | x) models the noise process, while the prior p(dx) encodes information on the vector x; in particular, within compressed sensing, the prior can describe the sparsity properties of x. Within a graphical models approach, either of these distributions (or both) factorizes according to a specific graph structure. The resulting posterior distribution p(dx | y) is used for inferring x given y.
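To make the link to the LASSO concrete: with Gaussian noise and a Laplace (double-exponential) prior, the negative log-posterior is, up to an additive constant and an overall scaling, exactly the ℓ1-penalized least-squares cost, so the LASSO estimate is the MAP estimate under this model. The sketch below assumes this standard Gaussian/Laplace pairing and a particular parametrization of the prior; both are illustrative choices, not taken from the text.

```python
import numpy as np

def neg_log_posterior(x, y, A, sigma2, lam):
    """-log p(x | y) up to an additive constant, assuming
    Gaussian noise   p(y|x) ∝ exp(-||y - Ax||^2 / (2 sigma2))
    Laplace prior    p(x)   ∝ exp(-(lam / sigma2) ||x||_1)
    (an illustrative parametrization of the prior)."""
    residual = np.sum((y - A @ x) ** 2) / (2.0 * sigma2)
    prior = lam * np.sum(np.abs(x)) / sigma2
    return residual + prior

def lasso_objective(x, y, A, lam):
    """The LASSO cost: (1/2) ||y - Ax||^2 + lam ||x||_1."""
    return 0.5 * np.sum((y - A @ x) ** 2) + lam * np.sum(np.abs(x))
```

With this parametrization, neg_log_posterior equals lasso_objective divided by sigma2, so the two cost functions share the same minimizer: maximizing the posterior is the same as solving the LASSO.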
There are many reasons to be skeptical about the idea that the joint probability distribution p(dx, dy) can be determined and then used for reconstructing x.