Minimum Variance Estimation of a Sparse Vector Within the Linear Gaussian Model: An RKHS Approach

Executive Summary

The authors consider minimum variance estimation within the Sparse Linear Gaussian Model (SLGM). A sparse vector is to be estimated from a linearly transformed version embedded in Gaussian noise. Their analysis is based on the theory of Reproducing Kernel Hilbert Spaces (RKHS). After a characterization of the RKHS associated with the SLGM, they derive novel lower bounds on the minimum variance achievable by estimators with a prescribed bias function. This includes the important case of unbiased estimation. The variance bounds are obtained via an orthogonal projection of the prescribed mean function onto a subspace of the RKHS associated with the SLGM. Furthermore, they specialize their bounds to compressed sensing measurement matrices and express them in terms of the restricted isometry and coherence parameters.
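The measurement model underlying the SLGM can be sketched as follows; the symbols (measurement matrix H, sparsity level s, noise variance sigma^2) are standard notation assumed here, not taken from the summary above:

```latex
% Sparse Linear Gaussian Model (SLGM), standard formulation (notation assumed):
%   observe  y = H x + n,
% where
%   x \in \mathbb{R}^N  is the unknown vector with at most s nonzero entries,
%       i.e. \|x\|_0 \le s,
%   H \in \mathbb{R}^{M \times N}  is a known measurement matrix
%       (e.g. a compressed sensing matrix with M < N),
%   n \sim \mathcal{N}(0, \sigma^2 I_M)  is white Gaussian noise.
\begin{align}
  y &= Hx + n, \qquad \|x\|_0 \le s, \qquad n \sim \mathcal{N}(0,\sigma^2 I_M).
\end{align}
```

Within this model, the bounds described above lower-bound the variance of any estimator of x whose bias function is fixed in advance (the unbiased case corresponding to a zero bias function).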
