Minimum Variance Estimation of a Sparse Vector Within the Linear Gaussian Model: An RKHS Approach

The authors consider minimum variance estimation within the sparse linear Gaussian model (SLGM), in which a sparse vector is to be estimated from a linearly transformed version embedded in Gaussian noise. Their analysis is based on the theory of reproducing kernel Hilbert spaces (RKHS). After characterizing the RKHS associated with the SLGM, they derive novel lower bounds on the minimum variance achievable by estimators with a prescribed bias function, including the important case of unbiased estimation. The variance bounds are obtained via an orthogonal projection of the prescribed mean function onto a subspace of this RKHS. Furthermore, they specialize their bounds to compressed sensing measurement matrices and express them in terms of the restricted isometry and coherence parameters.
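
For orientation, a schematic sketch of the setting follows; the notation is illustrative and not taken verbatim from the paper. In the SLGM, an observation $\mathbf{y} \in \mathbb{R}^M$ is modeled as

\[ \mathbf{y} = \mathbf{H}\mathbf{x} + \mathbf{n}, \qquad \mathbf{n} \sim \mathcal{N}(\mathbf{0}, \sigma^2 \mathbf{I}_M), \qquad \mathbf{x} \in \mathcal{X}_S \triangleq \{\mathbf{x} \in \mathbb{R}^N : \|\mathbf{x}\|_0 \le S\}, \]

where $\mathbf{H}$ is the known measurement matrix and $S$ is the sparsity degree. The projection-based RKHS bound alluded to above has, schematically, the form

\[ v(\hat{g};\mathbf{x}_0) \;\ge\; \|\mathsf{P}_{\mathcal{U}}\,\gamma\|_{\mathcal{H}}^2 \;-\; \gamma^2(\mathbf{x}_0), \]

where $\gamma$ is the prescribed mean function of the estimator, $\mathcal{H}$ is the RKHS associated with the SLGM, $\mathcal{U} \subseteq \mathcal{H}$ is a subspace, and $\mathsf{P}_{\mathcal{U}}$ denotes orthogonal projection onto $\mathcal{U}$. The compressed sensing parameters used to specialize the bounds are the coherence of $\mathbf{H}$ (with columns $\mathbf{h}_k$),

\[ \mu(\mathbf{H}) = \max_{k \neq l} \frac{|\mathbf{h}_k^{T}\mathbf{h}_l|}{\|\mathbf{h}_k\|_2\,\|\mathbf{h}_l\|_2}, \]

and the restricted isometry constant $\delta_S$, i.e., the smallest constant satisfying $(1-\delta_S)\|\mathbf{x}\|_2^2 \le \|\mathbf{H}\mathbf{x}\|_2^2 \le (1+\delta_S)\|\mathbf{x}\|_2^2$ for all $\mathbf{x} \in \mathcal{X}_S$.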
