The Role of Subspace Swap in Maximum Likelihood Estimation Performance Breakdown
It has long been known that, under certain "threshold" conditions, maximum likelihood estimation (MLE) techniques experience a "performance breakdown": below a threshold signal-to-noise ratio (SNR), the observed estimation errors depart rapidly from the Cramér-Rao bound (CRB), and the estimator produces severely erroneous estimates ("outliers") inconsistent with CRB predictions. Rather than relying on the classical asymptotic analysis to predict that threshold, a Random Matrix Theory (RMT) analysis is employed. Both the analytic predictions and direct Monte Carlo simulations demonstrate that the threshold SNR can be reliably predicted even for small sample support, far removed from the classical asymptotic assumptions.
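The threshold effect described above can be reproduced with a minimal Monte Carlo experiment. The sketch below (an illustration, not the paper's own simulation setup) estimates the frequency of a single complex tone in white Gaussian noise; the ML estimate is the periodogram peak, evaluated on a zero-padded FFT grid. Above the threshold SNR the mean-square error tracks the CRB, while below it outliers from spurious noise peaks dominate and the MSE departs from the bound by orders of magnitude. All parameter values (N, trial count, frequencies) are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64                         # samples per snapshot
PAD = 32                       # zero-padding factor for a fine frequency grid
TRIALS = 200
omega0 = 2 * np.pi * 0.1234    # true frequency (rad/sample), off the FFT grid
n = np.arange(N)

def mle_freq(x):
    # ML estimate for a single complex tone in white noise = periodogram peak
    spec = np.abs(np.fft.fft(x, PAD * N)) ** 2
    return 2 * np.pi * np.argmax(spec) / (PAD * N)

def mse_vs_crb(snr_db):
    snr = 10 ** (snr_db / 10)
    A = np.sqrt(snr)           # amplitude for unit-variance complex noise
    errs = []
    for _ in range(TRIALS):
        noise = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
        x = A * np.exp(1j * omega0 * n) + noise
        e = mle_freq(x) - omega0
        e = (e + np.pi) % (2 * np.pi) - np.pi   # wrap error to [-pi, pi)
        errs.append(e ** 2)
    mse = np.mean(errs)
    crb = 6 / (snr * N * (N ** 2 - 1))          # CRB on frequency, rad^2/sample^2
    return mse, crb

for snr_db in (-12, 0, 12):
    mse, crb = mse_vs_crb(snr_db)
    print(f"SNR {snr_db:+3d} dB: MSE/CRB = {mse / crb:10.1f}")
```

Sweeping the SNR more finely exposes a sharp knee in the MSE/CRB ratio at the threshold, which is the quantity the RMT analysis aims to predict without resorting to exhaustive simulation.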