Date Added: Sep 2011
Interpolation of nonuniformly sampled signals in the presence of noise is a challenging and extensively studied problem. On the one hand, classical approaches such as the Wiener filter use the second-order statistics of the signal, and hence its spectrum, as a priori knowledge for finding the solution. On the other hand, Support Vector Machines (SVM) with Gaussian and sinc Mercer kernels have previously been proposed for time series interpolation, with good properties in terms of regularization and sparseness. In this paper, the authors therefore propose to use SVM-based algorithms whose kernels have spectra adapted to the signal spectrum, and to analyze their suitability for nonuniform interpolation. For this purpose, they investigate the performance of the SVM with autocorrelation kernels on one-dimensional time series.
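As a rough illustration of the kind of SVM-based interpolation the abstract describes (not the authors' implementation), the sketch below fits scikit-learn's SVR with a sinc Mercer kernel to a noisy, nonuniformly sampled sine wave and then evaluates it on a uniform grid. The test signal, noise level, kernel bandwidth `B0`, and SVR hyperparameters are all illustrative assumptions; a spectrum-adapted (e.g. autocorrelation) kernel would replace `sinc_kernel` while keeping the same structure.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Nonuniformly sampled, noisy test signal (hypothetical example)
t = np.sort(rng.uniform(0.0, 4.0, 40))                 # nonuniform sample times
y = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(t.size)

def sinc_kernel(A, B, B0=4.0):
    """Sinc Mercer kernel k(t, s) = sinc(B0 * (t - s)).

    Its spectrum is a rectangle of width B0 (here an assumed bandwidth
    covering the 1 Hz test tone), so it acts as an ideal low-pass prior.
    """
    diff = A[:, 0][:, None] - B[:, 0][None, :]
    return np.sinc(B0 * diff)

# Epsilon-insensitive SVR with the custom kernel (hyperparameters are guesses)
svr = SVR(kernel=sinc_kernel, C=10.0, epsilon=0.05)
svr.fit(t.reshape(-1, 1), y)

# Interpolate the signal onto a uniform grid
t_uniform = np.linspace(0.0, 4.0, 200).reshape(-1, 1)
y_hat = svr.predict(t_uniform)
```

The sparseness the abstract mentions shows up here as the support vectors: only samples whose residual exceeds `epsilon` contribute to the interpolant.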