Information Theoretic Limits On Learning Stochastic Differential Equations

This paper considers the problem of learning the drift coefficient of a stochastic differential equation from a sample path. The authors assume that the drift is parametrized by a high-dimensional vector and ask how long the system must be observed in order to learn this parameter vector. They prove a general lower bound on this time complexity using a characterization, due to Kadota, Zakai, and Ziv, of mutual information as the time integral of a conditional variance. This general lower bound is then applied to specific classes of linear and non-linear stochastic differential equations. In the linear case, the problem is that of learning a matrix of interaction coefficients.
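The linear case described above can be illustrated with a small numerical sketch. The model, dimensions, parameter values, and the least-squares estimator below are all illustrative assumptions, not taken from the paper: we simulate a linear SDE dX_t = A X_t dt + dB_t with a hypothetical stable matrix A, and recover A by regressing the increments of the path on the observed states.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a p-dimensional linear SDE  dX_t = A X_t dt + dB_t,
# where A is the unknown matrix of interaction coefficients.
p = 5          # dimension (illustrative choice)
T = 200.0      # total observation time
dt = 0.01      # discretization step
n = int(T / dt)

# A randomly drawn interaction matrix, shifted to be stable
# (eigenvalues with negative real part) so the path does not blow up.
A_true = -np.eye(p) + 0.3 * rng.standard_normal((p, p)) / np.sqrt(p)

# Simulate one sample path with the Euler-Maruyama scheme.
X = np.zeros((n + 1, p))
for t in range(n):
    dB = np.sqrt(dt) * rng.standard_normal(p)
    X[t + 1] = X[t] + A_true @ X[t] * dt + dB

# Least-squares estimate of A: regress the increments dX_t on the states X_t,
# i.e. solve  dX ≈ (X dt) A^T  in the least-squares sense.
dX = np.diff(X, axis=0)   # increments, shape (n, p)
Xs = X[:-1]               # states at the left endpoints, shape (n, p)
A_hat = np.linalg.lstsq(Xs * dt, dX, rcond=None)[0].T

rel_err = np.linalg.norm(A_hat - A_true) / np.linalg.norm(A_true)
print(f"relative estimation error: {rel_err:.3f}")
```

Shortening the observation time T in this sketch degrades the estimate, which is the regime the paper's lower bound quantifies: below a critical observation time, no estimator can recover the parameter vector reliably.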

Provided by: Stanford University | Topic: Big Data | Date Added: Mar 2011 | Format: PDF