In sensor management, the usefulness of information-theoretic measures appears to be validated by a large number of empirical studies, but the theoretical justification offered so far, both for the choice of measure and for the use of information-driven sensor management itself, remains inconclusive, conflicting, or debatable. In this paper, the authors suggest that information-driven sensor management may be justified on the basis of uncertainty reduction rather than information gain. They then show that, owing to well-known relationships between Shannon entropy, mutual information, and Kullback-Leibler (KL) divergence, using the KL divergence (a relative measure of information gain) for sensor management purposes is exactly the same as using the Shannon entropy (an absolute measure of uncertainty).
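The equivalence the abstract points to can be checked numerically: for any discrete joint distribution p(x, z), the expected KL divergence from the prior p(x) to the posterior p(x|z) equals the expected reduction in Shannon entropy, since both equal the mutual information I(X; Z). A minimal sketch in Python, using an illustrative joint distribution that is not taken from the paper:

```python
import numpy as np

# Illustrative 2x3 joint distribution p(x, z) for a discrete state X
# and measurement Z (values chosen for demonstration only).
p_xz = np.array([[0.20, 0.10, 0.05],
                 [0.05, 0.25, 0.35]])

p_x = p_xz.sum(axis=1)  # marginal p(x)
p_z = p_xz.sum(axis=0)  # marginal p(z)

def entropy(p):
    """Shannon entropy in nats, skipping zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def kl(p, q):
    """KL divergence KL(p || q) in nats, skipping zero entries of p."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# 1) Mutual information as expected uncertainty reduction:
#    I(X;Z) = H(X) - H(X|Z)
h_x_given_z = sum(p_z[j] * entropy(p_xz[:, j] / p_z[j])
                  for j in range(len(p_z)))
mi_entropy = entropy(p_x) - h_x_given_z

# 2) Mutual information as expected information gain:
#    I(X;Z) = E_Z[ KL( p(x|z) || p(x) ) ]
mi_kl = sum(p_z[j] * kl(p_xz[:, j] / p_z[j], p_x)
            for j in range(len(p_z)))

# The two quantities coincide, which is the identity behind the
# equivalence of the two sensor-management criteria.
assert abs(mi_entropy - mi_kl) < 1e-12
```

A sensor-management policy that maximizes expected KL divergence over candidate measurements therefore ranks them identically to one that minimizes expected posterior entropy.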