The authors show how the semantic relationships within an information-rich source can be exploited to achieve parsimonious communication between a pair of semantically aware nodes while preserving the quality of information. They extend the source coding theorem of classical information theory to encompass semantics in the source, and show that by exploiting semantic relations between source symbols, a higher rate of lossless compression can be achieved than with traditional syntactic compression methods. They define the capacity of a semantic source as the mutual information between its models and its syntactic messages, and show that it equals the average semantic entropy of its messages.
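As a rough illustration of the claim (not the paper's own construction), the sketch below builds a toy semantic source: hypothetical "world states" `w` play the role of the source's models, and `x` its syntactic messages. Computing the mutual information I(W; X) alongside the syntactic entropy H(X) shows that the semantic rate never exceeds the syntactic one, which is the room a semantics-aware code has to compress further. The distribution is invented for the example.

```python
import math

# Hypothetical joint distribution p(w, x) over two world states (models)
# and three syntactic messages -- illustrative numbers, not from the paper.
p_wx = {
    ("w0", "x0"): 0.4,
    ("w0", "x1"): 0.1,
    ("w1", "x1"): 0.1,
    ("w1", "x2"): 0.4,
}

def marginal(joint, idx):
    """Marginalize a joint distribution onto component `idx` of its keys."""
    m = {}
    for key, p in joint.items():
        m[key[idx]] = m.get(key[idx], 0.0) + p
    return m

def entropy(dist):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

p_w = marginal(p_wx, 0)          # distribution over models W
p_x = marginal(p_wx, 1)          # distribution over messages X

h_x = entropy(p_x)               # syntactic entropy H(X)
mi = entropy(p_w) + h_x - entropy(p_wx)   # I(W; X) = H(W) + H(X) - H(W, X)

print(f"syntactic entropy H(X) = {h_x:.3f} bits")
print(f"mutual info    I(W; X) = {mi:.3f} bits")
assert mi <= h_x + 1e-9  # semantic rate never exceeds syntactic entropy
```

With these numbers H(X) is about 1.52 bits while I(W; X) is 0.8 bits, so a code keyed to the model-message relation could, in principle, spend roughly half the bits of a purely syntactic one on this toy source.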
- Format: PDF
- Size: 417 KB