Networking

Stochastic Modeling of a Single TCP/IP Session Over a Random Loss Channel

Executive Summary

In this paper the authors present an analytical framework for modeling the performance of a single TCP session in the presence of random packet loss. The framework applies to communication channels whose random packet losses can be characterized by appropriate statistics of the inter-loss duration. The analytical model is shown to predict throughput for LANs and WANs (low and high bandwidth-delay products) with reasonable accuracy when measured against throughput obtained by simulation. Random loss is found to severely degrade network throughput, and higher-speed channels are found to be more vulnerable to random loss than slower channels, especially at moderate to high loss rates.
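The summary does not reproduce the paper's analytical model, but the scenario it describes can be illustrated with a minimal, assumed simulation: a single AIMD-style TCP session whose loss events have exponentially distributed inter-loss durations (a Poisson loss process). The function name `simulate_tcp_throughput` and the parameter values (RTT, MSS, the `max_cwnd` cap standing in for the bandwidth-delay product) are illustrative choices, not taken from the paper.

```python
import random

def simulate_tcp_throughput(loss_rate, rtt=0.1, mss=1460,
                            max_cwnd=64 * 1460, duration=300.0, seed=0):
    """Toy RTT-by-RTT simulation of one TCP session over a random-loss channel.

    Assumption for illustration only (not the paper's model): loss events
    arrive as a Poisson process, i.e. exponentially distributed inter-loss
    durations. Congestion avoidance adds one MSS per RTT up to max_cwnd
    (a crude stand-in for the bandwidth-delay product); each loss halves
    the window. Returns average goodput in bits per second.
    """
    rng = random.Random(seed)
    cwnd = mss
    next_loss = rng.expovariate(loss_rate) if loss_rate > 0 else float("inf")
    t, sent = 0.0, 0.0
    while t < duration:
        sent += cwnd                       # one window of data per RTT
        t += rtt
        if t >= next_loss:                 # random loss: multiplicative decrease
            cwnd = max(mss, cwnd / 2.0)
            next_loss = t + rng.expovariate(loss_rate)
        else:                              # loss-free RTT: additive increase
            cwnd = min(max_cwnd, cwnd + mss)
    return 8.0 * sent / t

if __name__ == "__main__":
    for rate in (0.0, 0.05, 0.5, 2.0):     # loss events per second
        mbps = simulate_tcp_throughput(rate) / 1e6
        print(f"loss rate {rate:4.2f}/s -> {mbps:6.2f} Mbit/s")
```

Raising `max_cwnd` (a higher bandwidth-delay product) increases the loss-free throughput but also widens the gap between loss-free and lossy runs, which is consistent with the summary's observation that faster channels suffer relatively more under moderate to high loss rates.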

  • Format: PDF
  • Size: 374.16 KB