The Heisenbot Uncertainty Problem: Challenges in Separating Bots From Chaff
Source: University of California
Underlying virtually all measurement endeavors is the premise that the signal being measured can be separated from any noise produced by the environment or the measurement system itself. In this paper the authors highlight a number of challenges that arise in using crawling to measure the size, topology, and dynamism of distributed botnets. These challenges include traffic from unrelated applications, address aliasing, and other active participants on the network such as poisoners. Drawing on their experience developing a crawler for the Storm botnet, the authors describe each of the issues they encountered in practice, their approach to managing the underlying ambiguity, and the kinds of error they believe each introduces into their estimates.
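To make the address-aliasing challenge concrete, the following is a minimal illustrative sketch (not the authors' method, and with entirely invented data) of why counting bots by observed IP address is unreliable: NAT can fold several bots behind one public address, while DHCP churn can spread one bot across several addresses over a crawl.

```python
# Hypothetical sketch of address aliasing in a crawl-based size estimate.
# All peer IDs and IPs below are invented for illustration.

observations = [
    # (peer ID reported in the overlay, public IP seen by the crawler)
    ("bot-a", "198.51.100.7"),
    ("bot-b", "198.51.100.7"),   # NAT: multiple bots share one public IP
    ("bot-c", "198.51.100.7"),
    ("bot-d", "203.0.113.5"),
    ("bot-d", "203.0.113.9"),    # DHCP churn: one bot seen at two IPs
]

ip_estimate = len({ip for _, ip in observations})   # naive IP-based count
id_estimate = len({pid for pid, _ in observations}) # identifier-based count

print(ip_estimate)  # 3 -- NAT undercounts despite DHCP inflation
print(id_estimate)  # 4 -- the true number of distinct bots here
```

Even this toy example shows the two aliasing effects pulling the IP-based estimate in opposite directions, which is part of why the underlying ambiguity is hard to correct for in practice.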