Design and Implementation of a High-Performance Distributed Web Crawler

Broad web search engines, as well as many more specialized search tools, rely on web crawlers to acquire large collections of pages for indexing and analysis. Such a web crawler may interact with millions of hosts over a period of weeks or months, so issues of robustness, flexibility, and manageability are of major importance. In addition, I/O performance, network resources, and operating-system limits must be taken into account in order to achieve high performance at a reasonable cost. In this paper, the authors describe the design and implementation of a distributed web crawler that runs on a network of workstations.
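One concrete difficulty the abstract alludes to is that a crawler touching millions of hosts must spread its requests out so that no single host is overloaded. The sketch below is not the paper's data structure; it is a minimal, hypothetical URL frontier (standard-library Python only) that keeps one FIFO queue per host and a min-heap of ready times, so each host is contacted at most once per politeness interval:

```python
import heapq
import time
from collections import defaultdict, deque
from urllib.parse import urlsplit

class PoliteFrontier:
    """Hypothetical URL frontier sketch: one FIFO queue per host plus a
    min-heap of (next_allowed_time, host) entries, so no host is hit
    more often than `delay` seconds. This illustrates the politeness
    problem only; it is not the crawler described in the paper."""

    def __init__(self, delay=1.0):
        self.delay = delay
        self.queues = defaultdict(deque)   # host -> pending URLs (FIFO)
        self.ready = []                    # heap of (earliest_time, host)
        self.scheduled = set()             # hosts currently in the heap

    def add(self, url):
        """Enqueue a URL under its host's queue."""
        host = urlsplit(url).netloc
        self.queues[host].append(url)
        if host not in self.scheduled:
            heapq.heappush(self.ready, (0.0, host))
            self.scheduled.add(host)

    def next_url(self, now=None):
        """Return the next crawlable URL, or None if every host with
        pending URLs is still inside its politeness window."""
        now = time.monotonic() if now is None else now
        while self.ready:
            t, host = self.ready[0]
            if t > now:
                return None          # earliest-ready host not ready yet
            heapq.heappop(self.ready)
            q = self.queues[host]
            if not q:
                self.scheduled.discard(host)
                continue             # stale heap entry; try the next one
            url = q.popleft()
            if q:                    # more URLs for this host: reschedule
                heapq.heappush(self.ready, (now + self.delay, host))
            else:
                self.scheduled.discard(host)
            return url
        return None
```

Passing an explicit `now` makes the scheduler deterministic for testing; a real crawler would simply call `next_url()` and rely on the monotonic clock.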

Provided by: Polytechnic University | Topic: Networking | Date Added: Jan 2011 | Format: PDF
