Content Relevance Prediction Algorithm in Web Crawlers to Enhance Web Search

Provided by: International Journal of Advanced Research in Computer Engineering & Technology
Topic: Networking
Format: PDF
A Web crawler is a computer program that browses the World Wide Web in a methodical, automated fashion. Web crawlers are mainly used to make copies of the pages they visit for later processing by a search engine, which indexes the downloaded pages to provide fast searches. An efficient crawling algorithm is therefore needed to extract the required information quickly and accurately. As the number of Internet users and accessible Web pages grows, it becomes increasingly difficult for users to find documents relevant to their particular needs.
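The idea of steering a crawler toward relevant pages can be sketched as a best-first ("focused") crawl: pages are kept in a priority queue ordered by a predicted relevance score, and only pages scoring above a threshold are retained. The page names, the in-memory link graph, and the term-overlap scoring function below are illustrative assumptions, not the paper's actual prediction algorithm.

```python
import heapq

# Hypothetical in-memory "web": page id -> (page text, outgoing links).
PAGES = {
    "a": ("web crawler search engine index", ["b", "c"]),
    "b": ("cooking recipes and food", ["d"]),
    "c": ("search engine ranking and crawling", ["d", "e"]),
    "d": ("sports scores", []),
    "e": ("web search relevance prediction", []),
}

def relevance(text, query_terms):
    """Toy relevance score: fraction of query terms found in the text."""
    words = set(text.split())
    return sum(term in words for term in query_terms) / len(query_terms)

def focused_crawl(seed, query_terms, threshold=0.3):
    """Best-first crawl: visit pages in order of predicted relevance,
    returning only those whose score meets the threshold."""
    frontier = [(-1.0, seed)]  # max-heap via negated scores
    visited, relevant = set(), []
    while frontier:
        _, page = heapq.heappop(frontier)
        if page in visited:
            continue
        visited.add(page)
        text, links = PAGES[page]
        if relevance(text, query_terms) >= threshold:
            relevant.append(page)
        for link in links:
            if link not in visited:
                # A real crawler would predict relevance from anchor text
                # or URL features before fetching; here we peek at the text.
                est = relevance(PAGES[link][0], query_terms)
                heapq.heappush(frontier, (-est, link))
    return relevant

print(focused_crawl("a", ["search", "crawler", "relevance"]))
# Relevant pages are visited before off-topic ones.
```

In this sketch the crawl reaches page "e" (high predicted relevance) before the off-topic pages "b" and "d", illustrating how relevance prediction reduces wasted fetches.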
