Comparative Analysis of Robot Detection Techniques on Web Server Log

Web robots are software programs that automatically traverse the hyperlink structure of the Web to retrieve Web resources. Robots are used for a variety of tasks, such as crawling and indexing information for search engines, offline browsing, shopping comparison, and email harvesting. They can also serve malicious purposes, such as sending spam mail or stealing business intelligence. Detecting robots is therefore necessary for reasons of privacy, security, and server performance.
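As an illustration of the kind of log-based detection the paper compares, the sketch below applies two common, simple heuristics to Web server log entries: a self-identifying keyword (e.g. "bot", "crawler") in the user-agent string, or a request for /robots.txt, which well-behaved crawlers fetch before traversing a site. The log lines, keyword list, and Common Log Format pattern here are illustrative assumptions, not taken from the paper.

```python
import re

# Illustrative sample log lines in (extended) Common Log Format.
LOG_LINES = [
    '66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] '
    '"GET /robots.txt HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '192.168.1.5 - - [10/Oct/2023:13:56:01 +0000] '
    '"GET /index.html HTTP/1.1" 200 2326 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]

# Substrings often found in self-identifying robot user agents (assumption).
BOT_KEYWORDS = ("bot", "crawler", "spider", "slurp")

# Regex for one Common Log Format entry with referrer and user-agent fields.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def looks_like_robot(line: str) -> bool:
    """Flag a log entry as a likely robot using two simple heuristics:
    a bot keyword in the user-agent string, or a /robots.txt request."""
    m = LOG_PATTERN.match(line)
    if not m:
        return False
    agent = m.group("agent").lower()
    if any(keyword in agent for keyword in BOT_KEYWORDS):
        return True
    return m.group("path") == "/robots.txt"

if __name__ == "__main__":
    for line in LOG_LINES:
        print(looks_like_robot(line))
```

Heuristics like these only catch robots that identify themselves or follow crawler etiquette; camouflaged robots require the behavioral and learning-based techniques the paper surveys.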

Resource Details

Provided by:
International Journal of Advanced Research in Computer and Communication Engineering (IJARCCE)