Generic Log Analyzer Using Hadoop MapReduce Framework

Provided by: International Journal of Emerging Technology and Advanced Engineering (IJETAE)
Topic: Data Management
Format: PDF
Many applications maintain very large databases, and every database keeps log files that record changes, including various user events. Apache Hadoop can be used to process such logs at scale. Log files have become a standard part of large applications and are essential in operating systems, computer networks, and distributed systems. They are often the only way to identify and locate an error in software, because log file analysis is not affected by the timing-related distortion known as the probe effect. This contrasts with analyzing a running program, where the analytical process itself can interfere with time-critical or resource-critical conditions in the analyzed program.
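The log-processing model the abstract refers to can be illustrated with a minimal sketch of the MapReduce pattern. The sample log lines, the `mapper`, `reducer`, and `run_job` names are all hypothetical assumptions for illustration, not part of the paper; a real Hadoop job would implement `Mapper` and `Reducer` classes in Java against the Hadoop API, but the data flow is the same.

```python
from collections import defaultdict

# Hypothetical sample log lines (timestamp, time, level, message) - illustration only.
LOG_LINES = [
    "2015-01-01 10:00:01 ERROR disk full",
    "2015-01-01 10:00:02 INFO user login",
    "2015-01-01 10:00:03 ERROR timeout",
    "2015-01-01 10:00:04 WARN slow query",
    "2015-01-01 10:00:05 INFO user logout",
]

def mapper(line):
    """Emit (log_level, 1) for each line, analogous to a Hadoop Mapper."""
    parts = line.split()
    if len(parts) >= 3:
        yield parts[2], 1

def reducer(key, values):
    """Sum the counts for one key, analogous to a Hadoop Reducer."""
    yield key, sum(values)

def run_job(lines):
    """Run the map, shuffle/sort, and reduce phases sequentially."""
    # Shuffle/sort phase: group intermediate (key, value) pairs by key.
    grouped = defaultdict(list)
    for line in lines:
        for key, value in mapper(line):
            grouped[key].append(value)
    # Reduce phase: aggregate each key's values into a final count.
    result = {}
    for key, values in grouped.items():
        for k, total in reducer(key, values):
            result[k] = total
    return result

print(run_job(LOG_LINES))  # {'ERROR': 2, 'INFO': 2, 'WARN': 1}
```

In a Hadoop cluster the shuffle/sort step is performed by the framework across machines, which is what makes this pattern scale to the huge log volumes the paper targets.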