Consolidating events with free Log Parser 2.0 tool

Log Parser 2.0 is a command-line tool from Microsoft that lets you run SQL queries against almost any sort of Windows log file and send the results to an array of destinations. The price is right: You can download it from the Windows Tools site for free.


By Mike Gunderloy

Sometimes, the challenge in software is not generating information, but making sense of it. Take this scenario that I ran across on a medium-size network: I needed to collect all of the events from the event logs on a dozen or so Windows computers and look for interesting (or worrisome) patterns.

The Microsoft Operations Manager (MOM) tool is designed in part for this situation, but the system requirements for MOM are quite high, and it costs hundreds of dollars for every managed computer. Third-party solutions are quite expensive as well. The challenge in this case was to come up with a solution that wouldn’t break the client’s budget.

Log Parser 2.0
Fortunately, another Microsoft tool can do this particular job at a much more attractive cost: Log Parser 2.0, a command-line tool from Microsoft that lets you run SQL queries against almost any sort of Windows log file, and get the results out to an array of destinations, from SQL tables to CSV files. The price is certainly right: You can download it from the Windows Tools site for free.

Log Parser works as a near-universal tool for taking information from log files and putting it in other places. The download includes both a command-line tool (which I’ll use in this article) and a COM object with the same capabilities. On the input side, Log Parser can deal with a wide variety of formats:
  • IIS log files
  • URLScan log files
  • HTTP error logs
  • Windows NT event logs and EVT backup log files
  • Generic comma-separated values (CSV) or text files
  • Generic W3C files (such as personal firewall log files, Windows Media Services log files, and Exchange Tracking log files)
  • File and directory structure information

You also get a set of choices for the output of your queries. You can send results to the screen, or save them in W3C-formatted text files, IIS log format text files, SQL tables, XML files, CSV files, text files with user-defined templates, or native Log Parser files.
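
For instance, to dump the System event log to a CSV file for a quick look in Excel, a single command will do the job. This is a minimal sketch that mirrors the TO syntax and -o: switch used in the commands later in this article; the output file name is just an example:
LogParser "SELECT TimeGenerated, SourceName,
EventTypeName, Message
FROM System TO SystemEvents.csv" -o:CSV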

Consolidating events
For the job at hand, I wanted to use Windows NT event logs as the data source. For ease of analysis, I chose to send the results to a SQL Server table, since there was already a SQL Server available on the network. The first step was to create an appropriate table on the server. I set the table up by executing a SQL script in a new database named Events:
CREATE TABLE tblEvents (
      EventLog varchar(256),
      RecordNumber int,
      TimeGenerated datetime,
      EventID int,
      EventType int,
      EventTypeName varchar(256),
      EventCategory int,
      SourceName varchar(256),
      ComputerName varchar(256),
      Message varchar(256)
)


With the table in place, I could invoke Log Parser from the command line to send all of the events from the computer to the table:
LogParser "SELECT EventLog, RecordNumber,
TimeGenerated, EventID,
EventType, EventTypeName, EventCategory, SourceName,
ComputerName, Message
FROM System, Application, Security
TO tblEvents"
-o:SQL -server:BEARSPAW -driver:"SQL Server"
-database:Events -username:sa
-password:xxxxx -createtable:ON


There’s a lot to that command line, but if you look at it in pieces, it’s fairly obvious (and the Log Parser download includes full documentation of its SQL dialect for reference). The SELECT list refers to various pieces of information that can be extracted from an event log. Log Parser recognizes the special names System, Application, and Security in the FROM clause, so it knows to extract this information from the corresponding event logs. The TO clause specifies the destination: here, a SQL table, on a server named BEARSPAW. The TO clause also specifies which ODBC driver, database, and login information to use with the SQL Server.
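
After the command completes, a quick check from SQL Server’s Query Analyzer confirms that the events arrived (a sketch, assuming only the tblEvents table created above):
SELECT COUNT(*) AS EventsLoaded,
 MIN(TimeGenerated) AS Oldest,
 MAX(TimeGenerated) AS Newest
FROM tblEvents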

This was fine for getting all of the information out of the event logs. But in reality, I wanted to update the information each night. Log Parser 2.0 doesn’t include the ability to do time-based selection in its SQL dialect (I’m told this is coming in version 2.1), which left me stuck for a while. But with a bit of experimentation, I hit on an undocumented WHERE clause that would give me events from the previous 24 hours:
LogParser "SELECT EventLog, RecordNumber,
TimeGenerated, EventID,
EventType, EventTypeName, EventCategory, SourceName,
ComputerName, Message,
TO_STRING(SUB(TO_LOCALTIME(SYSTEM_TIMESTAMP()),
TimeGenerated), 'MM-dd-yyyy') AS Delta
FROM System, Application, Security
TO tblEvents WHERE (Delta ='12-01-')"
-o:SQL -server:BEARSPAW -driver:"SQL Server"
-database:Events -username:sa
-password:xxxxx -createtable:ON


There’s no particular reason I can see that subtracting the event time from the current time and formatting the difference that way should yield a predictable result, but it does. The final step of the data collection process was to wrap this command up in a batch file, place it on each of the computers, and execute it at 2:00 A.M. each night using the Windows 2000 scheduled tasks facility.
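
Here’s a sketch of what that wrapper might look like. The file name, path, and schedule are placeholders, and the LogParser invocation is simply the command above written out on a single line:
@echo off
rem CollectEvents.bat -- gather the previous day's events into SQL Server
LogParser "SELECT EventLog, RecordNumber, TimeGenerated, EventID, EventType, EventTypeName, EventCategory, SourceName, ComputerName, Message, TO_STRING(SUB(TO_LOCALTIME(SYSTEM_TIMESTAMP()), TimeGenerated), 'MM-dd-yyyy') AS Delta FROM System, Application, Security TO tblEvents WHERE (Delta ='12-01-')" -o:SQL -server:BEARSPAW -driver:"SQL Server" -database:Events -username:sa -password:xxxxx -createtable:ON

On Windows 2000, a one-time AT command on each machine then registers the nightly 2:00 A.M. run:
at 02:00 /every:M,T,W,Th,F,S,Su "C:\Scripts\CollectEvents.bat"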

Looking at the data
So much for generating the information. What about making sense of it? The nice thing about storing the information in SQL Server is that I can use the entire T-SQL language, as well as SQL Server Analysis Services, to explore the collected events. For example, I can get a list of the computers that are recording the most errors this way:
SELECT ComputerName, Count(RecordNumber)
FROM tblEvents
WHERE EventTypeName = 'Error event'
GROUP BY ComputerName
ORDER BY Count(RecordNumber) DESC


With a sample of about 60,000 events in the database, this query returns a result set that starts out:
DUCKLING   917
SUNBEAM    798
LOGCABIN   723
SKYROCKET  400
CROWFOOT   167
ANTFARM     47


There seems to be something up on the computer named DUCKLING. The next step is to look at the error events on that computer in a bit more detail, starting with the event sources:
SELECT SourceName, Count(RecordNumber)
FROM tblEvents
WHERE EventTypeName = 'Error event'
 AND ComputerName = 'Duckling'
GROUP BY SourceName
ORDER BY Count(RecordNumber) DESC


That returns this set of results:
Userenv                879
atapi                   21
FTPCtrs                  6
Schannel                 6
Service Control Manager  5


Going one step further, I can look at the actual error messages:
SELECT Message, Count(RecordNumber)
FROM tblEvents
WHERE EventTypeName = 'Error event'
 AND ComputerName = 'Duckling'
 AND SourceName = 'Userenv'
GROUP BY Message
ORDER BY Count(RecordNumber) DESC


And with that, I discover that 856 of those original 917 error events have the message “The Group Policy client-side extension Security was passed flags (17) and returned a failure status code of (1788).” Although the point of this article isn’t to diagnose Group Policy, a little poking around reveals Knowledge Base articles Q279432 and Q245422, which will help you find the inconsistency in user accounts that leads to this error.

Digging for patterns
With the data in a SQL Server database, I can also use Analysis Services to look for trends and patterns. While a detailed look at Analysis Services is beyond the scope of this article, I can give you a sense of what it does. The basic idea of Analysis Services is to make it possible to aggregate data in many different ways, without a lot of processing overhead. This is done by transforming the data from a database into an OLAP (Online Analytical Processing) cube. A cube is similar to the grouping queries that I used above, but instead of grouping on just one column, it groups on many columns at once.
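
If you’d rather type than click, a cube can also be queried directly in MDX, the Analysis Services query language. Here’s a minimal sketch that counts information events by source; the cube, dimension, and measure names are assumptions for illustration, not the names in my actual cube:
SELECT
 { [Measures].[Event Count] } ON COLUMNS,
 NON EMPTY [Source].Members ON ROWS
FROM [Events]
WHERE ( [Event Type].[Information event] )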

Figure A


Figure A shows a cube built from my 60,000 rows of sample event log data, open in the Cube Browser that’s supplied by Analysis Services. In this case, I’m looking at information events only (selected in the combo box above the grid) and investigating the connections between the source of the events and the month that the events were recorded. This provides an easy way to look for trends. With simple mouse clicks and combo box selections, you can drill into the data in almost any way imaginable.

More uses for Log Parser
The combination of Log Parser and SQL Server with Analysis Services can help any time you have a large set of log files and want to ferret out a few interesting facts. Here are a few more scenarios to get you thinking:
  • Group and filter a URLScan log to determine which attacks are being executed against your Web servers.
  • List the top 20 pages on your Web server, or the top 20 errors (see the sketch after this list).
  • Monitor for attempted break-ins by looking for failed logins in your event logs.
  • Create an Excel report by transforming your IIS logs into CSV files (which Excel can open directly).
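
To give you a flavor, here’s a sketch of that top-20-pages query, run directly against IIS log files. The ex*.log wildcard and the cs-uri-stem field assume IIS’s W3C extended log format, and the exact SQL features available vary somewhat between Log Parser versions:
LogParser "SELECT TOP 20 cs-uri-stem,
COUNT(*) AS Hits
FROM ex*.log
GROUP BY cs-uri-stem
ORDER BY Hits DESC"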

Once you start working with it, you’ll discover that Log Parser 2.0 offers a fast, flexible, and free way to get a handle on large amounts of log data. And that’s one more service you can offer to make yourself more valuable to your clients.
