Quick and Dirty IIS Log Analyzer


There are nearly a million tools for making IIS logs useful. Log Parser 2.2 from Microsoft can combine all kinds of logs and even import them into SQL for further analysis. I’ve read instructions on how to use this command-line tool but didn’t think I needed to go down that route for the problem at hand. I was asked for a simple count of requests per day from a particular client IP. There were also some strange HTTP errors, and I was asked if there was “anything in the logs”.

The application in question uses three separate IIS web servers behind a load balancer. The production environment gets about 300,000 hits (log entries) a day. The test environment gets about 1.5 million with the automated scripts that run against it. That kind of traffic makes the logs too big to read comfortably in Notepad.

You can set up IIS 6.0 to log directly to a database with ODBC, but that adds overhead. As for logging to a file, there are a few options. For starters, all of our servers are in the same time zone, so I chose the Microsoft IIS log format over the W3C format, which logs in GMT. The other choice was how to roll the log over. I kept the default of daily and chose midnight as the rollover time. If you go into IIS Manager and open the site properties, you can see these log options.

So I needed an application to pick up yesterday’s log from three servers and combine them into one SQL database table. I could then run this daily with a scheduled task.

This KB article was helpful in getting started with the W3C format: it showed me how easy a bulk import would be and also had the create table script. I started by copying one of the logs to the SQL Server and quickly realized my application would have to remove any header lines. The bulk import didn’t work if any data in the columns fell outside the defined types, e.g. a varchar(12) column receiving 20 characters or an int column filled with text.

I ended up with a query that would create a new table for each day (if it didn’t already exist) and do the bulk import from a file in the root of C: on the SQL Server. I found it was important to have the status and winstatus columns be int type so I could easily search on them ( where status > 200 ). Also, the date and time columns had to be date and time types to search for particular ranges in the logs. The rest of the fields weren’t too important to me, so they could be varchar(max) for now.
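The generated per-day T-SQL ends up looking roughly like this. Here is a sketch in Python that builds it; the table name, column names, and file path are my own guesses for illustration, not the actual schema from the KB article:

```python
def build_daily_sql(day: str, log_path: str) -> str:
    """Build the per-day T-SQL: create the table if it doesn't exist,
    then bulk import the combined log file.

    day is a yymmdd string; all table/column names here are hypothetical.
    """
    table = f"IisLog{day}"
    return f"""
IF OBJECT_ID('dbo.{table}', 'U') IS NULL
CREATE TABLE dbo.{table} (
    ClientIP     varchar(50),
    Username     varchar(255),
    LogDate      date,          -- date type, so date ranges are searchable
    LogTime      time,          -- time type, for time-of-day ranges
    Service      varchar(max),
    ServerName   varchar(max),
    ServerIP     varchar(max),
    TimeTaken    varchar(max),
    BytesRecvd   varchar(max),
    BytesSent    varchar(max),
    Status       int,           -- int, so WHERE Status > 200 works
    WinStatus    int,
    Operation    varchar(max),
    Target       varchar(max),
    Parameters   varchar(max)
);
BULK INSERT dbo.{table}
FROM '{log_path}'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n');
"""
```

The script this produces can then be run against the server with sqlcmd from a batch file.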

So my application does these steps:

1. create a yymmdd string for yesterday
2. pick up the three “in(yymmdd).log” files
3. remove any title records
4. combine the files
5. send it to the SQL Server
6. if successful, delete the .log files from the IIS servers
7. create a .sql script file with the dates and file locations

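The first few steps above can be sketched in a handful of lines of Python. The directory layout is hypothetical, and I’m assuming title records start with “#” (a W3C-style convention), neither of which comes from the original app:

```python
import datetime
import os

def yesterday_stamp(today=None):
    """Step 1: build the yymmdd string for yesterday."""
    today = today or datetime.date.today()
    return (today - datetime.timedelta(days=1)).strftime("%y%m%d")

def combine_logs(stamp, log_dirs, dest_dir, delete_originals=False):
    """Steps 2-6: pick up each server's in(yymmdd).log, drop any title
    records, and append the data lines to one combined file."""
    sources = [os.path.join(d, f"in{stamp}.log") for d in log_dirs]
    combined = os.path.join(dest_dir, f"combined{stamp}.log")
    with open(combined, "w") as out:
        for src in sources:
            with open(src) as f:
                for line in f:
                    if not line.startswith("#"):  # skip title records
                        out.write(line)
    if delete_originals:  # step 6: only after the combine succeeded
        for src in sources:
            os.remove(src)
    return combined
```

In practice dest_dir would be a share pointing at the root of C: on the SQL Server, which covers step 5 as well.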
The last step was to create a batch file and run it with a scheduled task. The batch file runs my app and then executes the .sql script.

That very quick and dirty solution has gotten me by for about 3 months without any changes. A cleaner and better version would use SSIS and probably Reporting Services to let people view their own logs. Down the road I would like to import the corresponding event logs and Perfmon CPU data.

I feel happy with my decision for now. The alternatives were to wait for the purchase of a central logging solution, find a way to get Notepad or Notepad++ to search the text files, or tell the users I didn’t have the information they were looking for. Oftentimes, as an ex-programmer, I see a simple problem and just fix it. This can be bad if too many of these hard-to-support apps pop up, but for now I have kept them to a minimum.

Posted on April 4, 2011 in Network Admin

