I’ve blogged about SQLIO a bit but never focused on it. SQLIO is a tool that generates disk activity and is useful for testing the performance of a disk subsystem. When you run it, you get results like these:
C:\Users\nujak\Desktop\SQLIO_old>sqlio -kW -t24 -s30 -o20 -fsequential -b4 -BH -LS -F1file1lun1th.txt
sqlio v1.5.SG
using system counter for latency timings, 2740654 counts per second
parameter file used: 1file1lun1th.txt
	file C:\testfile.dat with 1 thread (0) using mask 0x0 (0)
1 thread writing for 30 secs to file C:\testfile.dat
	using 4KB sequential IOs
	enabling multiple I/Os per thread with 20 outstanding
	buffering set to use hardware disk cache (but not file cache)
using specified size: 10240 MB for file: C:\testfile.dat
initialization done
CUMULATIVE DATA:
throughput metrics:
IOs/sec: 24246.70
MBs/sec:    94.71
latency metrics:
Min_Latency(ms): 0
Avg_Latency(ms): 0
Max_Latency(ms): 47
histogram:
ms:  0  1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+
%: 100  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
There are some important items in these results.
1. The date and time of the test (we’ll have to get that from somewhere else)
2. The command that was run; a trick is to give your parameter files descriptive names so the test setup is captured in the command line.
3. The IOPS
4. The MBps or bandwidth
5. The Avg Latency
The first thing you’ll want to do is build a good set of parameter files. Create small test files to test disk caching and large ones (200GB for some SANs) to test actual spindle activity. Data warehouses often have a large number of LUNs that should be tested simultaneously, so have -F1file1lun1th.txt be one test and -F1file12lun1th.txt be another. You might want to test each LUN individually and also walk your way up to find the number of LUNs that maxes out your HBAs.
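For reference, each line of a SQLIO parameter file names a test file followed by the thread count, affinity mask, and file size in MB. Based on the run shown above, 1file1lun1th.txt would contain something like this (the exact contents are an assumption reconstructed from the output, not the original file):

```text
# 1file1lun1th.txt - one 10GB file on one LUN, one thread
C:\testfile.dat 1 0x0 10240
```

A multi-LUN parameter file is the same idea with one line per LUN, each pointing at a test file on a different drive.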
Make a batch file that contains a series of SQLIO calls. They don’t have to be in any particular order because we’ll sort that out later. Then make a second batch file that calls the first and pipes its output to a text file. That text file will contain the results above repeated once for each test. To be scientific, put a wait between calls and run each command at least three times.
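As a sketch of what those two batch files might look like (the file names, the Results folder, and the 60-second pause are my own assumptions, not from the original post) — note there is no @echo off, so cmd echoes each command line into the capture file, which is what the parser below keys on:

```batch
:: runtests.bat - the series of SQLIO calls, each repeated for consistency
sqlio -kW -s30 -o20 -fsequential -b4 -BH -LS -F1file1lun1th.txt
timeout /t 60
sqlio -kW -s30 -o20 -fsequential -b4 -BH -LS -F1file1lun1th.txt
timeout /t 60
sqlio -kW -s30 -o20 -fsequential -b4 -BH -LS -F1file1lun1th.txt
timeout /t 60
sqlio -kW -s30 -o20 -fsequential -b4 -BH -LS -F1file12lun1th.txt
timeout /t 60
```

```batch
:: runall.bat - run the test series and capture everything in one text file
call runtests.bat > Results\sqlio_output.txt 2>&1
```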
Once that is done, you set up your parsing program with… none other than my favorite: VB.NET.
Module Module1

    Public CurrentDir As String = System.Environment.CurrentDirectory & "\"
    Public OutputFile As String = CurrentDir & Format$(Date.Now, "yyyyMMddHHmmss") & "output.csv"
    Public TempDir As String = CurrentDir & "Results\"
    Public dir_info As New IO.DirectoryInfo(TempDir)

    Sub Main()
        parseresults()
    End Sub

    Function parseresults() As Boolean
        Try
            Dim file_info_ar As IO.FileInfo() = dir_info.GetFiles()
            Dim file_info As IO.FileInfo
            Dim line As String
            Dim contents() As String
            ' header row for the csv
            Dim csvfile As String = "time,cmds,iops,bw,late" & System.Environment.NewLine

            For Each file_info In file_info_ar
                contents = System.IO.File.ReadAllLines(TempDir & file_info.Name)
                For Each line In contents
                    ' the echoed command line that produced this block of results
                    If line.Contains("C:\SQLIO>sqlio") Then
                        csvfile = csvfile & file_info.Name & "," & line
                    End If
                    If line.Contains("IOs/sec:") Then
                        csvfile = csvfile & "," & line.Substring(8, line.Length - 8)
                    End If
                    If line.Contains("MBs/sec:") Then
                        csvfile = csvfile & "," & line.Substring(8, line.Length - 8)
                    End If
                    ' Avg_Latency closes out the row for this test
                    If line.Contains("Avg_Latency") Then
                        csvfile = csvfile & "," & line.Substring(17, line.Length - 17) & System.Environment.NewLine
                    End If
                Next
            Next

            System.IO.File.WriteAllText(OutputFile, csvfile)
        Catch ex As Exception
            System.IO.File.AppendAllText("exceptions.log", ex.ToString)
            Return False
        End Try
        Return True
    End Function

End Module
I decided not to go directly to SQL with this because a .csv can be opened in Excel, and Excel makes it very easy to graph the results. Also, csv’s pop into SQL quite easily if you want to query the data. SQLIO definitely has its place even though it doesn’t simulate real-world activity. If you find your bandwidth pegged at a solid 1Gbps, for example, it’s much easier to identify which piece of hardware to upgrade.
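If you do want to query the data from SQL Server, a minimal sketch of loading the csv might look like this (the table name, column types, and output file path are assumptions for illustration):

```sql
-- Assumed table matching the parser's time,cmds,iops,bw,late columns
CREATE TABLE dbo.SqlioResults (
    [time] VARCHAR(128),   -- source file name written by the parser
    cmds   VARCHAR(512),   -- the echoed sqlio command line
    iops   DECIMAL(18, 2),
    bw     DECIMAL(18, 2),
    late   DECIMAL(18, 2)
);

-- Hypothetical path; the parser names its output yyyyMMddHHmmss + "output.csv"
BULK INSERT dbo.SqlioResults
FROM 'C:\SQLIO\20120101120000output.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);
```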