Here’s a bash script that you can use to parse multiple MySQL Slow Query Log files, in one sweep, into something much more understandable. The script uses the handy mysqlsla utility, so make sure it is in your path.

mysqlsla parses, filters, analyzes and sorts MySQL slow, general, binary and microslow-patched logs in order to create a customizable report of the queries and their meta-property values. Since these reports are customizable, they can be used for human consumption or be fed into other scripts for further analysis, such as profiling every unique SELECT statement against database foo from a slow log with mk-query-profiler (a script from Baron Schwartz’s Maatkit).
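For a quick feel of the tool before running the full script, a single log can be summarised on its own. This is a minimal sketch; the log path is a placeholder, and mysqlsla must already be installed:

```shell
# Summarise one slow query log into a plain-text report.
# -lt slow tells mysqlsla the log type, as the script below also does.
mysqlsla -lt slow /var/log/mysql/mysql-slow.log > mysql-slow.rpt
```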

Place all your slow logs into a directory and change the sl_dir variable to point at it. When you execute the script, it will create a directory called reports inside your slow logs directory; this will contain the reports produced by mysqlsla.

#!/bin/bash

# Script to process multiple mysql slow logs
# using mysqlsla http://hackmysql.com/mysqlsla

# Directory containing slow logs
sl_dir="/home/rhys/Desktop/slow_logs";

cd "$sl_dir";

# Folder for reports
if [ ! -d "$sl_dir/reports" ]; then
    mkdir "$sl_dir/reports"
fi

# process each slow log file
for file in "$sl_dir"/*
do
    # Skip directories (e.g. the reports folder itself)
    [ -d "$file" ] && continue
    echo "Processing file: $file"
    filename=$(basename "$file")
    mysqlsla -lt slow "$file" > "$sl_dir/reports/$filename.rpt"
    echo "Finished processing file: $file"
done
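If mysqlsla isn’t installed yet, the file-handling logic of the loop can still be sanity-checked with a stand-in command. The sketch below replaces the mysqlsla call with a simple echo; the temp directory, file names and stub report text are purely illustrative:

```shell
#!/bin/bash
# Dry run of the same pattern using a temp directory and a stub
# in place of mysqlsla, so the file handling can be verified alone.
sl_dir=$(mktemp -d)
touch "$sl_dir/db1-slow.log" "$sl_dir/db2-slow.log"

mkdir -p "$sl_dir/reports"   # -p: succeeds even if the folder already exists

for file in "$sl_dir"/*; do
    [ -d "$file" ] && continue          # skip the reports folder itself
    filename=$(basename "$file")
    # stand-in for: mysqlsla -lt slow "$file" > "$sl_dir/reports/$filename.rpt"
    echo "stub report for $filename" > "$sl_dir/reports/$filename.rpt"
done

ls "$sl_dir/reports"
```

Swapping the echo back for the real mysqlsla call restores the behaviour of the script above.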

The reports produced are much easier to work with than the raw MySQL logs, so this should be a good time saver when optimising those queries!