Recently at work, my team was asked to help gather data about database server performance before and after an upgrade. To do this, we identified a number of database-heavy pages on some WordPress sites, dumped every query that runs to generate each page, and collected those queries to profile.
I whipped up this quick and dirty WP-CLI command that runs a list of SQL queries a set number of times (100 in the snippet below) and gives you the minimum time, maximum time, average time, and standard deviation for each query.
With this data, you can re-run the same queries after a server change to determine whether there have been any major differences in SQL performance:
/**
 * Profiles DB performance by running SQL queries and returning timing statistics.
 *
 * @param array $args       Positional arguments (unused).
 * @param array $assoc_args Associative arguments, e.g. --format=csv.
 */
public function db_profile( $args, $assoc_args ) {
	$format = \WP_CLI\Utils\get_flag_value( $assoc_args, 'format', 'table' );

	// Put all of your queries here, one per line!
	$site_sql_lines = <<<END
SHOW TABLES LIKE 'wp_posts';
END;

	$lines = explode( PHP_EOL, $site_sql_lines );

	global $wpdb;

	$stats         = array();
	$total_time_us = 0;
	$runs          = 100;

	$progress = \WP_CLI\Utils\make_progress_bar( sprintf( 'Running %s Queries', number_format( count( $lines ) * $runs ) ), count( $lines ) * $runs );

	for ( $run = 1; $run <= $runs; $run++ ) {
		foreach ( $lines as $index => $line ) {
			if ( ! isset( $stats[ $index ] ) || ! is_array( $stats[ $index ] ) ) {
				$stats[ $index ] = array(
					'query' => $line,
					'runs'  => array(),
				);
			}

			// hrtime() returns nanoseconds; dividing by 1,000 gives microseconds.
			$start_time = hrtime( true );
			$wpdb->get_results( $line, ARRAY_N ); // phpcs:ignore WordPress.DB.DirectDatabaseQuery, WordPress.DB.PreparedSQL.NotPrepared
			$end_time   = hrtime( true );
			$time_us    = ( $end_time - $start_time ) / 1000;

			$total_time_us += $time_us;

			$stats[ $index ]['runs'][ $run ] = $time_us;

			$progress->tick();
		}
	}

	$data = array();
	foreach ( $stats as $index => $stat ) {
		$stats[ $index ]['stats'] = array(
			'min' => min( $stats[ $index ]['runs'] ),
			'max' => max( $stats[ $index ]['runs'] ),
			'avg' => array_sum( $stats[ $index ]['runs'] ) / count( $stats[ $index ]['runs'] ),
		);

		// Convert microseconds to milliseconds for display.
		$data[ $index ] = array(
			'Query'  => $stat['query'],
			'Min'    => number_format( (float) $stats[ $index ]['stats']['min'] / 1000, 3 ),
			'Max'    => number_format( (float) $stats[ $index ]['stats']['max'] / 1000, 3 ),
			'Avg'    => number_format( (float) $stats[ $index ]['stats']['avg'] / 1000, 3 ),
			'StdDev' => number_format( (float) $this->stats_standard_deviation( $stats[ $index ]['runs'] ) / 1000, 3 ),
		);

		// Include the raw per-run timings when exporting to CSV.
		if ( 'csv' === $format ) {
			$data[ $index ]['Runs'] = wp_json_encode( $stats[ $index ]['runs'] );
		}
	}

	$progress->finish();

	// Output data.
	\WP_CLI\Utils\format_items( $format, $data, array_keys( $data[0] ) );

	// Output totals if we're not piping somewhere.
	if ( ! \WP_CLI\Utils\isPiped() ) {
		WP_CLI::success(
			sprintf(
				'Total queries ran: %s, DB Time Taken: %s',
				WP_CLI::colorize( '%g' . number_format( count( $lines ) * $runs, 0 ) . '%n' ),
				WP_CLI::colorize( '%g' . $this->convert_to_human_readable( $total_time_us ) . '%n' )
			)
		);
	}
}
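The command references two small helper methods that aren't shown above: stats_standard_deviation() and convert_to_human_readable(). Here is a minimal sketch of what they might look like; these are assumptions based on how they're called, not the original implementations, so adapt them to your own class:

/**
 * Population standard deviation of the per-run timings (in microseconds).
 */
protected function stats_standard_deviation( array $values ) {
	$count    = count( $values );
	$mean     = array_sum( $values ) / $count;
	$variance = 0;
	foreach ( $values as $value ) {
		$variance += pow( $value - $mean, 2 );
	}
	return sqrt( $variance / $count );
}

/**
 * Renders a microsecond total as a friendlier string.
 */
protected function convert_to_human_readable( $time_us ) {
	if ( $time_us < 1000 ) {
		return number_format( $time_us, 0 ) . 'µs';
	}
	if ( $time_us < 1000000 ) {
		return number_format( $time_us / 1000, 2 ) . 'ms';
	}
	return number_format( $time_us / 1000000, 2 ) . 's';
}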
You can then run it and gather the data as a table, or as a CSV file (--format=csv), which is much more likely to be useful when comparing before and after a change.
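To actually run the method, register it as a WP-CLI command from your plugin or mu-plugin. The command and class names below are just examples:

if ( defined( 'WP_CLI' ) && WP_CLI ) {
	// Registers the method above as `wp db-profile`.
	WP_CLI::add_command( 'db-profile', array( new My_Profile_Command(), 'db_profile' ) );
}

After that, wp db-profile --format=csv will print the per-query timing stats in CSV form, ready to redirect to a file and compare against a later run.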