Automated performance testing for core - BOF/discussion notes

This is a summary of some of the discussions around automated performance testing of core at DrupalCon Chicago. It's incomplete, but this is a wiki so please help fill in the gaps.

Test runs

Rather than trying to integrate this with PIFR, we want to run tests against git branches, triggered by git pushes. Initially this would allow post-commit/push testing of HEAD, which would only catch regressions after the fact, but from there we can extend it to forks of HEAD in sandboxes, per-issue branches, etc.
A comprehensive outline of what this would require is drafted here: http://drupal.org/node/638078
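As a rough sketch of the trigger side: a git post-receive hook receives one "oldrev newrev refname" line per updated ref on stdin, so a small script could pick out the pushed branches and queue a test run for each. Function names here are hypothetical, and Python is used purely for illustration:

```python
def parse_push_lines(lines):
    """Parse the 'oldrev newrev refname' lines that git feeds a
    post-receive hook on stdin; return the branch names that were
    pushed to. Tags and other refs are ignored."""
    branches = []
    for line in lines:
        oldrev, newrev, refname = line.split()
        if refname.startswith("refs/heads/"):
            branches.append(refname[len("refs/heads/"):])
    return branches

# In a real hook, each branch from parse_push_lines(sys.stdin) would be
# handed to whatever queues a performance-test run (details TBD).
```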

Measurement

To avoid issues with hardware inconsistencies, Damien had the idea of using control groups (http://www.kernel.org/doc/Documentation/cgroups/). These let you assign resources to groups of processes, audit their usage, and report on it - so we could measure CPU, memory, disk, and potentially network traffic for PHP and the database separately, and reliably across different hardware (at least much more reliably than ab, JMeter, or any equivalents that measure wall time).
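For illustration, cgroup stat files such as cpuacct.stat and memory.stat use a flat "name value" per-line format, so a runner could snapshot a cgroup's counters before and after a test run and report the delta. A minimal sketch (function names are made up, Python for illustration):

```python
def parse_cgroup_stat(text):
    """Parse the flat 'name value' format used by cgroup stat files
    such as cpuacct.stat and memory.stat."""
    stats = {}
    for line in text.strip().splitlines():
        name, value = line.split()
        stats[name] = int(value)
    return stats

def diff_stats(before, after):
    """Resource usage attributable to a test run: the delta between
    two snapshots of the same cgroup's counters."""
    return {name: after[name] - before[name] for name in before}

# Usage sketch: read e.g. /sys/fs/cgroup/cpuacct/php/cpuacct.stat into
# parse_cgroup_stat() before and after the run, then diff_stats() them.
```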

We could also profile the test runs with xhprof (http://pecl.php.net/package/xhprof), then aggregate and diff results across and between test runs - Narayan found a project that wraps some of this. xhprof also measures CPU time as well as wall time, so again this should be mostly reliable across different environments (at least a lot more reliable than Xdebug or microtime()).
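xhprof_disable() in PHP returns an array keyed by "parent==>child" symbol, with metrics such as ct (call count), wt (wall time), cpu, and mu (memory). Assuming runs were exported to JSON, the diffing step could look something like this sketch (helper names are hypothetical):

```python
def diff_xhprof_runs(base, head):
    """Diff two xhprof runs (symbol -> metric dict). Symbols missing
    from one run are treated as all-zero on that side."""
    diff = {}
    for sym in set(base) | set(head):
        b = base.get(sym, {})
        h = head.get(sym, {})
        diff[sym] = {m: h.get(m, 0) - b.get(m, 0) for m in set(b) | set(h)}
    return diff

def top_regressions(diff, metric="cpu", n=10):
    """Symbols with the largest increase in the given metric -
    the candidates for a performance regression report."""
    return sorted(diff.items(),
                  key=lambda kv: kv[1].get(metric, 0),
                  reverse=True)[:n]
```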

Test plans

Since we are trying to measure application performance rather than actually load-test hardware, and since we want test plans that can be contributed to and maintained by as many people as possible, we’d like to use one of the more recent browser automation APIs (Selenium, Watir, or one of the JavaScript ones like zombie.js) to get clients that can browse pages, submit forms, etc.
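One way to keep plans easy to contribute is to make them declarative and driver-agnostic: a plan is just a list of steps, and a Selenium, Watir, or zombie.js binding is adapted behind a small driver interface. A sketch under those assumptions (all names hypothetical):

```python
def run_plan(driver, steps):
    """Execute a declarative test plan against any driver object that
    exposes methods matching the step names (here: visit, fill,
    submit). The real driver would wrap a browser automation API."""
    for action, args in steps:
        getattr(driver, action)(*args)

# A hypothetical plan: log in as an admin user.
login_plan = [
    ("visit", ("/user/login",)),
    ("fill", ("name", "admin")),
    ("fill", ("pass", "secret")),
    ("submit", ("user-login",)),
]
```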

Doing it this way also gives us a start towards automated front-end testing (actual page load measurements, resource load/parse times, JavaScript execution timing, etc.) along the lines of browsermob's offering. The actual collection of measurements could be accomplished through browser plugins (for example, Page Speed can be set to dump measurements into a file) or by adding JavaScript to the page that collects measurements (such as the Episodes module does) and submits them to a collector service at the end of the page load.
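On the collector side, the service would aggregate the timing samples submitted from pages into per-measurement summaries. A sketch of that aggregation (names hypothetical, Python for illustration):

```python
def aggregate_timings(samples):
    """Aggregate timing samples posted by in-page JavaScript
    (e.g. Episodes-style measurements). samples is a list of
    (name, milliseconds) pairs; returns name -> summary stats."""
    by_name = {}
    for name, ms in samples:
        by_name.setdefault(name, []).append(ms)
    out = {}
    for name, values in by_name.items():
        values.sort()
        out[name] = {
            "count": len(values),
            "median": values[len(values) // 2],
            # Clamp the index so small sample sets still work.
            "p95": values[min(len(values) - 1, int(len(values) * 0.95))],
        }
    return out
```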

The basic browser infrastructure could probably be reused for doing JavaScript unit and functional tests.

As well as full page requests, we’d also start a contrib project with some microbenchmark PHP scripts (e.g. individual steps of the bootstrap, rendering a node to JSON, running check_plain() on various strings), and test plans could then request these to generate reporting for that kind of thing as well.
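The scripts themselves would be PHP, but the harness that repeats one and reports per-call timing could be very small. A sketch (Python for illustration; process CPU time is used here for the same reason xhprof's CPU counters are preferred over wall time - it is less sensitive to other load on the machine):

```python
import time

def microbenchmark(func, iterations=1000):
    """Run one microbenchmark callable many times and report average
    CPU time per call, in seconds."""
    start = time.process_time()
    for _ in range(iterations):
        func()
    elapsed = time.process_time() - start
    return elapsed / iterations
```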

Timeline

We’d like to get something up and running as soon as possible, since anything is better than nothing, and the more that is in place as Drupal 8 picks up speed, the better. We can then improve the integration, reporting, and test cases over time.

Attachment: perftest.png (26.91 KB)

Comments

Memory Usage/Leak Testing

mikeytown2:

I think this could be done in a fairly consistent way. This thread has some test code I wrote for finding memory leaks in Views:
http://drupal.org/node/988680

Unlike CPU time, memory usage should be fairly consistent from run to run.
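A sketch of that kind of check (hypothetical, in Python rather than the PHP of the linked thread): after a warm-up period in which static caches fill, the memory retained at the end of each iteration should plateau, so steady growth flags a possible leak.

```python
def looks_like_leak(snapshots, warmup=2):
    """Heuristic leak check. snapshots are memory-usage readings
    (e.g. memory_get_usage() values) taken after each iteration of
    the same operation. Growth during the first `warmup` iterations
    is expected (static caches filling); strictly increasing usage
    after that suggests a leak."""
    steady = snapshots[warmup:]
    if len(steady) < 2:
        return False
    return all(b > a for a, b in zip(steady, steady[1:]))
```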
