Speed Test Comparison: GTMetrix vs WebPagetest

Site speed and its impact on SEO have been among the major topics in digital marketing for 2017 so far. Mobile continues to be the hot topic, so it’s no wonder that the art of “making stuff load fast for everyone”, regardless of device, has been such a talking point.

AMP (Accelerated Mobile Pages) is a big step towards the point where we can rely on content loading near-instantly on our mobiles, but a lot of sites are still struggling to serve their content in a timely fashion.

At the same time, we have better free tools than ever at our disposal for site speed analysis. Two of the most commonly used and well-regarded are GTMetrix, developed by GT.net originally as a tool for their hosting customers to gauge the performance of their sites; and WebPagetest, an open-source project now primarily developed and supported by Google, but originally the brainchild of AOL, of all companies!

Both of these tools are that rare combination: excellent and free. I wanted to pit them against each other and find out how different the results they’d give me for the same page could be. I also wanted to find out which had the better feature set and which produced the clearest, most helpful results in the end.

With that in mind I picked a well-trafficked, multimedia homepage that hosted third-party resources (and which will remain nameless) and, first making sure that both tests were running from the London, UK test region on the same Chrome browser, ran off some tests.



GTMetrix lays out your results in a very clean and clear way, with easily snip-able graphics that I find quite useful.

PageSpeed Insights mainly docked points for the following issues:

  • Numerous redirect chains
  • Lack of Keep-Alive (HTTP persistent connection) between the site and a major ad provider.
  • Lack of browser caching
  • Serving resources from an inconsistent URL
  • Lack of GZIP compression
  • HTTP request headers that were too big to fit in a single packet.
  • Under-optimised images
  • Lack of Last-Modified or ETag.
  • Under-use of asynchronous loading for external JavaScript.
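Most of the header-related points in that list (Keep-Alive, browser caching, GZIP, Last-Modified/ETag) can be spotted straight from a page’s response headers. As a rough sketch of the kind of check these tools run – the `audit_headers` helper and the sample headers below are my own invention, not anything from GTMetrix:

```python
# Flag common speed issues from a captured set of HTTP response headers.
# audit_headers and the sample dict are illustrative, not from either tool.

def audit_headers(headers):
    """Return likely speed issues, given lower-cased header names."""
    issues = []
    if "gzip" not in headers.get("content-encoding", ""):
        issues.append("no GZIP compression")
    if headers.get("connection", "").lower() == "close":
        issues.append("Keep-Alive disabled")
    if "cache-control" not in headers and "expires" not in headers:
        issues.append("no browser caching headers")
    if "last-modified" not in headers and "etag" not in headers:
        issues.append("no Last-Modified or ETag for revalidation")
    return issues

sample = {
    "content-type": "text/html; charset=utf-8",
    "content-encoding": "gzip",
    "connection": "close",   # no persistent connection
}
print(audit_headers(sample))
# → ['Keep-Alive disabled', 'no browser caching headers',
#    'no Last-Modified or ETag for revalidation']
```

Real responses would be fetched over the wire, of course, but the logic of the checks is the same.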

YSlow was similar, flagging up the major issues as:

  • Lack of expiration headers
  • Excessive use of external JavaScript files
  • Numerous redirect chains
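Redirect chains crop up in both reports, and they are costly because every hop adds a full round trip before any content arrives. A small illustration over an invented redirect map (`chain` and the example URLs are hypothetical):

```python
# Hypothetical redirect map; in reality each hop is an extra HTTP round trip.
redirects = {
    "http://example.com/":      "https://example.com/",
    "https://example.com/":     "https://www.example.com/",
    "https://www.example.com/": None,   # final destination
}

def chain(url, redirects):
    """Follow a redirect map and return every URL visited."""
    hops = [url]
    while redirects.get(url):
        url = redirects[url]
        hops.append(url)
    return hops

print(chain("http://example.com/", redirects))
# → three URLs, i.e. two redirects before any content is served
```

Collapsing a chain like this down to a single hop is usually one of the cheapest wins going.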

Each of these issues was broken down with an explanation of the issue and a list of the affected URLs/files. I found that this information was considerably more detailed than the information provided on Google’s own PageSpeed Insights results for the same page, oddly.

Useful tooltips were also provided for each area, explaining jargon and offering a link to a more detailed overview on a separate URL.

Issues were listed in order of impact upon the final score, which again was a useful way of prioritising fixes.

Additional information was available in the form of:

A full waterfall graph

A useful snapshot of the timings between connection and full load.

Once an account has been registered the site also offers page load videos and access to historical tests for comparison purposes.

Running the same page through Google’s own PageSpeed Insights tool returned a considerably different score to the 57% reported by GTMetrix. The issues reported were similar but considerably less detailed than the rundown I got from GTMetrix:

  • Optimize images
  • Leverage browser caching
  • Enable compression
  • Minify HTML
  • Minify JavaScript
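Of those, “Enable compression” is the easiest to quantify: GZIP tends to shrink repetitive text assets like HTML dramatically. A quick self-contained check using Python’s standard gzip module (the sample markup is made up):

```python
import gzip

# Repetitive markup compresses extremely well; the sample HTML is invented.
html = ("<li class='nav-item'><a href='/page'>Link</a></li>\n" * 200).encode("utf-8")

compressed = gzip.compress(html)
saving = 100 * (1 - len(compressed) / len(html))
print(f"{len(html)} bytes -> {len(compressed)} bytes ({saving:.0f}% smaller)")
```

Real pages are less repetitive than that, but double-digit percentage savings on HTML, CSS and JavaScript are typical.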

Using the YSlow plugin directly also returned slightly different results, grading the site as a ‘D’ overall compared to the ‘E’ issued by GTMetrix.


Not quite as clean or clear an interface as GTMetrix, but lots of data to chew through.

Rather than a breakdown of issues, WebPagetest puts the waterfall front and centre – helpful if that’s the main thing you’re looking for.

The details page gives a very detailed breakdown of every request made during the load process. There’s not much guidance available to help contextualise this, however.

The Performance Review tab lists each requested resource, then tallies off whether it has been optimised against the following metrics:

  • Keep-Alive tag present
  • GZIP compression enabled
  • Image has been compressed (if image it be)
  • Use of Progressive JPEGs
  • Checks for use of caching
  • Checks whether it was served via a CDN.

This is followed by a full plain text report on any issues.
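Of those checks, the progressive JPEG one is straightforward to reproduce yourself: progressive JPEGs carry an SOF2 (0xFFC2) frame marker where baseline files use SOF0 (0xFFC0). A rough sketch – a robust tool would walk the marker segments properly rather than scanning, and the byte strings below are fabricated stand-ins for real files:

```python
def is_progressive_jpeg(data: bytes) -> bool:
    """Rough check: scan for the SOF2 (0xFFC2) marker that progressive
    JPEGs use; baseline files use SOF0 (0xFFC0). A production tool would
    parse the marker segments properly instead of scanning raw bytes."""
    if not data.startswith(b"\xff\xd8"):   # SOI marker: not a JPEG at all
        return False
    return b"\xff\xc2" in data

# Tiny fabricated byte strings, just to exercise the check:
baseline    = b"\xff\xd8\xff\xc0" + b"\x00" * 8
progressive = b"\xff\xd8\xff\xc2" + b"\x00" * 8
print(is_progressive_jpeg(baseline), is_progressive_jpeg(progressive))
# → False True
```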

The Content Breakdown tab splits resources by content type and displays them both as a chart and as a pie graph. This can be useful for working out which resources are most often left uncompressed, or which are putting the most strain on the server during load.
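The same kind of breakdown is easy to reproduce from raw request data if you’d rather crunch it yourself. A sketch over an invented request list (the URLs and sizes are placeholders):

```python
from collections import Counter

# Invented request list, shaped like a simplified waterfall export.
requests = [
    {"url": "/",          "type": "html",  "bytes": 48_000},
    {"url": "/app.js",    "type": "js",    "bytes": 310_000},
    {"url": "/vendor.js", "type": "js",    "bytes": 520_000},
    {"url": "/style.css", "type": "css",   "bytes": 90_000},
    {"url": "/hero.jpg",  "type": "image", "bytes": 730_000},
]

# Tally total transfer size per content type, largest first.
by_type = Counter()
for r in requests:
    by_type[r["type"]] += r["bytes"]

for content_type, total in by_type.most_common():
    print(f"{content_type:>5}: {total / 1024:.0f} KB")
```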

Further pie charts (full details are available in plain text or table format immediately underneath) break down the domains each resource was requested from, and show which events and categories took up the most processing time during the load process.

The level of detail available on WebPagetest is impressive, but the way it’s displayed can be overwhelming. Fortunately it does offer robust export functions, allowing the raw data to be pulled out in CSV format (where it can be battered into more legible shape) or, perhaps in an attempt to make this into a quick reporting feature, the HTML pages making up the report can be exported in full.
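Once the CSV is exported, the standard csv module makes short work of that battering. The column names below are placeholders of my own, not WebPagetest’s actual export schema:

```python
import csv, io

# Placeholder CSV - the real column names depend on the export, so treat
# url/content_type/bytes_in/time_ms as hypothetical stand-ins.
raw = """url,content_type,bytes_in,time_ms
/,text/html,48000,120
/app.js,application/javascript,310000,450
/hero.jpg,image/jpeg,730000,900
"""

rows = list(csv.DictReader(io.StringIO(raw)))
slowest = max(rows, key=lambda r: int(r["time_ms"]))
print(slowest["url"], slowest["time_ms"] + "ms")
# → /hero.jpg 900ms
```

From there it’s a short hop to sorting, filtering and charting in whatever tool you prefer.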


I’m inclined to say that GTMetrix is going to be entirely adequate for the vast majority of cases – the ability to quickly produce a prioritised list of actionables is very alluring. That said, the extra detail available via WebPagetest is definitely going to sway some users. I was also surprised to see that YSlow and PageSpeed Insights both produced different results when I ran them independently – it may be that the best option is to use a combination of these tools, collate the info and attack it that way.

About the author: "Blueclaw's Senior Technical SEO likes canonical tags, URL parameters and long walks on the beach (alright, site migrations). Can typically be found tinkering with the innards of the nearest eCommerce site."