[This content was originally published in the Digital Echidna Blog on October 31, 2019]
A slow site hurts search engine optimization, increases bounce rate, and lowers conversions. Taking a surgical approach and understanding the cause of the problems can help you effectively ‘treat’ the site and alleviate the symptoms.
*This blog is part three in Martin Anderson-Clutz's ongoing series devoted to site speed performance issues.*
On-site assets (i.e. images, fonts, JS and CSS files) are the most common cause of long page load times. If you’ve identified that part of your speed problem relates to on-site assets, it’s a good idea to narrow down where specifically there are issues to address. To do this, you must audit your site.
Here are two Google products you can use to run audits:
- PageSpeed Insights
- Lighthouse (built into Chrome’s DevTools)
Both use the same analysis engine, so you will see similar results from either. If you are tuning performance for SEO, use PageSpeed Insights: it runs on Google servers and will better reflect the potential SEO impact. That said, being able to run Lighthouse tests locally can be invaluable, especially during development, to catch any issues prior to putting a site into production.
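As a quick sketch, Lighthouse can also be run from the command line via its npm package (the URL below is a placeholder for your own site):

```shell
# Run a Lighthouse performance-only audit from the command line.
# Requires Node.js; replace the URL with the page you want to audit.
npx lighthouse https://www.example.com \
  --only-categories=performance \
  --output=html --output-path=./report.html
```

This produces the same style of report you'd see in Chrome's DevTools, saved as a local HTML file you can share with your team.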
In addition to the provided recommendations, the audit results will often include an estimate of the potential time savings and an estimate of the impact per asset, which will help you prioritize your time and correct issues with files creating the most overhead.
If it is CSS or JS that is slowing your site down, out-of-the-box Drupal has the ability to aggregate these two file types (CSS and JS) in order to minimize the number of requests to the server and reduce their drag on page load time.
That said, if your hosting environment is using HTTP/2, it may not benefit from aggregation. On the other hand, adding a contrib module like Advanced Aggregation can help to minify and compress your CSS and JS files for faster delivery, using gzip, or even the newer Brotli compression algorithm.
Tip: Here, as with our discussion about modules in the last instalment, less is more. If you’re thinking about adding a library just to give your site all the latest interactive ‘hotness’, consider first if that extra functionality outweighs the performance impact of the extra requests and page weight.
Library vs Custom Code
Sometimes the functionality you really need can be accomplished with a few lines of custom code instead of loading a larger, more robust library that, while providing more functionality, will take longer to load and may have its own dependencies -- for example, on a specific version of jQuery.
You can provide tabs functionality by using jQuery UI, but a full set of the jQuery UI code is over 500kb. You can be selective and use just the jQuery UI code for tabs, but that will still be over 23kb. Custom code can achieve basic tabs functionality with 1kb or less. But, if you must add a library to your site, try to only add it where it’s really needed. In Drupal 8 you can define a library using a *.libraries.yml file and then attach it where needed using the “#attached” key within a preprocess or hook_element_info_alter, or by using Drupal’s custom Twig function attach_library within your template.
Tip: If you want to add a library to allow users to view the contents as an interactive table, then add the library in a way where it’s only loaded on the pages that would use it.
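As a minimal sketch of that approach (the module, library, and file names here are all illustrative), a library for a table-enhancing script might be declared in a custom module's mymodule.libraries.yml like this:

```yaml
# mymodule.libraries.yml (module and file names are illustrative)
fancy-table:
  version: 1.x
  js:
    js/fancy-table.js: {}
  css:
    component:
      css/fancy-table.css: {}
  dependencies:
    - core/drupal
```

You could then attach it only in the template for pages that need it, with `{{ attach_library('mymodule/fancy-table') }}` in Twig, or with `$variables['#attached']['library'][] = 'mymodule/fancy-table';` in a preprocess function, so the assets are never loaded anywhere else.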
Bootstrap and CSS
If you're using Bootstrap, there is a good chance your audit will report a large amount of CSS being delivered. Bootstrap comes with a massive amount of code and is designed to have unnecessary parts stripped away. When we run audits we often see sites where the original build left the Bootstrap code intact and only added to it, resulting in a massive amount of CSS being loaded (unnecessarily) on every page.
Tip: Wherever you can, strip away code on your site that isn’t being used. And if you’re sure a framework is the best approach for your site, try to use one that’s lightweight.
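One common way to trim Bootstrap is to compile your theme's CSS from Bootstrap's Sass source, importing only the parts the site actually uses instead of the full compiled stylesheet. A minimal sketch (partial names follow Bootstrap 4's source layout; the import path depends on where your build installs Bootstrap):

```scss
// style.scss: pull in only the Bootstrap pieces this site uses.
// functions, variables and mixins are required by the other partials.
@import "bootstrap/scss/functions";
@import "bootstrap/scss/variables";
@import "bootstrap/scss/mixins";
@import "bootstrap/scss/reboot";
@import "bootstrap/scss/grid";
// Deliberately omitted: carousel, modal, tooltips, etc. -- add a
// component's partial back only when the site actually uses it.
```

Every component you leave out is CSS your visitors never have to download.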
Web sites continue to get more and more visual, as people follow the accepted wisdom that people don’t read on the web (hopefully this page is an exception! ha). The increasing use of images can cause performance issues -- especially if there isn’t careful work to optimize them. Your audit will tell you where and what images could be optimized to decrease page load time.
To understand when to use each web format, we first need to distinguish between graphic images, characterized by large areas of solid colour (as in a logo or chart), and photographic images, with continuous tones and varying amounts of detail.
Let’s compare some popular image formats for logos and charts:
- JPEG - 13.0k
- GIF - 12.3k
- WebP - 25.9k
- PNG (RGB) - 15.4k
- PNG (8 bit) - 8.7k
- SVG - 4.7k
From these examples we can see a few things about the suitability of these formats for graphic images:
- SVGs are the smallest, can be scaled up or down to any size, and will maintain clean edges. That said, you typically need to start with vector artwork, or you’ll have to redraw the artwork, which is time-consuming.
- JPEG compression is often used for graphic images (especially if transparency isn’t required), but because it’s a lossy format it can create compression artifacts. For our example, we used a 70% quality index. A lower index rating can produce a smaller file, but with more visible artifacts. Typically we would recommend against using JPEG for graphics.
- 8-bit PNGs are a typical go-to for graphics like logos and charts, especially if vector source files aren’t available. For more complex graphics (especially where banding or dithering is visible) it would be worth comparing the result and size with an RGB PNG (which is bigger because it uses a larger colour palette) or a WebP image (which is Google’s attempt at a blend of PNG and JPEG, and we’ll discuss in more detail below).
And now to compare the most popular choices for photos:
- GIF - 86.4k
- PNG (RGB) - 349.9k
- WebP (not supported by Safari)
For photographic images, JPEG images are your best bet. Some sites aggressively compress their images, claiming that average users can’t see the artifacts produced by a 40% quality index. Even a 60% quality index will produce much smaller file sizes, with a visual result that is difficult to distinguish from the original.
WebP is a newer image format produced by Google, intended to combine the best aspects of both the PNG and JPEG formats. Because of this heritage, it was initially only supported by Chrome. Firefox and other browsers have since come on board, but Safari is still a notable holdout (including on iOS), so current estimates indicate that just under 80% of web users can view WebP images. There is a WebP module available to generate image derivatives in this format, but it also requires some changes to the .htaccess file to ensure WebP files will be served to browsers that support them.
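The kind of rewrite involved typically looks like the sketch below; the exact rules the WebP module expects are in its documentation, and the paths here are illustrative:

```apacheconf
<IfModule mod_rewrite.c>
  RewriteEngine On
  # If the browser advertises WebP support and a .webp sibling of the
  # requested JPEG/PNG exists on disk, serve that instead.
  RewriteCond %{HTTP_ACCEPT} image/webp
  RewriteCond %{REQUEST_FILENAME}.webp -f
  RewriteRule ^(.+)\.(jpe?g|png)$ $1.$2.webp [T=image/webp,L]
</IfModule>
```

The Accept-header check is what keeps Safari (and other non-supporting browsers) receiving the original JPEG or PNG.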
Much of the research on WebP suggests that typical file size savings are in the 20 to 30 per cent range, so don’t expect the savings in our test example for every image; it may be an extreme example of what’s possible.
Tip: Not every image falls squarely into one usage category. Choose what is best for your desired output.
Now that we have gone over the different types of images, let's talk optimization. There are a couple of ways to achieve this. My favourite involves using the Image Optimize module in Drupal.
If you’re able to install programs in your hosting environment, you can install one or more toolkits and then define a pipeline so that Drupal Image API will use the toolkits when generating a derivative image.
Check out the Image Optimize Binaries module for a list of the supported binaries. I’ve seen some research indicating potential optimization savings of over 60 per cent, but in my testing the results were typically in the order of seven to ten per cent. Still worth doing, but in my experience unlikely to make a huge difference.
If you’re using a host on which running one of the above toolkits locally will be a problem, the good news is that there’s a free online service you can use.
Tip: Use the Image Optimize reSmush.it module to pipe your site’s images through the reSmush.it service to be optimized on the fly. The downside is that you may notice a slight delay the first time an image is processed this way, but once created it will load at full speed, in all its optimized glory.
Another common issue that slows down sites is using images at a much bigger size than they need to be. Drupal core provides the ability to define Image Styles, so images can automatically be resized and even cropped to a specific proportion, ensuring the image downloaded is the same size as the image displayed. Anything extra forces your visitors to wait for pixels they’ll never see.
Now that people access websites on such a broad range of devices, the challenge with properly sizing images is having them optimized for each device. You want your site to look gorgeous on one of the increasingly popular large desktop screens, but also to have it load quickly on mobile devices. Fortunately, Drupal 8 has the answer in core: responsive images.
Within your theme or custom module, you can define your own custom set of breakpoints (or use the default set provided by the Responsive Image module). Then creating a responsive image style is as simple as telling Drupal what image style to use at different breakpoints.
Finally, set your display configuration to output your image as a responsive image, and choose the responsive image style you created. You’ll now be ready to optimize the delivery of your image for each kind of device.
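A custom breakpoint set is declared in a *.breakpoints.yml file in your theme. A minimal sketch (the theme name, labels, and widths are all illustrative -- use whatever breakpoints your design actually calls for):

```yaml
# mytheme.breakpoints.yml (theme name and widths are illustrative)
mytheme.mobile:
  label: Mobile
  mediaQuery: ''
  weight: 0
  multipliers:
    - 1x
mytheme.tablet:
  label: Tablet
  mediaQuery: 'all and (min-width: 768px)'
  weight: 1
  multipliers:
    - 1x
mytheme.desktop:
  label: Desktop
  mediaQuery: 'all and (min-width: 1200px)'
  weight: 2
  multipliers:
    - 1x
```

These breakpoints then appear as options when you build a responsive image style, where you map each one to an appropriately sized image style.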
One other powerful way to accelerate the use of images on your site is to “lazy load” them. Instead of loading them all with the page initially, this approach only loads images as they come into view.
A slight delay as you scroll often translates into big savings in the initial load time, and you’re much more likely to keep people on your site. The excellent Blazy module makes this easy to implement, and can even be used to lazy load responsive images.
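Blazy handles the implementation for you within Drupal; for hand-authored markup, most modern browsers also support native lazy loading via the loading attribute. A minimal sketch (the path and dimensions are illustrative):

```html
<!-- The browser defers fetching this image until it nears the viewport.
     Explicit width/height reserve layout space so content doesn't jump
     when the image finally loads. -->
<img src="/sites/default/files/styles/large/public/hero.jpg"
     width="1200" height="600" alt="Hero banner" loading="lazy">
```

Images above the fold should generally not be lazy loaded, since they're visible immediately and deferring them only delays the initial render.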
Content Delivery Network
We’ve talked about how to optimize the on-site assets themselves, but there’s another way to get these files delivered faster: use a better network.
A Content Delivery Network (CDN) maintains its own network of geographically diverse points of presence (PoPs). When you use a CDN, a request from a visitor to your site gets routed to their nearest PoP, so they can download the file from a server much closer than your hosting environment. Meanwhile, incoming requests from across the globe are distributed across a purpose-built, highly performant set of servers, instead of bogging down the single server (or virtual cloud host) you have for hosting.
The good news is that CDN adoption is becoming increasingly widespread, so popular Drupal hosts like Acquia and Pantheon include free and zero-configuration CDN delivery as part of some plans. Cloudflare, a prominent CDN service, even has a free plan available, and there are low-cost options available from competitors like KeyCDN and Google Cloud CDN. Some CDN providers can also provide automatic optimization of images and other assets, and even mobile optimization, though often as part of a premium plan.
Hopefully this instalment gives you everything you need to diagnose and resolve performance issues related to your on-site assets. In the next instalment, we’ll talk about working on issues with third-party assets and services.
Then we’ll wrap up our performance series by talking about some implementation considerations and how you can make sure performance is kept as a primary goal throughout development and post-launch planning.