Joomla Performance Tuning II: Basic Settings
In the first part of this series I described why tuning the performance of your site is something you should do for both philosophical and practical reasons, as well as where to start. That post was by necessity a bit generic. In the second part of this series we'll dive into some of the basic things you can do in Joomla to unlock a decent amount of performance.
Basic system settings
When building a site we get so caught up in the design and functionality that we forget that some very basic and fairly straightforward system settings can have a massive impact on the performance of our sites. A few simple switches in the Global Configuration before delivering the site and a few simple server checks can make all the difference in the world.
Most of the time spent on the server side of a site has to do with constructing the page that will be displayed to the visitor. Joomla, unlike WordPress, has a built-in caching system. I feel that people don't give it enough credit because they were used to the subpar caching experience in Joomla 1.0 and 1.5. That was 10 to 15 years ago.
Go to your site's Global Configuration and set caching to "ON - Progressive Caching". Progressive caching is the better implementation of Joomla's built-in caching: it makes sure that the output of each extension used to construct your page is cached individually. When a request comes through, the page is stitched together from those pieces of cached content whenever possible. This can even mitigate some of the performance lost to a pre-built, non-optimized template. It will definitely make your public, non-logged-in pages faster, which is exactly what matters most for your site's search engine ranking.
As for the caching back-end, most sites can get away with using file caching, which performs similarly to memcached or Redis on a decent commercial shared or virtualised host, while using far less memory and therefore being much cheaper to run. “Heresy!” the more technically minded among you cry. I agree with your sentiment, to an extent. If you have a truly massive or extremely busy site it makes sense to use a dedicated memcached or Redis server as your caching back-end. It will be faster. Chances are that if you're reading this you do not, in fact, have that kind of site; you're looking to speed up a much more run-of-the-mill site. Even my own business site falls into this category, despite the fact that we're getting traffic in the order of hundreds of thousands of unique monthly visitors. That should give you a sense of the site scale at which a dedicated caching server starts to pay off.
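For reference, these options end up in your site's configuration.php. The excerpt below is a sketch of what the relevant lines look like with progressive caching and the file handler; the values shown are illustrative and your file contains many more properties:

```php
<?php
class JConfig
{
	// Excerpt only, illustrative values.
	public $caching = 2;            // 0 = off, 1 = conservative, 2 = progressive caching
	public $cache_handler = 'file'; // file-based cache back-end
	public $cachetime = 15;         // cache lifetime, in minutes
}
```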
If there was a contest for the most-overlooked option in Joomla, Gzip Page Compression would win hands down. If you haven't already done so, go ahead and enable it.
This option makes sure that the HTML content sent by your site to the browser is compressed using the GZip (also called "deflate") algorithm, which substantially reduces the total size of the data transferred to the client. The time saved in data transfer has a significant impact on your site's performance.
Doesn't that slow down the site? No, not really. The HTML pages generated by Joomla are in the few-dozen-kilobyte range. Compressing a page to around a third to a half of that size takes a fraction of a millisecond. At the typical transfer speeds between a host and your visitors, the smaller payload saves dozens of milliseconds. You gain two to three orders of magnitude more time than you lose.
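You can get a feel for these numbers with a few lines of PHP. The markup below is synthetic and compresses far better than real pages, so treat the ratio as a best case; the timing is the interesting part:

```php
<?php
// Build a synthetic HTML page in the few-dozen-kilobyte range.
$html = str_repeat('<div class="item">Lorem ipsum dolor sit amet</div>' . "\n", 800);

$start      = microtime(true);
$compressed = gzencode($html, 6); // compression level 6 is a typical server default
$elapsedMs  = (microtime(true) - $start) * 1000;

printf("%d bytes -> %d bytes in %.3f ms\n", strlen($html), strlen($compressed), $elapsedMs);
```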
What about static files, though? Counter-intuitively, compressing those on the fly through PHP is a net loss. Explaining why requires talking about how servers deliver static and dynamic content. A properly set up web server caches frequently used static content in memory. Moreover, it uses advanced features of the Operating System, such as memory mapping of files. The result is very speedy delivery of static content.
When you use a PHP script to compress your static files instead, the web server has to hand the request over to the PHP executable. Even in the best case scenario (PHP FastCGI Process Manager, a.k.a. PHP-FPM, with a big enough pool of processes and PHP OPcache enabled) this wastes time on inter-process communication and resetting the PHP parser's state. The script needs to be confirmed as unchanged, its precompiled binary representation loaded and executed by the PHP binary; the static file needs to be opened, its contents compressed and handed to Apache to deliver to the client. All of that takes dozens of milliseconds. Unless you are compressing a file well over several hundred kilobytes in size, the time you lose is greater, by one or two orders of magnitude, than the time you gain by delivering a smaller, compressed file. It's a net loss.
I strongly recommend doing this through your web server itself. If you are using Apache you can add the following to your .htaccess file:
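What exactly goes into that block depends on your server; a minimal sketch for Apache with mod_deflate enabled could look like the following, where the list of MIME types is illustrative and can be extended as needed:

```apache
<IfModule mod_deflate.c>
	# Compress text-based resources on the fly before sending them to the browser.
	AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css text/javascript
	AddOutputFilterByType DEFLATE application/javascript application/json application/xml
	AddOutputFilterByType DEFLATE image/svg+xml
</IfModule>
```

Binary formats such as JPEG, WebP, or WOFF2 fonts are already compressed; running them through DEFLATE again wastes CPU for no gain, which is why they are deliberately absent from the list.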
Your web server is much faster at compressing static media files, and it can keep the compressed files cached in memory for speedier delivery the next time around. It can also serve files you have compressed ahead of time. Take, for example, the file media/com_example/js/something.min.js. Compress it with GZip into media/com_example/js/something.min.js.gz. When a browser requests media/com_example/js/something.min.js, the web server checks the request's Accept-Encoding HTTP header to see if the browser supports GZip-compressed resources. If it does, the server delivers the media/com_example/js/something.min.js.gz file instead of the regular, uncompressed one.
The prerequisite is that you rename the htaccess.txt file shipped with Joomla to .htaccess. Alternatively, if you are managing your own .htaccess file, make sure that the following code is included in it:
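A sketch of such a block, assuming Apache with mod_rewrite and mod_headers enabled, and assuming you have created the .gz files next to the originals (for example with gzip -k):

```apache
<IfModule mod_rewrite.c>
	RewriteEngine On
	# If the browser accepts gzip and a precompressed copy exists, serve it instead.
	RewriteCond %{HTTP:Accept-Encoding} gzip
	RewriteCond %{REQUEST_FILENAME}.gz -f
	RewriteRule ^(.+)\.css$ $1.css.gz [QSA,T=text/css,E=no-gzip:1,L]
	RewriteCond %{HTTP:Accept-Encoding} gzip
	RewriteCond %{REQUEST_FILENAME}.gz -f
	RewriteRule ^(.+)\.js$ $1.js.gz [QSA,T=text/javascript,E=no-gzip:1,L]
</IfModule>
<IfModule mod_headers.c>
	# Tell the browser the payload is already gzip-compressed and vary caches on it.
	<FilesMatch "\.(css|js)\.gz$">
		Header set Content-Encoding gzip
		Header append Vary Accept-Encoding
	</FilesMatch>
</IfModule>
```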
Static media caching
Reducing the size of the static media with compression is half the battle and matters primarily for first time visitors. When someone comes back to your site it makes sense for their browser to serve static media from the browser's cache, without hitting the network at all. If you're using Apache you can use the following code in your .htaccess file:
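A sketch assuming Apache with mod_expires enabled; the lifetimes are examples, so tune them to how often your media actually changes:

```apache
<IfModule mod_expires.c>
	ExpiresActive On
	# Let returning visitors reuse static media from their browser cache.
	ExpiresByType image/jpeg "access plus 1 month"
	ExpiresByType image/png "access plus 1 month"
	ExpiresByType image/webp "access plus 1 month"
	ExpiresByType text/css "access plus 1 week"
	ExpiresByType application/javascript "access plus 1 week"
	ExpiresByType font/woff2 "access plus 1 month"
</IfModule>
```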
Long cache lifetimes raise the question of what happens when a file does change. Joomla already takes care of this for its own assets by appending a media version query string to them, for example:
<link href="/media/plg_system_webauthn/css/button.min.css?f15d039055248502c1a41bc99a31c0f3" rel="stylesheet">
For static media you add yourself, something as simple as ?20211205111300 (a question mark followed by the year, month, day, hour, minutes and seconds at the time you last changed the file) is more than adequate. Changing the query string makes browsers treat the file as a new resource and download it afresh.
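If you are loading your own CSS from a template or an override, Joomla can append its media version for you. This is a sketch assuming a Joomla 4 template; the file path is hypothetical:

```php
<?php
use Joomla\CMS\HTML\HTMLHelper;

// 'version' => 'auto' appends Joomla's media version query string for cache busting;
// 'relative' => true makes Joomla look the file up under the media/ and template folders.
HTMLHelper::_('stylesheet', 'custom/my-overrides.css', ['version' => 'auto', 'relative' => true]);
```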
HTTPS and HSTS
There's a common misconception that HTTPS is only about securing your site, that it's expensive and slow, and that you don't really need it unless you're doing e-commerce or something similar. These myths originated in the late 1990s. More than two decades later, they are patently false.
HTTPS is pretty much mandatory these days. If you don't use HTTPS your site will appear with a big, red warning telling your visitors it is insecure, scaring visitors away. It will be penalized by search engines. You should use HTTPS if only to fix these two problems. You don't even need to break the piggy bank. TLS certificates are now free of charge thanks to Let's Encrypt. Most hosting control panels integrate with Let's Encrypt, meaning that you can literally have your hosting control panel issue and install a free TLS certificate and auto-renew it. There is zero maintenance on your part. HTTPS is also super–fast since any modern CPU, released over the past ten-odd years, has hardware acceleration for the cryptographic operations it uses.
While you're at it, remember to set “Force HTTPS to Entire Site” in your Global Configuration. This ensures that your Joomla site will always be delivered over HTTPS, making logins more secure in the process. Once you do that, and you've confirmed HTTPS works great with your site, add the following to your .htaccess:
<IfModule mod_headers.c>
	Header always set Strict-Transport-Security "max-age=31536000" env=HTTPS
</IfModule>
This enables a feature called HSTS (HTTP Strict Transport Security). In short, it tells the browser to never even try to connect to the plain HTTP version of your site, regardless of what your visitor tells it to do. Since this happens on the browser side, a visitor who types your domain name in the address bar without the https:// prefix, or clicks a link with the http:// prefix, will always get to the HTTPS version of your site without first visiting the plain HTTP version and being redirected by Joomla. This is much faster, especially on high-latency connections such as mobile or satellite Internet.
A further optimization you can do is submit your site to the HSTS Preload List. While HSTS only works after the first time someone visits your site, having your site in the HSTS Preload List means that the browser knows about your site using HSTS before the first time your visitor visits it. Therefore, the browser will never attempt to load it over plain HTTP. Again, this is a time saver for high-latency connections, easy and free. What's not to love about it?
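Note that the preload list has stricter requirements than the basic header shown earlier: the max-age must be at least one year, and the includeSubDomains and preload directives must be present. An Apache sketch follows; only submit your site once you are sure every subdomain works over HTTPS, because preloading is hard to undo:

```apache
<IfModule mod_headers.c>
	# Preload-eligible HSTS header: one-year minimum max-age, subdomains included.
	Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" env=HTTPS
</IfModule>
```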
HTTP/2 Server Push
In the past, when talking about making Joomla faster, I used to tell people to enable HTTP/2 Server Push. However, the Google Chrome developers have proposed removing support for it and have stated that it will not be implemented for the HTTP/3 protocol at all. Therefore, my current advice is to not bother with it.
To be continued
This is part two of a five–part series. Part III: Static media optimisation will be made available in the January 2022 issue of the Joomla Community magazine.
German translation of this article: https://www.jug-zueri.ch/artikel/performance-tuning-in-joomla-teil-2-grundeinstellungen
Spanish translation of this article: https://mejorconjoomla.com/noticias/magazine/puesta-a-punto-del-rendimiento-de-joomla-4-ajustes-basicos
Copyright © 2020–2021 Nikolaos Dionysopoulos. All legal rights reserved.
Not sure if anyone reads a comment on an almost year-old article, but you never know.
Recently, I came across information stating that compression of dynamically generated content is unsafe and vulnerable to CRIME and BREACH attacks. I guess this vulnerability only applies to pages containing sensitive content, but still I'm curious if Nicholas has ideas about this, in relation to the passage on "HTML compression"?
No, I would not turn off compression at the server level to "mitigate" a problem which was solved at the TLS protocol implementation level ten years ago, and definitively in TLS 1.3 (the current TLS version since 2018).
If you want to be paranoid, did you know that there are proven attacks and papers published about analysing keystrokes by deauthing Bluetooth keyboards, measuring the EM fields from unshielded (i.e. most if not practically all) USB keyboard cables, analysing typing sounds, or using infrared photography to analyse the heat map of keys (apparently the ABS keycaps used in most keyboards are far easier to analyse for thermal mapping than PBT keycaps)? Most of the same applies to mouse input. The reflection of your screen on your glasses during an on-line meeting can be analysed to get a view of your screen at roughly 600x400 pixels resolution. The sub-mW variation of your computer's power consumption can be used to infer what you are doing, or to exfiltrate data from air-gapped computers.
If you consider all possible attacks against a computer you will arrive at the conclusion made as a quip by a former Director of the CIA: a computer is secure when it's shredded, melted, the slag ground to dust and buried six feet under in an undisclosed location. But we're not all using our computers to handle Top Secret material, the disclosure of which might be detrimental to our nation's security or result in a thermonuclear war that wipes out humanity. So let's not join the tinfoil hat brigade by overreacting to what is possible versus what is plausible.
Speaking of possibility versus plausibility, let's talk about the first rule of security: update everything, yesterday. All of these attacks are well-known and already mitigated in server software and common software libraries. As long as you use an up-to-date server environment which uses currently maintained software at their latest patch level you can safely use compression and you are fairly secure in the sense that you have been protected against known attack vectors. Sure, you may have a security issue nobody has discovered yet or at least one that's not been publicised yet, but there's sod all you can do about it beyond standard security practices which are mostly useful to help you do a post-mortem after a successful attack. As long as your clients are using an up-to-date browser on a maintained and up-to-date Operating System without insecure / compromised browser extensions they will be safe on their end as well. If either end is still using out-of-date software one or both parties will eventually have a very bad day.
Thanks for your quick and comprehensive reply. Very much appreciated. To be honest, I didn't even know about this alleged security issue until a colleague brought it up a few days ago. The assumption was that it could be bad for search engine ranking, of which I haven't been able to find proof as of yet. At least I know what to say if anyone ever brings it up again, although it will be quite difficult to reproduce this elaborate report by heart.
The simplest answer you can give to your colleague is this: “This was a problem in 2012 and has been fixed since then — that's over ten years ago. If our servers are running software which is ten or more years out of date, this would be the least of our worries”.