
Joomla Performance Tuning II: Basic Settings


In the first part of this series I described why tuning the performance of your site is something you should do for both philosophical and practical reasons, as well as where to start. That post was by necessity a bit generic. In the second part of this series we'll dive into some of the basic things you can do in Joomla to unlock a decent amount of performance.

Basic system settings

When building a site we get so caught up in the design and functionality that we forget that some very basic and fairly straightforward system settings can have a massive impact on the performance of our sites. A few simple switches in the Global Configuration before delivering the site and a few simple server checks can make all the difference in the world.

Caching

Most of the time spent on the server side of a site has to do with constructing the page that will be displayed to the visitor. Joomla, unlike WordPress, has a built-in caching system. I feel that people don't give it enough credit because they remember the subpar caching experience of Joomla 1.0 and 1.5. That was 10 to 15 years ago.

Go to your site's Global Configuration and set caching to "ON - Progressive Caching". Progressive caching is the better implementation of Joomla's built-in caching: the output of each extension used to construct your page is cached individually. When a request comes through, the page is stitched together from those pieces of cached content whenever possible. This can even mitigate some of the performance lost to a pre–built, non–optimized template. It will definitely make your public, non–logged-in pages faster — exactly what is most relevant for your site's search engine ranking.

As for the caching back-end, most sites can get away with file caching, which on a decent commercial shared or virtualised host performs about as well as memcached or Redis — with much less memory use, and therefore much cheaper to run. “Heresy!” the more technically minded among you cry. I agree with your sentiment, to an extent. If you have a truly massive or extremely busy site it makes sense to use a dedicated memcached or Redis server as your caching backend. It will be faster. Chances are that if you're reading this you do not, in fact, have this kind of site; you're looking at speeding up a much more run–of–the–mill site. Even my own business site falls into this category, despite the fact that we're getting traffic in the order of hundreds of thousands of unique monthly visitors. That should give you a sense of the site scale that would actually benefit from a dedicated caching server.
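For reference, these choices end up as plain properties in your site's configuration.php file. Below is a minimal, illustrative excerpt of what the relevant lines look like with progressive caching and the file handler; the exact values on your own site may differ:

// Excerpt from configuration.php — illustrative values only
class JConfig
{
    // 0 = no caching, 1 = conservative caching, 2 = progressive caching
    public $caching = 2;

    // 'file' stores cached data on disk; 'memcached' or 'redis' require a dedicated caching server
    public $cache_handler = 'file';

    // Cache lifetime, in minutes
    public $cachetime = 15;
}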

HTML compression

If there was a contest for the most-overlooked option in Joomla, Gzip Page Compression would win hands down. If you haven't already done so, go ahead and enable it.

This option makes sure that the HTML content sent by your site to the browser is compressed using the GZip (also called "deflate") algorithm. This substantially reduces the total size of the data transferred to the client. The amount of time saved in data transfer has a significant impact on your site's performance.

Does it not slow down the site? No, not really. The HTML pages generated by Joomla are in the few dozens of kilobytes size range. It takes a fraction of a millisecond to compress them down to around a third to a half of that size. With the typical transfer speeds between a host and your visitors this translates to a couple of milliseconds of time gained. You gain two to three orders of magnitude more time than you lose.
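If you want to see the trade-off on your own content, a quick, throwaway PHP snippet along these lines measures how long the compression itself takes and how much smaller the output gets. The URL is only a placeholder — point it at any page of your own site:

<?php
// Throwaway benchmark: how long does it take to GZip-compress a typical HTML page,
// and how much smaller does it get? The URL below is a placeholder.
$html = file_get_contents('https://www.example.com/');

$start      = hrtime(true);
$compressed = gzencode($html, 6); // compression level 6 is a common default
$elapsedMs  = (hrtime(true) - $start) / 1e6;

printf(
    "Original: %d bytes, compressed: %d bytes (%.0f%% of original), time: %.3f ms\n",
    strlen($html),
    strlen($compressed),
    100 * strlen($compressed) / strlen($html),
    $elapsedMs
);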

JavaScript and CSS compression

Many templates and third party plugins purport to save time by compressing your static files (JavaScript and CSS) on–the–fly. I strongly recommend that you do not use this kind of feature. While compressing the static files does save transmission time, you end up with a net performance loss.

The reason for this counter–intuitive result lies in how servers deliver static and dynamic content. A properly set up web server caches frequently used static content in memory. Moreover, it uses advanced features of the Operating System, such as memory mapping of files. These result in very speedy delivery of static content.

When you use a PHP script to compress your static files, the web server has to hand over the request to the PHP executable. Even in the best case scenario (PHP FastCGI Process Manager a.k.a. PHP-FPM, with a big enough pool of processes and PHP OPcache enabled) this wastes time on inter–process communication and on resetting the PHP parser's state. The script needs to be confirmed as unchanged, its precompiled binary representation loaded and executed by the PHP binary, the static file opened, its contents compressed and sent to Apache to deliver to the client. All of that takes dozens of milliseconds. Unless you are compressing a file well over several hundred kilobytes in size, the amount of time you lose is far greater — by one or two orders of magnitude! — than the amount of time gained by delivering a smaller, compressed file. Therefore, it's a net loss.

I strongly recommend doing this through your web server itself. If you are using Apache you can add the following to your .htaccess file:

<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/plain text/xml text/css application/xml application/xhtml+xml application/rss+xml application/javascript application/x-javascript image/svg+xml
</IfModule>

<IfModule mod_gzip.c>
 mod_gzip_on Yes
 mod_gzip_dechunk Yes
 mod_gzip_keep_workfiles No
 mod_gzip_can_negotiate Yes
 mod_gzip_add_header_count Yes
 mod_gzip_send_vary Yes
 mod_gzip_min_http 1000
 mod_gzip_minimum_file_size 300
 mod_gzip_maximum_file_size 512000
 mod_gzip_maximum_inmem_size 60000
 mod_gzip_handle_methods GET
 mod_gzip_item_include file \.(html?|txt|css|js|php|pl|xml|rb|py|svg|scgz)$
 mod_gzip_item_include mime ^text/plain$
 mod_gzip_item_include mime ^text/xml$
 mod_gzip_item_include mime ^text/css$
 mod_gzip_item_include mime ^application/xml$
 mod_gzip_item_include mime ^application/xhtml+xml$
 mod_gzip_item_include mime ^application/rss+xml$
 mod_gzip_item_include mime ^application/javascript$
 mod_gzip_item_include mime ^application/x-javascript$
 mod_gzip_item_include mime ^image/svg+xml$
 mod_gzip_item_exclude rspheader ^Content-Encoding:.*gzip.*
 mod_gzip_item_include handler ^cgi-script$
 mod_gzip_item_include handler ^server-status$
 mod_gzip_item_include handler ^server-info$
 mod_gzip_item_include handler ^application/x-httpd-php
 mod_gzip_item_exclude mime ^image/.*
</IfModule>

Your web server is much faster at compressing static files. It can also keep the compressed content cached in memory for speedier delivery next time around.

Joomla 4 also ups the ante by allowing you to compress your static files ahead of time with GZip, delivering the pre–compressed files instead of having your web server compress them on demand. It works like this. Let's say you have the JavaScript file media/com_example/js/something.min.js. Compress it with GZip into media/com_example/js/something.min.js.gz. When a browser requests the file media/com_example/js/something.min.js the web server will check its Accept-Encoding HTTP header to see if it supports GZip–compressed resources. If it does, the server will deliver the media/com_example/js/something.min.js.gz file instead of the regular, uncompressed media/com_example/js/something.min.js file.

The prerequisite to that is that you rename the htaccess.txt file shipped with Joomla to .htaccess. Alternatively, if you are managing your own .htaccess file, make sure that the following code is included in your file:

## These directives are only enabled if the Apache mod_headers module is enabled.
## This section will check if a .gz file exists and if so will stream it
##     directly or fallback to gzip any asset on the fly
## If your site starts to look strange after enabling this, and you see
##     ERR_CONTENT_DECODING_FAILED in your browser console network tab,
##     then your server is already gzipping css and js files and you don't need this
##     block enabled in your .htaccess
<IfModule mod_headers.c>
        # Serve gzip compressed CSS files if they exist
        # and the client accepts gzip.
        RewriteCond "%{HTTP:Accept-encoding}" "gzip"
        RewriteCond "%{REQUEST_FILENAME}\.gz" -s
        RewriteRule "^(.*)\.css" "$1\.css\.gz" [QSA]

        # Serve gzip compressed JS files if they exist
        # and the client accepts gzip.
        RewriteCond "%{HTTP:Accept-encoding}" "gzip"
        RewriteCond "%{REQUEST_FILENAME}\.gz" -s
        RewriteRule "^(.*)\.js" "$1\.js\.gz" [QSA]

        # Serve correct content types, and prevent mod_deflate double gzip.
        RewriteRule "\.css\.gz$" "-" [T=text/css,E=no-gzip:1]
        RewriteRule "\.js\.gz$" "-" [T=text/javascript,E=no-gzip:1]

        <FilesMatch "(\.js\.gz|\.css\.gz)$">
                # Serve correct encoding type.
                Header append Content-Encoding gzip

                # Force proxies to cache gzipped &
                # non-gzipped css/js files separately.
                Header append Vary Accept-Encoding
	</FilesMatch>
</IfModule>

Joomla 4 provides the pre–compressed .gz files for all of its static JavaScript and CSS files in its distribution. Enabling this .htaccess trick will make your site even faster with minimal effort. Awesome, isn't it?!
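Joomla takes care of its own files, but your template or third party extensions may not ship .gz copies of theirs. If you want to pre-compress those too, a small, hypothetical helper script along the following lines — run once after every deployment, with $root adjusted to the folder holding your static files — would do the job:

<?php
// Hypothetical helper: create .gz siblings for every .js and .css file under $root.
// Adjust $root to your own site's path before running.
$root = __DIR__ . '/media';

$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($root, FilesystemIterator::SKIP_DOTS)
);

foreach ($files as $file) {
    if (!in_array($file->getExtension(), ['js', 'css'], true)) {
        continue;
    }

    $gzFile = $file->getPathname() . '.gz';

    // Skip files whose .gz copy is already newer than the source.
    if (is_file($gzFile) && filemtime($gzFile) >= $file->getMTime()) {
        continue;
    }

    file_put_contents($gzFile, gzencode(file_get_contents($file->getPathname()), 9));
}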

Static media caching

Reducing the size of the static media with compression is half the battle and matters primarily for first time visitors. When someone comes back to your site it makes sense for their browser to serve static media from the browser's cache, without hitting the network at all. If you're using Apache you can use the following code in your .htaccess file:

<IfModule mod_expires.c>
 # Enable expiration control
 ExpiresActive On

# CSS and JS expiration: 1 year after request
 ExpiresByType text/css "now plus 1 year"
 ExpiresByType application/javascript "now plus 1 year"
 ExpiresByType application/x-javascript "now plus 1 year"

# Image files expiration: 1 month after request
 ExpiresByType image/bmp "now plus 1 month"
 ExpiresByType image/gif "now plus 1 month"
 ExpiresByType image/jpeg "now plus 1 month"
 ExpiresByType image/jp2 "now plus 1 month"
 ExpiresByType image/pipeg "now plus 1 month"
 ExpiresByType image/png "now plus 1 month"
 ExpiresByType image/svg+xml "now plus 1 month"
 ExpiresByType image/tiff "now plus 1 month"
 ExpiresByType image/vnd.microsoft.icon "now plus 1 month"
 ExpiresByType image/x-icon "now plus 1 month"
 ExpiresByType image/ico "now plus 1 month"
 ExpiresByType image/icon "now plus 1 month"
 ExpiresByType image/webp "now plus 1 month"
 ExpiresByType text/ico "now plus 1 month"
 ExpiresByType application/ico "now plus 1 month"
 ExpiresByType image/vnd.wap.wbmp "now plus 1 month"
 ExpiresByType application/vnd.wap.wbxml "now plus 1 month"
 ExpiresByType application/smil "now plus 1 month"
 
 # Font files expiration: 1 week after request
 ExpiresByType application/vnd.ms-fontobject "now plus 1 week"
 ExpiresByType application/x-font-ttf "now plus 1 week"
 ExpiresByType application/x-font-opentype "now plus 1 week"
 ExpiresByType application/x-font-woff "now plus 1 week"
 ExpiresByType font/woff2 "now plus 1 week"
 ExpiresByType image/svg+xml "now plus 1 week"

# Audio files expiration: 1 month after request
 ExpiresByType audio/ogg "now plus 1 month"
 ExpiresByType application/ogg "now plus 1 month"
 ExpiresByType audio/basic "now plus 1 month"
 ExpiresByType audio/mid "now plus 1 month"
 ExpiresByType audio/midi "now plus 1 month"
 ExpiresByType audio/mpeg "now plus 1 month"
 ExpiresByType audio/mp3 "now plus 1 month"
 ExpiresByType audio/x-aiff "now plus 1 month"
 ExpiresByType audio/x-mpegurl "now plus 1 month"
 ExpiresByType audio/x-pn-realaudio "now plus 1 month"
 ExpiresByType audio/x-wav "now plus 1 month"

# Movie files expiration: 1 month after request
 ExpiresByType application/x-shockwave-flash "now plus 1 month"
 ExpiresByType x-world/x-vrml "now plus 1 month"
 ExpiresByType video/x-msvideo "now plus 1 month"
 ExpiresByType video/mpeg "now plus 1 month"
 ExpiresByType video/mp4 "now plus 1 month"
 ExpiresByType video/quicktime "now plus 1 month"
 ExpiresByType video/x-la-asf "now plus 1 month"
 ExpiresByType video/x-ms-asf "now plus 1 month"
</IfModule>

A reasonable question is what happens if you update Joomla and/or third party extensions. The static files — JavaScript, CSS, images, ... — change as part of the update. We don't want the browser to use the old file. At best the site will look weird, at worst it will be broken for the visitor. That's where media versioning with query parameters comes into play. If you look at your site's source code you will see lines like this:

<link href="/media/plg_system_webauthn/css/button.min.css?f15d039055248502c1a41bc99a31c0f3" rel="stylesheet">

That ?f15d039055248502c1a41bc99a31c0f3 is called a media versioning query. The stuff after the question mark doesn't matter, as long as it changes every time the static file changes. Joomla — and correctly written third party extensions — do that automatically for CSS and JavaScript files. If you include other static content in your articles, such as images or videos, remember to add a versioning query yourself. Something as simple as ?20211205111300 (a question mark followed by the year, month, day, hour, minutes and seconds at the time you wrote that query) is more than adequate.
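In your own template overrides you can even generate such a query automatically instead of typing it by hand. Here is a minimal, hypothetical sketch that uses the image file's last modification time as the version — any scheme works, as long as the value changes whenever the file does:

<?php
// Hypothetical template override snippet: version an image by its modification time,
// so returning visitors only re-download it after the file actually changes.
$relativePath = 'images/headers/my-banner.jpg'; // example path — use your own image
$version      = is_file(JPATH_ROOT . '/' . $relativePath)
    ? filemtime(JPATH_ROOT . '/' . $relativePath)
    : '';
?>
<img src="/<?php echo $relativePath . ($version ? '?' . $version : ''); ?>" alt="Banner">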

HTTPS and HSTS

There's a common misconception that HTTPS is only about securing your site, that it's expensive, and that you don't really need it unless you're doing e-commerce or something. Another misconception is that it makes your site slower.

These myths originated in the late 1990s. Over two decades later they are patently false.

HTTPS is pretty much mandatory these days. If you don't use HTTPS, browsers will show your visitors a big, red warning that your site is insecure, scaring them away. It will also be penalized by search engines. You should use HTTPS if only to fix these two problems. You don't even need to break the piggy bank: TLS certificates are now free of charge thanks to Let's Encrypt. Most hosting control panels integrate with Let's Encrypt, meaning that you can literally have your hosting control panel issue, install and auto-renew a free TLS certificate. There is zero maintenance on your part. HTTPS is also super–fast, since any modern CPU released over the past ten-odd years has hardware acceleration for the cryptographic operations it uses.

While you're at it, remember to set “Force HTTPS” to “Entire Site” in your Global Configuration. This ensures that your Joomla site will always be delivered over HTTPS, making logins more secure in the process. Once you've done that, and confirmed HTTPS works great with your site, add the following to your .htaccess:

<IfModule mod_headers.c>
 Header always set Strict-Transport-Security "max-age=31536000" env=HTTPS
</IfModule>

This enables a feature called HSTS (HTTP Strict Transport Security). In short, it tells your browser to never even try to connect to the HTTP version of your site, regardless of what your visitor tells it to do. Since this happens on the browser side a visitor who types your domain name in the address bar without the https:// prefix, or clicks a link with the http:// prefix, will always get to the HTTPS version of your site without having to first visit the plain HTTP version and get redirected by Joomla. This is much faster, especially on high-latency connections such as mobile or satellite Internet.
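If your host does not let you modify the .htaccess file, the same header can also be sent from PHP as a fallback — for instance from a tiny system plugin. A minimal sketch of the relevant call, assuming the request arrived over HTTPS:

<?php
// Fallback sketch: emit the HSTS header from PHP when the current request is over HTTPS.
// In Joomla this would typically live in a small system plugin, e.g. in onAfterInitialise.
if (!empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off') {
    header('Strict-Transport-Security: max-age=31536000');
}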

A further optimization you can do is submit your site to the HSTS Preload List. While HSTS only works after the first time someone visits your site, having your site in the HSTS Preload List means that the browser knows about your site using HSTS before the first time your visitor visits it. Therefore, the browser will never attempt to load it over plain HTTP. Again, this is a time saver for high-latency connections, easy and free. What's not to love about it?

HTTP/2 Server Push

When talking about making Joomla faster in the past I used to tell people how to enable HTTP/2 Server Push. However, the Google Chrome developers have proposed removing support for it and have stated that it will not be implemented for the HTTP/3 protocol at all. Therefore, my current advice is to not even bother with it.

To be continued

This is part two of a five–part series. Part III: Static media optimisation will be made available in the January 2022 issue of the Joomla Community magazine.


German translation of this article: https://www.jug-zueri.ch/artikel/performance-tuning-in-joomla-teil-2-grundeinstellungen

Spanish translation of this article: https://mejorconjoomla.com/noticias/magazine/puesta-a-punto-del-rendimiento-de-joomla-4-ajustes-basicos

Copyright

Copyright © 2020–2021 Nikolaos Dionysopoulos. All legal rights reserved.

Comments 7

Maurice Molenaar on Tuesday, 21 December 2021 09:31
Great!

Thanks Nicholas, great article!

George Ploumakis on Tuesday, 21 December 2021 10:38
Great as always

Thank you for your remarkable, high level, vital and long lasting contribution to Joomla community.

Alan N on Wednesday, 20 April 2022 07:29
A must read article

A super useful article Nicholas - thank you!

Pieter-Jan de Vries on Saturday, 15 October 2022 10:52
Is HTML compression insecure?

Not sure if anyone reads a comment on an almost year old article, but you never know.

Recently, I came across information stating that compression of dynamically generated content is unsafe and vulnerable to CRIME and BREACH attacks. I guess this vulnerability only applies to pages containing sensitive content, but still I'm curious if Nicholas has ideas about this, in relation to the passage on "HTML compression"?

Nicholas K. Dionysopoulos on Saturday, 15 October 2022 13:56
Let's not get too paranoid here

Instead of reading the oversimplified version in Wikipedia do read the actual CVE https://www.cve.org/CVERecord?id=CVE-2012-4929 This was an implementation issue in TLS 1.2 and earlier. So, it requires an obsolete version of TLS, one which has a bug which was reported and fixed ten years ago (note that the bug requirement is why nginx was not affected), and a man-in-the-middle attack launching several thousands of requests with very good timing measurement (which precludes JavaScript and browser extensions in modern browsers). BREACH requires an unpatched gzip library from 9 years ago. I would argue that if you are using something which is failing to meet security best practices, has not been updated in 10 years and are on a compromised network the CRIME and BREACH attacks are the least of your worries.

No, I would not turn off compression at the server level to "mitigate" a problem which has been solved at the TLS protocol implementation ten years ago and definitely in TLS 1.3 (the current TLS version since 2018).

If you want to be paranoid, did you know that there are proven attacks and papers published about analysing keystrokes from deauthing Bluetooth keyboards, measuring the EM fields from unshielded (i.e. most if not practically all) USB keyboard cables, analysing typing sounds or using infrared photography to analyse the heat map of keys (apparently ABS keys used in most keyboards are far easier to analyse for thermal mapping than PBT keycaps)? Most of the same apply for mouse input. The reflection of your screen on your glasses when having an on-line meeting can be analysed to get a view of your screen at roughly 600x400 pixels resolution. The sub-mW variation of your computer's power consumption can be used to infer what you are doing or to exfiltrate data from air gapped computers. If you consider all possible attacks against a computer you will arrive at the conclusion made as a quip by a former Director of the CIA: a computer is secure when it's shredded, melted, the slug made to dust and buried 6 feet under in an undisclosed location. But we're not all using our computers to handle Top Secret material, the disclosure of which might be detrimental to our nation's security or result in a thermonuclear war that wipes out humanity. So let's not join the tinfoil hat brigade by overreacting to what is possible versus what is plausible.

Speaking of possibility versus plausibility, let's talk about the first rule of security: update everything, yesterday. All of these attacks are well-known and already mitigated in server software and common software libraries. As long as you use an up-to-date server environment which uses currently maintained software at their latest patch level you can safely use compression and you are fairly secure in the sense that you have been protected against known attack vectors. Sure, you may have a security issue nobody has discovered yet or at least one that's not been publicised yet, but there's sod all you can do about it beyond standard security practices which are mostly useful to help you do a post-mortem after a successful attack. As long as your clients are using an up-to-date browser on a maintained and up-to-date Operating System without insecure / compromised browser extensions they will be safe on their end as well. If either end is still using out-of-date software one or both parties will eventually have a very bad day.

Pieter-Jan de Vries on Saturday, 15 October 2022 15:39
I stand corrected

Thanks for your quick and comprehensive reply. Very much appreciated. To be honest, I didn't even know about this alleged security issue, until a colleague brought it up a few days ago. The assumption was it could be bad for search engine ranking, of which I haven't been able to find proof as of yet. At least I know what to say if anyone else ever brings it up again, although it will be quite difficult to reproduce this elaborate report by heart

Nicholas K. Dionysopoulos on Saturday, 15 October 2022 21:17
Simple answer

The simplest answer you can give to your colleague is this: “This was a problem in 2012 and has been fixed since then — that's over ten years ago. If our servers are running software which is ten or more years out of date, this would be the least of our worries”.

