Improving website performance – 10 tips
November 11, 2013 | palepurple | performance, php
While desktop users are generally getting faster and faster internet connections, it’s still the case that optimising a page’s generation and delivery can lead to a significantly better user experience. This is especially the case for mobile users – who are often on a relatively low speed and/or high latency connection.
With this in mind, here are 10 tips which should help improve real or perceived website performance on a Linux/Apache/MySQL/PHP (LAMP) based website –
- Enable deflate/output compression within Apache – less data being transmitted to the client will reduce page load time.
- Enable a PHP op-code cache (e.g. XCache) – this will generally reduce the time taken to generate a page on the server, as PHP will not need to recompile the source code on each request.
- Reduce the number and size of requests to the server (e.g. merging JavaScript or CSS files together, CSS image spriting or serving correctly sized images). Setting up and issuing new HTTP requests takes time – especially on a high-latency, relatively slow mobile connection. See also Google’s mod_pagespeed.
- Implement server-side application caching where appropriate (ideally using something like Xdebug to identify bottlenecks within the application first). Expensive operations (like a complex database query or a call to a remote REST service) can be cached locally, or perhaps even pre-populated. Your bottlenecks may be intermittent – such as a poorly configured database affecting overall performance while a specific action is being carried out by another user.
- Ensure you have sufficient server capacity (memory, CPU, disk bandwidth) – all of the above steps will probably pale in comparison to a server swapping (low on RAM). Using tools like Munin may help identify resource issues.
- Minify CSS and JavaScript assets (for example using YUI Compressor) – this will reduce the payload size even if you’re using output compression. You may need some sort of ‘build’ script (Ant, Phing, make etc.) to ensure the minified assets are up to date when deploying and updating the site.
- Offload images / CSS / JavaScript onto separate domains (or perhaps use a content delivery network) to maximise the number of parallel requests the client browser can make. Most browsers will only issue a handful of concurrent requests to a single domain name, so spreading assets across multiple domain names works around this limit and lets content be retrieved more quickly. Correctly configured HTTP expiry headers can also reduce the number of requests made by return visitors.
- Use a job queue to reduce blocking within the user interface – can a lengthy task be run ‘soon’ (and not ‘right now’)?
- Keep CSS near the top of the page and move JS towards the bottom where possible, and try to make JS non-blocking. This should stop script loading from blocking the rendering of the content.
- Assuming you’re using PHP – use the latest version you can – there have been significant performance improvements in each recent major release (5.3, 5.4 and 5.5).
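For the first tip, a minimal mod_deflate fragment might look like the following (assuming the module is already enabled – e.g. a2enmod deflate on Debian/Ubuntu – and that the vhost or .htaccess allows it):

```apache
# Compress the text-based content types; images are already compressed.
AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript application/json
# Some ancient browsers claim gzip support they don't really have:
BrowserMatch ^Mozilla/4 gzip-only-text/html
```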
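Enabling XCache is mostly a php.ini change once the extension is installed; the sizes below are illustrative starting points, not tuned values:

```ini
; Hypothetical php.ini fragment -- adjust sizes to your workload.
extension = xcache.so
xcache.size = 64M      ; shared memory for compiled op-codes
xcache.count = 2       ; number of cache segments, often one per CPU core
xcache.var_size = 16M  ; optional variable/data cache
```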
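The expiry-header suggestion maps onto mod_expires; the lifetimes below are examples only and assume the module is enabled (a2enmod expires):

```apache
ExpiresActive On
ExpiresByType image/png               "access plus 1 month"
ExpiresByType text/css                "access plus 1 week"
ExpiresByType application/javascript  "access plus 1 week"
```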
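For the script-placement tip, a page skeleton might look like this (file names are illustrative):

```html
<!-- CSS stays in the head so the page renders styled from the start;
     scripts sit at the bottom and/or use defer so they don't block parsing. -->
<head>
  <link rel="stylesheet" href="combined.min.css">
</head>
<body>
  <!-- page content -->
  <script src="combined.min.js" defer></script>
</body>
```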
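The request-merging tip can be as simple as a concatenation step in your build; the file names below are stand-ins, a real build would list the site’s actual scripts in dependency order:

```shell
# Stand-in asset files so the example is self-contained.
printf 'var lib = {};\n' > lib.js
printf 'var app = {};\n' > app.js
# One combined file means one HTTP request instead of two.
cat lib.js app.js > combined.js
```

The combined file can then be minified in the same build step (e.g. java -jar yuicompressor.jar combined.js -o combined.min.js).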
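The server-side caching tip can be sketched with a small file-based helper; the function name and layout are illustrative, not from any particular framework (a real site might use memcached or APC’s variable cache instead):

```php
<?php
// Hypothetical cache helper: $key identifies the value, $ttl is its
// lifetime in seconds, and $fn performs the expensive operation
// (a complex query, a remote REST call, ...).
function cached($key, $ttl, $fn)
{
    $file = sys_get_temp_dir() . '/cache_' . md5($key);
    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return unserialize(file_get_contents($file)); // hit: skip the work
    }
    $value = $fn();                                   // miss: do the work
    file_put_contents($file, serialize($value));      // store for next time
    return $value;
}
```

A pre-population cron job could simply call the same helper with a short-lived lock, so real visitors rarely pay the cost of a cache miss.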
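The job-queue tip, in its simplest form, just records work to be carried out later by a cron or CLI worker; the jobs table and column names below are hypothetical:

```php
<?php
// Hypothetical enqueue function: the page request returns immediately,
// and a separate worker process sends the email / builds the report.
function enqueue(PDO $db, $type, array $payload)
{
    $stmt = $db->prepare(
        'INSERT INTO jobs (type, payload, created_at) VALUES (?, ?, ?)'
    );
    $stmt->execute(array($type, json_encode($payload), date('Y-m-d H:i:s')));
}
```

For anything beyond the trivial, a dedicated queue (Gearman, beanstalkd etc.) avoids reinventing locking and retry behaviour.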
Tags: apache, javascript, php, speed, xdebug