How to improve WordPress/Apache performance on a relatively heavily loaded server using Varnish and more
A single server (8 CPU cores, 32GB of memory, RAID5 disks) running WordPress (PHP 5.3.x and MySQL 5.1.x). The WordPress site(s) would often grind to a halt as the server struggled to respond to requests quickly enough.
Lack of available I/O capacity, identified by:
Of the changes made, the use of Varnish had the biggest impact on disk I/O, resulting in a significant decrease. It also led to a marked drop in the number of Apache processes in use (each of which is relatively memory hungry); this is possible because a large part of the site is static, unchanging content (images/stylesheets/JavaScript) that Varnish can serve from its cache without involving Apache at all.
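To illustrate how that works, the snippet below is a minimal sketch of the kind of VCL that lets static assets be served from the cache, assuming Varnish 3.x syntax; the file extensions and rules are illustrative and are not taken from this site's actual configuration.

    sub vcl_recv {
        # Illustrative rule: ignore cookies on common static assets so the
        # request can be answered from (or added to) the cache.
        if (req.url ~ "\.(png|gif|jpg|jpeg|ico|css|js)(\?.*)?$") {
            unset req.http.Cookie;
            return (lookup);
        }
    }

    sub vcl_fetch {
        # Illustrative rule: drop Set-Cookie on static assets so the
        # response is cacheable by Varnish.
        if (req.url ~ "\.(png|gif|jpg|jpeg|ico|css|js)(\?.*)?$") {
            unset beresp.http.Set-Cookie;
        }
    }

Once a static asset is in the cache, repeated requests for it never reach an Apache process, which is why both the Apache process count and the disk I/O fall.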
Here are some graphs taken from the server, which give some idea of the impact Varnish has had.
First, we have the number of accesses registered by Apache, before and after Varnish was introduced. Because Varnish sits in front of Apache, Apache's workload drops as soon as Varnish is introduced.
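For the "in front of" arrangement, Varnish typically takes over port 80 and forwards cache misses to Apache on a local port. A minimal sketch follows, assuming a Debian-style /etc/default/varnish and Apache moved to 127.0.0.1:8080; the addresses and ports here are assumptions, not this server's actual settings.

    # /etc/default/varnish (Debian-style): listen on port 80 for client traffic
    DAEMON_OPTS="-a :80 \
                 -T localhost:6082 \
                 -f /etc/varnish/default.vcl"

    # /etc/varnish/default.vcl: send cache misses to Apache on a local port
    backend default {
        .host = "127.0.0.1";
        .port = "8080";
    }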
The next graph shows the hit rate logged by Varnish, i.e. the number of hits per second it deals with. As Varnish is made live, its hit rate increases while Apache's decreases.
Before the 7th, Varnish was only accessible for testing; on/after the 7th the site's DNS entries were changed to route traffic through Varnish. Further configuration changes were then made to improve the cache hit rate.
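While making those configuration changes, the effect on the hit rate can be watched directly on the server as well as via the graphs. As a rough sketch, assuming Varnish 3.x counter names:

    # Interactive, continuously updating counters
    varnishstat

    # One-off dump of just the hit/miss counters (Varnish 3.x names)
    varnishstat -1 -f cache_hit,cache_miss

Requests that keep missing the cache (for example, because of cookies) can then be tracked down with varnishlog and addressed in the VCL.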
Finally, we have the I/O graph. Note how initially Varnish doesn't help with the I/O load on the server; if anything it makes it worse between the 7th and the 10th.
On the 10th, Varnish was reconfigured to use the malloc storage backend, at which point the I/O load drops and appears to remain more consistent.
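For reference, the storage backend is chosen with varnishd's -s option; switching away from disk-backed (file) storage to malloc keeps cached objects entirely in RAM, which matches the drop in disk writes seen above. A sketch of the change, with an illustrative cache size rather than the value actually used here:

    # Before (presumed): file-backed storage, cached objects written to disk
    # DAEMON_OPTS="-a :80 -T localhost:6082 -f /etc/varnish/default.vcl \
    #              -s file,/var/lib/varnish/varnish_storage.bin,1G"

    # After: in-memory storage (the 8G figure is illustrative)
    DAEMON_OPTS="-a :80 -T localhost:6082 -f /etc/varnish/default.vcl \
                 -s malloc,8G"

With 32GB of RAM in the server, giving Varnish a few gigabytes of memory for its cache is a reasonable trade against the I/O it saves.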