So your site is slow? You search the web and find this great alternative web server or database software. You dig into the benchmarks and they show huge improvements. CPU loads drop in half, throughput doubles, and you think your problems are solved. You ditch Apache and MySQL only to find performance still suffers. You probably should have ignored those worthless benchmarks. We often have to ignore them when we work through our server performance optimization service.

Worthless Benchmarks

Too often, we field requests for installing alternative solutions like Nginx, Sphinx, Memcache, Xcache or Percona DB. In many cases, these requests are premature because the true performance problem is not yet understood.

Until you know where your bottleneck is located, the benchmarks showing these great improvements will often not translate to your operations. The key to solving performance issues is identifying the most critical constraint points and fixing those first. Trimming 50ms from an HTTP request will not help your web application if your SQL query is taking 10 seconds.
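As a rough way to ground that, here is a minimal sketch in Python of timing the pieces separately before deciding what to replace. The URLs are placeholders for your own stack; the point is simply that if a static file comes back in a few milliseconds while the dynamic page takes seconds, the web server is not your bottleneck.

    # Sketch: time each stage of a request separately before swapping software.
    # The URLs below are placeholders; point them at your own stack.
    import time
    import urllib.request

    def timed(label, fn):
        start = time.perf_counter()
        fn()
        elapsed_ms = (time.perf_counter() - start) * 1000
        print(f"{label}: {elapsed_ms:.1f} ms")

    # A static asset served straight from the web server.
    timed("HTTP (static file)",
          lambda: urllib.request.urlopen("http://localhost/logo.png").read())

    # A full page that exercises the application and the database.
    # If this one is 100x slower, the HTTP server is not your problem.
    timed("HTTP (dynamic page)",
          lambda: urllib.request.urlopen("http://localhost/report").read())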

HTTP benchmarks showing one server is faster than another may only help you IF your HTTP server is slowing things down. If not, then you can pretty much ignore those benchmarks for now.

Wrong Comparisons

When you look at benchmarks, you will often find they focus on maximal throughput.

From transactions per second with databases to requests per second with web servers, you will see benchmarks touting huge improvements. If the comparisons are done well, they will correlate this throughput with other resource metrics, such as disk IO, CPU and RAM utilization. However, this often misses the point.

The important benchmark is how well the alternative solution performs at your expected levels of operation. Showing an alternate database server processing 40% more transactions per second at maximal loads does not necessarily mean that it is 40% faster at the loads you expect.

Better benchmarks show how alternative solutions compare as throughput increases. Typically, what you will find is that at low volumes most solutions are nearly equivalent. For example, in some Percona DB testing, the difference between Percona’s database and MySQL at low throughput levels is small.
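To make that concrete, here is a hedged sketch in Python of the kind of test I mean: measure latency at the concurrency you actually expect, rather than hunting for a maximum requests-per-second number. The URL, concurrency and request count are placeholder values, not recommendations.

    # Sketch: measure response times at your expected concurrency,
    # not at the maximum the server can survive.
    # URL, EXPECTED_CONCURRENCY and REQUESTS are placeholders for your own values.
    import time
    import statistics
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "http://localhost/"
    EXPECTED_CONCURRENCY = 10   # what your real traffic looks like
    REQUESTS = 200

    def fetch(_):
        start = time.perf_counter()
        urllib.request.urlopen(URL).read()
        return (time.perf_counter() - start) * 1000

    with ThreadPoolExecutor(max_workers=EXPECTED_CONCURRENCY) as pool:
        latencies = list(pool.map(fetch, range(REQUESTS)))

    print(f"median: {statistics.median(latencies):.1f} ms")
    print(f"95th percentile: {statistics.quantiles(latencies, n=20)[18]:.1f} ms")

Run the same test against the current setup and the proposed replacement at the same expected load; that comparison tells you far more than a vendor's peak-throughput chart.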

To optimize site performance, you must focus on performance differences at your expected utilization levels and ignore max value comparisons.

Wrong Metrics

Web performance optimization is hard. I suspect this is why so many people look for quick fixes in alternate software packages.

Another mistake I see is focusing too much on the wrong metrics. Requests or transactions per second are great because they are easily measured and easily understood. You can tweak here and there and see real results. However, these metrics are often not that important.

If you want to optimize your site, you need to put yourself in your user’s position. What do they care about? How fast your database can process requests? Unlikely. For web performance, the key issue is page speed. I’ve seen many optimization efforts go awry due to myopic focus on the wrong metrics.

Services like BrowserMob let you test real-world performance with real browsers. For me, that is the best metric to use. The server-level details are clues to where you can make improvements.
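You can also get that user-facing number yourself. Below is a sketch, assuming Selenium and a local Chrome driver are available (my assumption, not part of the BrowserMob service), that asks a real browser how long the full page load took; the URL is a placeholder.

    # Sketch: measure what the user actually sees -- full page load time in a
    # real browser -- via Selenium and the browser's Navigation Timing API.
    # Assumes Selenium and a Chrome driver are installed; the URL is a placeholder.
    from selenium import webdriver

    driver = webdriver.Chrome()
    try:
        driver.get("http://localhost/")
        load_ms = driver.execute_script(
            "return window.performance.timing.loadEventEnd"
            " - window.performance.timing.navigationStart")
        print(f"full page load: {load_ms} ms")
    finally:
        driver.quit()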

Good Benchmarks

If you want to truly optimize your site, you need to develop user-centric benchmarks. Focus on the things your users do the most. What do they value in your web application? Sure, fast is important, but how the page renders may also matter. Start from their end and work your way into the application. Find ways to measure each step in the process.

With this approach, you can develop real-world benchmarks specific to your application. This way, when you drop in that alternate HTTP server (we like Nginx), you know it is having the desired impact for your users.
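As one possible shape for such a benchmark, here is a sketch in Python that times each step of a hypothetical user flow. The step names and URLs are made up; replace them with whatever your users actually do most.

    # Sketch of a user-centric benchmark: time each step of a typical user flow
    # end to end, so you can see which step a change actually improves.
    # The step names and URLs are hypothetical placeholders.
    import time
    import urllib.request

    STEPS = [
        ("home page",    "http://localhost/"),
        ("search",       "http://localhost/search?q=widgets"),
        ("product page", "http://localhost/products/123"),
    ]

    for label, url in STEPS:
        start = time.perf_counter()
        urllib.request.urlopen(url).read()
        print(f"{label}: {(time.perf_counter() - start) * 1000:.1f} ms")

Run it before and after a change, and you will know whether the change helped the steps your users actually care about.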

Your Turn

What do you hate or love about web performance optimization? Any good benchmarks you have found that truly impact the user experience?
