
Making the boat go faster

Back in 1995, when Team New Zealand won the America's Cup, their campaign was focused on a single value: "Does it make the boat go faster?" Before undertaking any activity they would ask this simple question. If the answer was no, it wasn't worth doing.

At WorkflowMax, we've adopted this value along with our core focus of building job management software that is simple and easy to use.

As the WorkflowMax service has grown we've started to notice some performance bottlenecks in the application's original architecture. A couple of months ago we decided to undertake a project focused on improving the general performance of WorkflowMax.

Before you can improve performance, though, you first need a benchmark to compare against. As part of our standard application monitoring, we log the execution time for every page request made to WorkflowMax. This allows us to easily identify the requests that take a long time to execute, as well as determine metrics such as average page execution time.
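To give a feel for this kind of per-request timing, here's a minimal sketch of a WSGI-style middleware in Python. The class and logger names are hypothetical and this isn't how WorkflowMax itself is built, but the idea is the same: record the elapsed time for every page request so slow requests and averages can be reported later.

```python
import logging
import time

logger = logging.getLogger("request_timing")


class RequestTimingMiddleware:
    """Illustrative WSGI middleware that logs the execution time of every request."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        started = time.perf_counter()
        try:
            return self.app(environ, start_response)
        finally:
            # Log method, path and elapsed time so slow pages stand out
            # and an average page execution time can be calculated later.
            elapsed_ms = (time.perf_counter() - started) * 1000
            logger.info("%s %s took %.1f ms",
                        environ.get("REQUEST_METHOD"),
                        environ.get("PATH_INFO"),
                        elapsed_ms)
```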

The requests taking a long time to execute are generally easy to fix. Most often it's a case of optimising the database query or tweaking the application logic. The good thing about these types of performance issues is that you can measure them against the benchmark, so any improvement is obvious. Thankfully there were not too many of these issues in WorkflowMax and we've managed to resolve most of them. We do feel there is still room for further improvement, which we will continue to review over the upcoming months.

Reducing the average page execution time, though, required further analysis of what was happening inside the application during each page request. By enabling query logging on the database in our development environment we were able to identify several common database queries that executed for every page request. These queries returned the same result each time, indicating they were prime candidates to be stored in a cache. Some examples include determining user permissions and which application modules are enabled for a user's account. By introducing Memcached as our caching tier, we were able to reduce the number of queries to the database per page request and thus reduce the load on the database itself.
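The pattern described here is often called cache-aside. Below is a minimal Python sketch using the pymemcache client; the function, key names and `db.query_permissions` helper are hypothetical and not taken from the WorkflowMax codebase, but the flow matches what's described: check the cache first, fall back to the database only on a miss, and store the result with a short expiry.

```python
import json

from pymemcache.client.base import Client

# Hypothetical Memcached client for illustration only.
cache = Client(("localhost", 11211))


def get_user_permissions(user_id, db):
    """Cache-aside lookup: hit Memcached first, fall back to the database."""
    key = f"permissions:{user_id}"

    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)

    # Cache miss: run the (hypothetical) database query once...
    permissions = db.query_permissions(user_id)

    # ...and store the result so subsequent page requests skip the query.
    cache.set(key, json.dumps(permissions), expire=300)
    return permissions
```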

Over the past several weeks we've incrementally grown our caching tier to the point where, combined with our other performance improvements, we've reduced the average page execution time from 190 milliseconds down to 105 milliseconds. This is a significant performance gain that allows us to process more requests per second, which ultimately leads to faster page generation and greater throughput on our servers.
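As a rough back-of-the-envelope check of what that change means for throughput, assuming a single worker handling requests serially (a simplification, since real servers process many requests concurrently):

```python
# Back-of-the-envelope: requests per second for a single serial worker.
# These are not actual WorkflowMax throughput figures, just an
# illustration of the ratio implied by the timing improvement.
before_ms, after_ms = 190, 105

before_rps = 1000 / before_ms   # ~5.3 requests/second
after_rps = 1000 / after_ms     # ~9.5 requests/second

print(f"Throughput improvement: {after_rps / before_rps:.2f}x")  # ~1.81x
```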

Having a benchmark as a reference point has enabled us to measure which changes have or haven't resulted in a performance improvement. By releasing these improvements incrementally to production over several weeks we've been able to measure which changes had more of an impact than others, giving us valuable knowledge we can apply to our future development strategy and design. The really pleasing aspect of all this is that the performance improvements have occurred at the same time the number of people using WorkflowMax has increased.

Going forward, in addition to delivering new functionality for our users, we'll be devoting time in each release to making sure the boat continues to get faster and doesn't get becalmed in the water.