You need to know the rules in order to break them successfully

Benchmarking

Loved by consultants, loathed by managers, benchmarking is a powerful tool for evidence-based decision making.

Look Outside

The essential point of benchmarking is to look outside your organisation and see what happens elsewhere.  It sounds simple – idiotically simple – were it not for the fact that the daily operations inside our business are so all-consuming that we just don’t have the time to gaze outwards for very long.  Consequently our decisions have an inward-looking bias.

A tell-tale sign of this bias is the internal names companies give their products, markets and territories.  If these are very different from the way their customers view their products, it’s very likely the company has its nose a little too close to the grindstone.  If a company thinks it’s selling ‘Yellow Fats’ and its consumers think they’re buying ‘Margarine’, there may be a problem.

Benchmarking is part of a suite of tools available to companies to access knowledge of their external environment.  Customer surveys, market research, hiring new staff, even reading the newspapers, are all part of the armoury.  Like all tools, however, it’s the skill with which you wield them that’s important.

Loathed by Managers

I once took over financial responsibility for a research lab with a target that its overheads be no greater than 20% of total costs.  The monthly accounts duly reported actual overheads of about 18% each month.  However, when I unpicked various internal cost recoveries and looked at the cost base from the point of view of the customers, it was clear the true level of overhead was over 33%.   It was a classic example of behaviour adapting to a performance target, but that’s not the point of my story.   Over the course of the next seven years we slowly brought the true level of overhead down to about 22%.  I thought we’d done quite well.  Then we did some benchmarking.  Gosh.  We hadn’t done nearly as well as others.  A sister laboratory was achieving less than 20%.  We weren’t the worst, but we weren’t the best either.  I can’t tell you how deflating that was.  We then embarked on a couple more rounds of cost-cutting.  When I left that job the overheads were about 17%.  I imagine utility costs have caused a lot of damage since then, but I wouldn’t be surprised if, even so, the overheads are a lot lower than 17% now.
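The arithmetic behind that gap between the reported 18% and the true 33% can be sketched in a few lines.  All the figures below are invented for illustration; the point is simply that overheads recharged into ‘direct’ cost lines vanish from the reported ratio while the customer still pays for them.

```python
# Illustrative sketch (invented numbers): how internal cost recoveries
# can mask the true overhead percentage that customers actually fund.
def overhead_pct(overheads, total_costs):
    return 100 * overheads / total_costs

total_costs = 1000.0    # total cost base
raw_overheads = 330.0   # true overheads before any recoveries
recoveries = 150.0      # overheads recharged into 'direct' cost lines

reported = overhead_pct(raw_overheads - recoveries, total_costs)
true = overhead_pct(raw_overheads, total_costs)

print(f"reported overhead: {reported:.0f}%")  # 18%
print(f"true overhead:     {true:.0f}%")      # 33%
```

The behaviour hits the target; the underlying cost base is unchanged.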

Benchmarking is relentless.  You’re never good enough, and even if you are, it won’t be for long.  If you’re doing a good job, you agonise about the reasons why you’re adrift and suspect others might not be entirely candid with their reporting, but eventually look afresh at different ways of working.  If you’re doing a bad job, you invent plausible excuses why being adrift is to be expected and change nothing, or worse, start ‘massaging’ the figures, which is completely nuts.  Managers loathe benchmarking – of course they do.

Loved by Consultants

Consultants on the other hand, and I speak as poacher turned gamekeeper, love benchmarking.  A consultant has only the shallowest conception of your business, but this need not matter.  If he or she has access to data from a large number of similar businesses, they have knowledge that isn’t available to managers who might have only the haziest picture of the outside world.  The evidence from benchmarking is often a neutral position from which to start.

Power of Benchmarking

The power of benchmarking is that, using objective statistics, it can completely change your frame of reference.  Here’s an example.  In a recent engagement I was responsible for the design, construction and occupation of a new research building.  From start to finish it took 4 years.  We finished ahead of time and below budget and thought we’d done quite a good job.  The boss came back from a trip to China to discover the Chinese were putting up similar buildings in about half the time.  My immediate reaction was defensive but I had to admit to myself that we had gone about the job in a very ‘old world’, cautious, sign-off at each stage, blah, blah, manner.  If we had wanted to, we could have taken a very different approach and, I’m sure, been able to do things much quicker.  I was forced to see that there was a completely different approach to doing what I had just done.  Without the challenge from that Chinese benchmark, I would have been none the wiser, believing that, while there might have been incremental improvements that could have been made, broadly we’d done it at a reasonable pace.

Practical Pitfalls of Benchmarking

The obvious problem with benchmarking is the quality of data you are using for comparison.  It’s unlikely you’ll ever want or need to be 100% confident you are comparing like for like but if you are below 80% confidence, the usefulness of benchmarking starts to diminish.  If you have no way of validating your benchmarks then making comparisons can still be ‘interesting’ but they can never be the basis of decision-making.

However, beware of doing too much data cleansing.  It is tempting – and local managers will insist – to adjust the raw data in order to make fair comparisons.  There is certainly a lot of truth in this and some degree of data cleansing is advisable.  However, if left unchecked, there can be so many adjustments for special cases that almost anything can be finessed to appear very close to almost anything else.  There is value in those special cases, and they are often better left in the numbers.

Why people might want to eliminate all differences illustrates another pitfall of benchmarking: slavishly following the benchmarks.  Benchmarks are just one of several sources of evidence and not the most compelling.  The most compelling will usually come from a challenging line manager but, in the absence of self-challenge, benchmarks are a good place to start a conversation.

Arithmetic is another pitfall to be wary of.  Benchmarks are frequently expressed as a proportion of something else.  For instance, construction cost per m2, or overheads as % of sales, or IT costs per workstation.  In each of these examples, you might be confident you are comparing like for like in terms of the primary data source – construction costs, overheads, IT costs – but make some basic error in terms of the denominator.  For instance, construction cost per m2 is dangerous because floor space is measured in all sorts of different ways.  And what exactly is a ‘sale’ or a ‘workstation’ is not always as straightforward as it sounds.  Relatively small errors in the denominator can lead to big differences in the direction of your conclusions.

The final pitfall is the hardest to spot without inside knowledge of how the raw data has been prepared.  Being polite, it is the problem of data being prepared in such a way that flatters comparisons.  I once tried to benchmark the estates costs of an NHS hospital and found a treasure trove of an NHS database.  It contained lots of different metrics for every hospital in the country, so you could compare your hospital with hospitals of equivalent size in the same geographical area.  However, when I looked at the returns for our hospital and compared them with the raw data I could verify locally, the numbers we had returned were not reliable.  When I enquired about the reason for this, I was told ‘We put in numbers that look broadly similar to the previous returns’.  OK, this says more about the management than anything else, but it illustrates the risks to be aware of when relying on benchmarking data.

Examples of Benchmarking

These are some examples of how benchmarking can be used in practice:

Overheads % Income.  I’ve been benchmarking overheads for almost my whole career.  On the very first occasion, as a young management accountant, the sales and marketing team were complaining that the overheads in our company were too high.  I was tasked to find out the truth.  I quickly established that our overheads, at 16% of sales, were significantly higher than the 12% that was common in the corporate group.  However, I also pointed out that the solution most commonly advanced for reducing our overheads – cutting back on travel and similar budgets – would make hardly a dent in our cost base.  If we were serious about reducing overheads we needed to cut staff numbers.  What seems like a pretty obvious management consultant-type recommendation was regarded as revolutionary at the time, and I got credited with having some sort of powerful insight beyond my years.  Yet, it was simple arithmetic.  It must have taken me a day’s research to establish the corporate benchmark (which I guessed from a very small sample) and an afternoon digging around in the finances.  My more senior colleagues were pretty smart people, but none had done this type of analysis.
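That ‘simple arithmetic’ can be reproduced on the back of an envelope.  The figures below are invented to match the shape of the story, not the actual accounts: staff costs dominate the overhead base, so trimming travel barely moves the ratio.

```python
# Back-of-envelope sketch (invented figures): why cutting travel
# budgets barely dents an overhead ratio when staff costs dominate.
sales = 1_000_000
staff_costs = 150_000       # the bulk of the overhead base
travel_and_other = 10_000   # the budgets usually targeted first

overheads = staff_costs + travel_and_other
print(f"overheads: {100 * overheads / sales:.0f}% of sales")    # 16%

# Even scrapping travel entirely only moves the ratio one point:
print(f"no travel: {100 * staff_costs / sales:.0f}% of sales")  # 15%

# Hitting the 12% group benchmark means cutting the cost base itself:
target_overheads = 0.12 * sales
print(f"cuts still needed: {overheads - target_overheads:,.0f}")
```

On these numbers, reaching the benchmark requires taking 40,000 out of staff costs; no amount of travel austerity gets there.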

The chart below compares overheads in the Higher Education sector.  In this chart, the average overhead % of income is over 40% which seems staggering compared with the previous example but, really, it’s a slightly unfair comparison.  ‘Income’ in the HE sector is really the equivalent of ‘Total Costs’ in the commercial sector.  However the comparison is only slightly unfair because I’ve heard academics boast of how high their overheads are, because it shows how little administration they are expected to do themselves.  This culture is the opposite of a commercial enterprise.

[Chart: overheads as a % of income across the Higher Education sector]

Building construction costs per square metre.  A simple error in the benchmarking arithmetic at the beginning of a construction project put the project under cost pressure for years afterwards.  The chart below is one of many subsequent attempts to understand construction cost benchmarks by referencing buildings that were familiar to our steering group.  The conclusion is that our construction cost was about average, but originally we aimed for the upper second quartile.  We were forced to make compromises to the building design as a result.

[Chart: construction cost per m2 for buildings familiar to the steering group]

Pricing.  The chart below was used to benchmark the commercial value of academic research services.  These high value-added services were being priced at about the same rate per hour as a semi-skilled tradesman.  These were services using multi-million pound equipment, run by PhDs, often in one of only a handful of places in the country with equivalent skills.  Actually, it is horribly common for academics to have no concept at all of the commercial value of the services they provide.  Or worse, to actively reduce their prices for fear of being too expensive!

[Chart: hourly rates for academic research services compared with commercial trades]

Words of Warning

Benchmarking is more powerful as a tool of strategic analysis than as a regular ‘information system’.  In other words, doing it every few years will have more impact than every month.  Because the information it yields is so useful, it is tempting to incorporate the data into regular KPIs and include it in annual performance targets.  Resist this temptation, because the technique definitely suffers from the law of diminishing returns.

Summary

Benchmarking is a way of shocking you out of your comfort zone.  It forces you to realise that things you had previously thought were fixed and unchangeable are, in fact, flexible and easily changed.  That is a powerful analytical tool.  Using it in more mundane ways, such as monthly reporting of KPIs, is not exactly a misuse of the technique; more like using the wrong tool for the job.

 
