Reasons why your financial model has been consigned to the scrap heap

You have received a call from a client. The client wants a pricing tool or some other financial model built. The model must add value to the client; it can't merely rehash and beautify the client's legacy spreadsheets. The notion of value-adding is open to discussion: for example, it could be forecasting and valuing the client's company, producing scenario analysis, or simply undertaking sensitivity analysis. This article will focus on the value-adding power of sensitivity analysis in a pricing model.

Unlike a financial model that takes a more macro approach to a client's business, a pricing tool computes the expected financial returns of a company delivering a service, based on price, units of sale, and direct and indirect costs. Allocating indirect costs incrementally using activity-based costing is often overlooked in pricing decisions when a service company is aiming for a specific return on sales.

The value of sensitivity analysis

The value-added benefit of a sensitivity analysis for a service company is the ability to compare a preliminary price against hypothetical changes in certain key drivers, and so compare the expected financial returns. One example, depicted in the following screen, is the ability for the client to increase gross income from $131,212 to around $200,000 by expanding course enrolments from 22 students up to 34 students using Solver.
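A minimal sketch of what Solver is doing in this example, assuming gross income scales linearly with enrolments. The per-student figure is merely implied by the base case ($131,212 across 22 students); the linear relationship is an assumption for illustration, not the model's actual formulas.

```python
import math

# Base case from the article; the linear scaling below is an assumption.
BASE_INCOME = 131_212
BASE_STUDENTS = 22
FEE_PER_STUDENT = BASE_INCOME / BASE_STUDENTS   # roughly $5,964 per student

def gross_income(students):
    """Gross income, assuming it scales linearly with enrolments."""
    return students * FEE_PER_STUDENT

def solve_enrolments(target):
    """Smallest whole enrolment count whose gross income meets the target."""
    return math.ceil(target / FEE_PER_STUDENT)

students_needed = solve_enrolments(200_000)   # 34, matching the article
```

Solver handles far messier, non-linear cases, but the principle is the same: fix a target output and let the tool search for the driver value that achieves it.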

Dashboard to compare sensitivity outputs against base case

Your financial model can go one step further. It can leverage its existing dashboard functionality and deliver further value to your client. A great example is a dashboard that compares the sensitivity outputs against the base case, which in this instance is the service company's base service price, sales volumes and so on. The following Waterfall Chart demonstrates this. The model can generate such a chart in terms of Gross Profit, EBIT, Net Profit After Tax (NPAT) or Operating Cash Flow; here the Gross Profit option has been selected, revealing that the additional 12 students increase Gross Profit from $88,234.8 to $148,667.2.

The essence of sensitivity analysis is that users may only want an understanding of how revenue or cost drivers influence returns in a financial model. The sensitivity analysis of a model should facilitate this user-friendly flexibility. The following enables users to understand how a mere 10% decrease in the Course Enrolment Fee, a 10% increase in the Course Curriculum Fee, and a 10% decrease in Students Enrolled generate small decreases in NPAT (from $49,834.8 to $49,704.0) and Operating Cash Flow (from $50,798.4 to $50,667.6).
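The mechanics of applying percentage shocks to drivers can be sketched as follows. The revenue and cost structure here is a hypothetical stand-in for the article's model (the driver names, fees and fixed costs are invented for illustration); only the shock-then-recompute pattern is the point.

```python
def npat(drivers):
    """Toy NPAT: two fee streams less fixed costs, taxed at 30% (assumed)."""
    revenue = (drivers["students"] * drivers["enrolment_fee"]
               + drivers["students"] * drivers["curriculum_fee"])
    profit_before_tax = revenue - drivers["fixed_costs"]
    return profit_before_tax * 0.70

# Hypothetical base case, not the article's actual inputs.
base = {"students": 22, "enrolment_fee": 5_000, "curriculum_fee": 1_000,
        "fixed_costs": 50_000}

# The article's three shocks: -10% enrolment fee, +10% curriculum fee,
# -10% students enrolled.
shocks = {"enrolment_fee": -0.10, "curriculum_fee": +0.10, "students": -0.10}

scenario = dict(base)
for driver, pct in shocks.items():
    scenario[driver] = base[driver] * (1 + pct)

delta = npat(scenario) - npat(base)   # the sensitivity of NPAT to the shocks
```

In a spreadsheet model the same pattern lives in an Assumptions block: base drivers in one column, percentage shocks alongside, and the outputs recalculated from the shocked values.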

Illustrate financial impact with Waterfall Charts

Again, the model can go one step further and illustrate the financial impact in a Waterfall Chart. The above example of Operating Cash Flow falling slightly from $50,798.4 to $50,667.6, due to the changes in course fees and Students Enrolled, is depicted in the following.

What if there is a 25% increase in the Course Curriculum Fee per Student, assuming the other two changes remain constant? As you can see, Operating Cash Flow moves materially from $50,798.4 up to $61,722.6.

The two different approaches, Solver and arbitrary changes to specific drivers, highlight the benefits of a sensitivity analysis feature in a financial model. It provides an important tool for executives faced with financial decisions, whether the more micro aspect of pricing a specific service (as depicted) or more macro decisions such as a divestment or a debt raising.

Remember, corporate executives are time-poor and aren't interested in trawling through pages and pages of worksheets. Sure, they still need to understand the fundamentals via a high-level dashboard, but often they want to refer to a pictorial format such as a waterfall chart.

Concluding the sensitivity benefits of your financial model

The application of sensitivity analysis is an integral feature of any type of financial model. It is a great way to turbo-charge the value that you can deliver to a client or within your company. A financial model cannot merely regurgitate financial statements from an accounting system. It needs to deliver intrinsic value and push the envelope, offering management the flexibility to operate the model and get a better sense of the revenue and cost drivers influencing the company's financial returns.
“There is no reason anyone would want a computer in their home.”
Ken Olson, president, chairman and founder of Digital Equipment Corp., 1977

“640KB ought to be enough for anyone”
Bill Gates, 1981

“I think there is a world market for about five computers.”
Thomas Watson, Chairman of IBM, 1958

Sometimes forecasting the future is fraught with difficulties. Even those entrepreneurs whom we place on a pedestal don’t always have the business foresight we assign to them. What can seem certain can turn out to be false. What can seem highly unlikely can become everyday.

Rather than go into a deep discussion about forecasting and modelling techniques, we wanted to remind you about some core principles. Lessons that at times can easily be forgotten, to the detriment of the quality and accuracy of your financial models.

Forecasting Fundamentals

1. Your forecast will always be wrong

Sometimes we forget this obvious truth. A forecast is just that, an estimate, a predicted future result. The question you should be asking, of course, is “how wrong is our forecast?”

2. Simple forecast methodologies often trump complex ones

There is danger in complexity. Forecasts that rely on complicated methods often hide key assumptions built into the model. When key assumptions are obscured, it can lead to unexpected and hard-to-trace failures. On the other hand, simple forecast methods are easy to understand, to analyse, and to diagnose when they go wrong.

Forecasting is Hard!

3. A correct forecast (or at least a highly accurate one) does not prove your forecast method is correct

It could have been chance. When you accurately project financial and other key performance indicators, it’s still important to check your methods. If you only question your methods when there is a large variance in the data, you’ll miss all those times your forecast was just lucky – potentially hiding a multitude of sins.

4. If you don’t use the data regularly, trust it less when forecasting

The quality of your data is proportional to the amount you use it. When information is not regularly used errors often remain undetected (and errors are common in financial models). Regular use of data helps identify mistakes and smooths out inconsistencies over time.
You're usually better off using solid data and adding further assumptions than working with rarely used data.

5. All trends will eventually end (that’s why they’re called trends)

Many factors will affect the pattern you’re trying to forecast. It doesn’t matter how accurately you predict the trend, in the future the variables will change and the forecast will be wrong.

6. It’s hard to eliminate bias, so most forecasts are biased

Let’s not forget we’re talking about predictions here. When you have to make a range of assumptions (which factors to include, how strongly to weight them etc.), it’s likely that you will be adding some bias to the forecast.

7. Large numbers are easier to forecast than small ones

It’s usually better to forecast the bigger number and work back the calculation to determine the component parts, than to forecast the component parts and then add them up to determine the bigger number.
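A minimal sketch of this top-down approach, with hypothetical revenue lines: forecast the big number first, then allocate it back to components pro rata on their historical shares, rather than forecasting each small component independently and summing the errors.

```python
def decompose(total_forecast, historical_components):
    """Allocate a top-level forecast across components pro rata to history."""
    historical_total = sum(historical_components.values())
    return {name: total_forecast * value / historical_total
            for name, value in historical_components.items()}

# Last year's revenue mix (illustrative figures only).
history = {"course_fees": 800_000, "materials": 150_000, "workshops": 50_000}

# Forecast the $1.1m top line, then work back to the component parts.
parts = decompose(1_100_000, history)
```

The component shares will drift over time, of course, but the error in one large, stable aggregate is usually smaller than the compounded errors of many small, noisy forecasts.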

8. Technology is not the solution to better forecasting

Robust forecasting comes from sound logic in your methodology. First, create an appropriate strategy and then use technology to make it more successful and efficient. Technology is not the answer … it’s the tool to make it better. So remember these core principles and you’ll be building your forecasts on solid foundations.

And try not to make the same mistake as Ken Olson, Bill Gates or Thomas Watson – being that wrong never makes you look good!
Financial model developers and relevant stakeholders spend a lot of time building a financial model. Unfortunately, many financial models never fully yield enough value or relevance to a company, even when immense time and resources are employed in the model development stage. Here are some of the reasons why your financial model has been consigned to the scrap heap.

Audit Risk

Model stakeholders, especially executive management, don't have adequate confidence in, or understanding of, the financial model. The major reason is the esoteric and complex nature of the model, which has been over-engineered without a logical process flow across worksheets. Often the developers have tried to calculate numerous elements in one column, instead of breaking the calculation down and computing each element one at a time.

A prime example would be a model that uses nested IF statements to calculate outputs.
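The nested-IF problem, and the auditable alternative, can be sketched in a few lines. The fee tiers below are hypothetical; the point is that the lookup version separates the logic into an explicit data table plus one small calculation, which is far easier to audit than a formula with the branches buried inside it.

```python
def fee_nested(students):
    """Mirrors a nested-IF Excel formula: hard to read, easy to get wrong."""
    return (6_000 if students < 10
            else 5_500 if students < 20
            else 5_000 if students < 30
            else 4_500)

# The same logic as a transparent tier table: (upper threshold, fee).
TIERS = [(10, 6_000), (20, 5_500), (30, 5_000), (float("inf"), 4_500)]

def fee_lookup(students):
    """One pass over an explicit tier table -- the auditable alternative."""
    for threshold, fee in TIERS:
        if students < threshold:
            return fee
```

In Excel terms, this is the difference between a four-deep nested IF and a small lookup range driving a single VLOOKUP or INDEX/MATCH: same answer, but the tier boundaries sit in visible cells where an auditor can check them.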

Unknown External Referencing

A common culprit behind audit risk is random cell referencing to external sources, such as a file from an email, a file saved to a personal C: drive or desktop, or a third party's network drive. To mitigate this problem, always save all files to your company's network and ring-fence all external data referencing to designated Model Import worksheets in your financial model.

Cell Value Errors

This is often related to models referencing external files. The less user-friendly a model is, the more prevalent cell value errors are and the harder it is to find their exact location. As mentioned, if cells reference external values only via Model Import sheets, and data entry is restricted to the Assumptions area of the model, it will go a long way towards avoiding this issue.

A comprehensive rollout of basic Error and Alert Checks across the entire model will more effectively identify such errors. Remember to cross-check not just headline numbers from the source financial statements on the Model Import sheets, but also calculated numbers such as operating cash flow from both a) a direct cash flow and b) an indirect cash flow perspective.
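The direct-versus-indirect cross-check can be sketched as an alert-check cell: derive operating cash flow both ways and flag any mismatch. All the figures below are illustrative, chosen only so the two derivations reconcile to the article's $50,798.4.

```python
def ocf_direct(receipts, payments):
    """Direct method: cash received from customers less cash paid out."""
    return receipts - payments

def ocf_indirect(npat, depreciation, working_capital_change):
    """Indirect method: profit adjusted for non-cash items and working capital."""
    return npat + depreciation - working_capital_change

def alert_check(a, b, tolerance=0.01):
    """Mimics an Error/Alert Check cell: OK when both derivations agree."""
    return "OK" if abs(a - b) <= tolerance else f"ALERT: out by {a - b:,.2f}"

# Illustrative inputs reconciling to the article's Operating Cash Flow figure.
direct = ocf_direct(receipts=180_000, payments=129_201.6)
indirect = ocf_indirect(npat=49_834.8, depreciation=2_000,
                        working_capital_change=1_036.4)
status = alert_check(direct, indirect)
```

In the spreadsheet itself, the equivalent is a check row that subtracts the two derivations and conditionally formats anything outside tolerance, so a broken link or mistyped value surfaces immediately.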

No User Guide

Like anything in life, we humans sometimes underestimate how challenging our work is for others to understand. As much as a user guide is a drag and a bore, it is imperative to articulate clearly in writing, with screenshots, the essence and mechanics of the financial model. The user guide must be comprehensive, straight to the point and clearly written.

One Gatekeeper

The model developer is the sole user, operator and custodian of the financial model. No one else in the company ever gets the opportunity to manage the model, let alone understands how to maintain it going forward. Then the sole gatekeeper of the model leaves the company, and no one else can take over the reins.

It is paramount, for any financial model in your company that executive management relies on to make financial decisions such as a strategic plan, that there is a knowledge transfer to at least one or two other people in the organisation.

Poor layout

The commercial world is an ever-changing organism. You must remember this when building the financial model. No one can predict the future. Who knows, half the company might be hived off into a joint venture entity, numerous acquisitions may occur or there could be an internal corporate reorganisation. There must be adequate provision, flexibility and room allocated in the financial model to cater for material corporate changes. Do not be afraid to include an additional 24 (or another applicable number) business units, projects or assets into the Assumptions area of the financial model.

Hence significant corporate changes must result in a relatively seamless update to the model. Such significant changes must not jeopardise or compromise the model.

Use of static functionality

Although Pivot Tables are a powerful, value-adding and time-saving feature, by default they do not automatically update to reflect changes in source data. Herein lies the problem: unless the model has been assigned VBA code to ensure automatic updating, this is a major risk for models that rely upon Pivot Tables. This is something I discussed in an earlier article, The Dangers of PivotTables, array formulae and other non-seamless Excel features, which sparked debate among peer financial modelling professionals.

Similarly, Ctrl+Shift+Enter (CSE) or array formulae suffer from a similar issue and pose a risk to a model. A CSE formula can indeed perform the work of numerous formulae, but whenever the source range changes shape, the CSE formula will need to be updated manually or via customised VBA code.
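The contrast between a cached snapshot and a recomputed summary can be sketched outside Excel. In this hypothetical example, the pivot is rebuilt from the source rows on every call, so a change in the source is always reflected; an Excel PivotTable, by contrast, keeps serving its cached results until someone (or VBA) refreshes it.

```python
from collections import defaultdict

# Hypothetical source data: (course, quarter, revenue).
sales = [
    ("A", "Q1", 10_000), ("A", "Q2", 12_000),
    ("B", "Q1", 8_000),  ("B", "Q2", 9_000),
]

def revenue_pivot(rows):
    """Rebuilt from the source rows on every call -- nothing is cached."""
    pivot = defaultdict(float)
    for course, quarter, revenue in rows:
        pivot[(course, quarter)] += revenue
    return dict(pivot)

before = revenue_pivot(sales)
sales[0] = ("A", "Q1", 15_000)   # the source data changes...
after = revenue_pivot(sales)     # ...and the recomputed pivot reflects it
```

The equivalent safeguard in Excel is to trigger a refresh automatically (for instance from a worksheet or workbook event in VBA) so the Pivot Table can never silently serve stale numbers.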

Scrap conclusions

There are many reasons why your financial model can be at risk of the scrap heap. Unfortunately, some are outside your control or a result of company politics. However, there are measures to consider when building a model which may help to extend its longevity.

Keep the model simple, clear and concise, in order to ensure it is audit-friendly. Restrict all external referencing to data that has been placed into the Model Import worksheets. Safeguard against cell value errors in the model through the use of Error and Alert Checks. Take the time to write a user guide, with thorough use of screenshots to better explain the model. Avoid having only one person as the builder, user and manager of the model.

Make certain of an aesthetic layout with provision to add new business units or other entities in the future. Avoid static functionality such as Pivot Tables and CSE formulae (or, if you must use them, attach VBA code to keep them refreshed).

Such measures will help you realise model flexibility, user-friendliness and transparency, in line with SMART best-practice modelling methodology, which emphasises minimising model risk, enhancing stakeholder confidence, heightening modelling productivity, and enabling an efficient model audit.

Recent posts by Simon Selkrig

Comments for “Reasons why your financial model has been consigned to the scrap heap”

  1. Dave Brett says:

    A very good article Simon

    The potential model issues (those above and some others I have added) can be broadly bucketed into two categories:

    1) Model errors / potential errors which break the faith of other users/management (errors, problematic links, inappropriate use of array formulae, Data Tables, circular references and Pivot Tables)
    2) Poor model design (user guide, long calculation times, significant use of VBA driving the underlying model engine, poor layout etc)

    A further bucket I would add is ongoing model management (which “single gatekeeper” is part of)

    3) Model management (archiving, reconciliation, storage process, process to implement and share model architecture updates, publishing model “releases” to other users, avoiding model bloat, user induction and training)

    I believe that links, if used appropriately, improve the auditability of a model by clearly showing data sources. When working with a model suite, some basic VBA is very handy for checking that relative link references hold on model open (i.e. feeder models reside in the appropriate directories relative to the main model).

    From my perspective the general standard of financial models has improved enormously over the last 10 years, driven partly in response to modelling horror stories (EuSpRIG), and also by the widespread sharing and publication of best-practice techniques online. The one area that still seems problematic is people working with Excel mega-models with huge lookup functionality, which can be difficult to audit and slow to calculate – an application perhaps better suited to a database than Excel.

  2. John McNeill says:

    The key to avoiding these common errors is maintaining a disciplined approach to modeling. Linking a model to another file is immensely useful; however, it's a landmine. When the other file is deleted, corrupted, moved, etc., a model can be made useless or even dangerous, without any real warning. If you link to another file, that file should be controlled and monitored.

    Likewise Pivot Tables — I prefer not to use these at all as part of a model, but rather to build them out on each use, either manually or using VBA. They’re very powerful, but they create the potential for error without seeming to.

    An instruction page (or user guide), a source list, and a revision log are mandatory parts of any model. In my opinion, failure to have them is the hallmark of an undisciplined analyst.

  3. […] order to replicate the executive reporting requirements of the prior periods. As I discussed in “Reasons why your financial model has been consigned to the scrap heap”, such a model will be rendered as redundant by the company in due course […]
