How Safe Is Excel?
What would life be like without Excel? For most accountants it's the most important piece of software they have: very easy to use, yet incredibly versatile.
Every so often, however, we hear horror stories about Excel disasters. The doom-mongers mutter darkly that this very strength, its ease of use, is also Excel's weakness. So many spreadsheets out there have simply been cobbled together by users as they go along, with no proper discipline behind the design or the data structure. And indeed we've all seen a forecast where one of the figures has been calculated wrongly and the error has flowed right through to the bottom line.
So just how safe is Excel? The question is particularly relevant when the next version of Excel will be able to hold 20 times more data in a worksheet than it does now. Will this just mean bigger and bigger disasters?
To answer the question, we first need to think about what Excel actually does, because over the 15 years of its life Excel has developed into a composite of three different applications.
- Excel is a Spreadsheet
After the PC arrived in the early 80s, the biggest selling application to run on it was the Lotus 1-2-3 spreadsheet. Accountants used Lotus for financial modelling and forecasting; non-accountants for more humble but very useful tasks such as printing lists.
- Excel is a Database
In the early 90s Microsoft introduced Excel. In the ensuing "feature wars" against 1-2-3, Microsoft realised that the rows-and-columns design of a spreadsheet exactly mirrors the records-and-fields design of a database. So they enhanced Excel by adding features until then found only in database packages, such as Autofilter, Sort, Sub-Total and Pivot Tables ("cross-tabs", as they are known in the trade). Excel became a personal database as well as a spreadsheet.
- Excel is a Data Analysis and Reporting Tool
Finally, with the advent of Windows and WYSIWYG ("what you see is what you get"), accountants realised that if they could get their data into Excel, they could manipulate it, then format it into a final report good enough to distribute to senior management. Using the Import Wizard they could take data from somewhere else, then analyse and reformat it in Excel to produce management reports.
A spreadsheet for financial modelling, a database for storing records, a tool for analysing and reporting on your company data – Excel is all of these.
Where things go wrong in Excel
Most Excel disasters involve large spreadsheets which have been continually added to until they get out of control. So can we conclude that big spreadsheets are bad spreadsheets?
Not necessarily. Suppose, for example, that you import a million records from your ERP/accounts package into Excel and analyse them with pivot tables. This would be perfectly safe. The input data has come from the accounts program, where it is stored under proper control, while the output reports are calculated automatically by the program, so the results are always correct. (You may interpret them incorrectly, of course, but arithmetically they will always be right.)
How many formulas will you be using?
So it’s perfectly safe to have monster-size spreadsheets when you are importing external data into Excel, then using pivot tables to analyse and report on it.
Where things start to go wrong is where there is a lot of data in the spreadsheet and the user is using Excel formulas to manipulate it. Quite often I've been to companies whose accountant has left. Behind them remains a spreadsheet containing all the company's costings, pricing, forecasts, whatever. The information is vital, and as an "Excel expert" I'm supposed to be able to come in and make sense of it.
I dread these spreadsheets. Often there are formulas all over the place. Changing one number here will change a dozen numbers somewhere else. It’s impossible to work out the logic of how it works at all, let alone whether it is working properly. At the end of the day all you can do is scrap it and start again.
A lot of formulas = bad design
At one level a formula can be a brilliant example of human ingenuity. But at another it indicates a failure in design – a short-term solution when the real need is to structure the data in a proper way for the long-term.
So, for example, at the risk of offending some readers, I would argue that any spreadsheet that uses the SUMIF function to calculate totals has been badly designed. It is more reliable to use a pivot table for these calculations.
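The distinction can be sketched in a few lines of Python; this is an illustration of the idea only, not anything Excel-specific, and the ledger data is invented. A SUMIF-style approach hand-writes one conditional total per category, while a pivot-style approach groups the data once and produces every total automatically.

```python
from collections import defaultdict

# Invented ledger rows for illustration: (category, amount)
rows = [("Rent", 500.0), ("Travel", 120.0), ("Rent", 750.0), ("Travel", 80.0)]

# SUMIF-style: one hand-written "formula" per category you remembered to add
rent_total = sum(amount for cat, amount in rows if cat == "Rent")
travel_total = sum(amount for cat, amount in rows if cat == "Travel")

# Pivot-style: group once; every category in the data gets a total
totals = defaultdict(float)
for cat, amount in rows:
    totals[cat] += amount

print(rent_total, travel_total)  # 1250.0 200.0
print(dict(totals))              # {'Rent': 1250.0, 'Travel': 200.0}
```

If a new category appears in the data, the SUMIF-style totals silently miss it because no formula exists for it, whereas the pivot-style totals pick it up automatically: that is the reliability point being made above.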
Excel and development
But a lot of the time you simply have to use formulas. If you want a job to be computerised, IT developers have this irritating habit of demanding a complete specification before they begin to write a line of code. But often at the beginning you only know vaguely that something needs to be done, without having any clear idea of what the final result is going to look like. So you just sling the data into Excel, and use formulas to work out the answers as you go along.
In the real world this "suck it and see" approach is often the only way to get any new system off the ground. But in principle, of course, the developers are right. What should happen is that, after a few weeks or months of running the job in Excel, the application matures and you now have a clear idea in your head of the logic of it all.
Having by trial and error got yourself to the stage where you have a full understanding of the job, you can now decide on the next step. Is it simple enough to leave in Excel? Or is it too big for Excel, with too much data being strung together by too many formulas?
If the latter, it’s time to go out and search for a third party package that will do the job, or go to an IT professional, give them a spec, and get them to re-write the application properly in something like Access.
It’s when this second step is not taken, and prototype systems based on Excel are simply left to get bigger and bigger, that disasters tend to occur.
This article was originally published on AccountingWEB UK, our sister site. It can be found in the ExcelZone at www.accountingweb.co.uk