Spreadsheet error research: Wasted time worse than mistakes

Sift Media

The time wasted working on bad spreadsheets is a bigger practical problem than errors within Excel models, a leading spreadsheet error expert concluded at the recent European Spreadsheet Risks Interest Group (EuSpRIG) conference in London.

Presenting the results of a detailed study of errors in 25 sample spreadsheets, Stephen Powell from the Tuck Business School at Dartmouth College in New Hampshire found that 15 workbooks contained a total of 117 errors.

However, 40% of those errors had little or no significant impact, the research found.

"Spreadsheets are full of data that's never used and errors that go nowhere," Powell said. "They are not necessarily systems with inputs that get processed into outputs."

On the other hand, seven of the errors uncovered were estimated to have cost impacts ranging from $4 million to $110 million.

"There were some enormous errors there," he said.

To conduct the study, Powell and his colleagues Barry Lawson and Kenneth Baker cajoled former students in five different organizations into letting the team examine five representative spreadsheets each. The researchers spent from three to 12 hours examining each workbook for errors and agreed their assessments with the people who supplied them. The distinction between errors and poor practice was a hazy one, Powell noted.

The study identified a spectrum of error rates, and even within organizations spreadsheet practice can range from excellent to poor, it noted. One organization, a consultancy, supplied five spreadsheets that passed the assessment with no documented errors. At the other end of the scale, four out of five spreadsheets from a charity participating in the study contained a total of 19 errors, with a maximum impact value estimated at $98 million.

The consultancy may have carefully cherry-picked good quality spreadsheets, Powell said, but it also had a culture of excellence rather than strict rules on spreadsheet usage - and lots of training.

"I have come not to believe in the average of errors," he said. "My hypothesis is that the time wasted working with awkward spreadsheets is a more significant practical problem than errors."

The Tuck team plans to explore this issue further, he added.

Powell's detailed study was accompanied by a companion presentation from University of Hawaii professor Ray Panko, the doyen of spreadsheet error research. After more than 25 years looking at the subject, he quipped: "I was not the first person to do spreadsheets research. I was the one who didn't have the imagination to get out of it.

"I would love to get past spreadsheet error, but when I look at human error research, we as a community don't understand it as we need to and are believing things that are not true. We have to understand what human error is all about."

The foundation of Panko's latest theory was that "thinking is bad". He explained: "Every time you think you're likely to make an error. When you think, there's a small chance you'll make an error. And when you create an Excel formula, that chance is not small."

While studies such as Powell, Lawson and Baker's provide ever more conclusive evidence of human fallibility, Panko's research has convinced him that error rates are highly predictable. "Error rates are all the same across various fields," he claimed, citing linguistics, writing, insurance actuarial figures and industrial and nuclear accident rates.

"The great realization was that error rates were all the same," he said. "In programming, error rates run between 2% and 5%. And he found that the people who try to detect errors were even worse at the task than the people who did the original work.

"Human beings are empirically not very good at finding errors," he said. "And automated tools are good at the things we're good at, but worse than we are at the things we're not good at."

When he sees people or products claiming to make spreadsheets error-free, Panko cringes. "There are no silver bullets for reducing error. You have to do everything well - and when you finish, you cut errors in half."

Trying to reduce errors by 90% means investing 30%-40% of development effort in testing. "If you are not doing testing, you are in real trouble," he added.

On the premise that human thinking processes are inherently error-prone, Panko suggested: "What you want to do is think less. Use a spreadsheet rather than do things by hand - spreadsheets have really reduced errors, they haven't increased them. But to get out of spreadsheets, try to get a packaged solution," he said.

In reality, users will continue to use spreadsheets and to think, he admitted. "You can't eliminate it and thinking is good, after all," Panko said.

By John Stokdyk for our sister site, accountingweb.co.uk
