Training the Automated Auditor
Presented by Carolyn J. Newman, CPA, CISA
President of Audimation Services, Inc.
Contact Carolyn at: CarolynN@audimation.com 
Thursday, August 16, 2001
Visit the AccountingWEB Workshop Calendar for upcoming sessions.
Your firm has recognized the benefits of moving toward paperless, automated audits, and you've invested in software to achieve that objective, but the results have been less than expected. This workshop explored the best approaches to training the automated auditor. "There's a psychology involved," according to Carolyn Newman, president of Audimation Services, Inc. Ms. Newman shared tips and best practices observed in firms as they work to achieve higher levels of productivity through technology by using the right kind and frequency of training.
Key points included:
- Implementation planning issues
- Firm (as in strict) policies
- How to bridge the gap between learning the new skill/software and applying it effectively on the job
- On-site vs public training classes
- Optimal class size and length
- Alternative training methods - self-study, web-based, etc.
- After training follow-up
- Developing an in-house specialist
Session Moderator: Thank you all for joining us today, we are pleased to welcome Carolyn Newman of Audimation Services, Inc. to the AccountingWEB Workshop Series. Carolyn has a public accounting practice in Houston, Texas, established in 1990. In 1992, she founded Audimation Services, Inc., to distribute IDEA and other audit automation software to internal auditors and government. She is a former Regional Technical Director of EDP Auditing for an international accounting firm, where her duties included developing EDP Auditing training courses. She is a frequent speaker on the subject of computer assisted audit techniques.
Today's session will cover how your firm can recognize the benefits of moving towards paperless audits and automating the audit. The session will explore the best approaches to training the automated auditor. "There's a psychology involved," says Carolyn Newman, President of Audimation Services, Inc. Newman will share tips and best practices observed in firms as they work to achieve higher levels of productivity through technology by using the right kind and frequency of training.
During this session you will learn -
- Implementation planning issues
- How to bridge the gap between learning the new skill/software and applying it effectively on the job
- On-site vs. public training classes
- Optimal class size and length
- Alternative training methods - self-study, web-based, etc.
- After training follow-up
- Developing an in-house specialist
Session Moderator: Whew! Welcome Carolyn, the floor is yours!
Carolyn Newman: Thanks! Hello everyone!
Carolyn Newman: In the summer of 1998, the Journal of Accountancy ran an article entitled, "Take my Manual Audit . . . Please!" Three years later, we're just getting started with true audit automation. Technology is recognized as the means to become more efficient and effective in the marketplace. But I think the efficiencies are often offset by poor or incomplete usage of the software products firms acquire.
Does everyone know the statistic about the percent of features in Word and Excel that are actually used? What's true for the common software products is also true for audit automation tools.
Ted Newhouse: Probably less than 20%
Carolyn Newman: Ted, that's close -- 10%
Carolyn Newman: Software that automates the auditor includes everything from email systems to paperless work paper products, data analysis software and now the latest offerings, client relationship management tools. A recent article in Accounting Today, by Gary Boomer, notes that the best way to implement technology and to keep productivity high is to properly plan the acquisitions, get appropriate training and utilize a training coordinator.
A training coordinator should be able to develop a curriculum and work with the firm network administrator or IT director in all product rollouts and updates. We've worked with a number of firms to help them implement IDEA, the data analysis software. You wouldn't believe how much faster and more thoroughly some things can be done in IDEA that are still being done in a spreadsheet. I know I'm dating myself, but it reminds me of how it was when spreadsheets first became available. The timesavings were sometimes obvious, but it was always easier to pull out the four- or 10-column pad.
Kerry Thomas: Why do you suppose there's such a lag in interest in IDEA?
Carolyn Newman: You can train everyone in how to use a spreadsheet, but if they don't understand the benefits, and if they are not clear on the firm's commitment to its use, the learning will not occur.
Kerry, I think the answer has to do with a lack of willingness to change the process. In addition, more people need to know it's out there and really quite user friendly. Auditors will go back to the way they know, even though they might have had extensive training in how to use the new software.
Carolyn Newman: Because IDEA is a tool used in the field on certain parts of the audit, one training challenge arises when it does not initially go out to every staff member. Another challenge relates to the changes in how companies can be audited when you have the power of data analysis tools. This is complicated by the fact that a high comfort level exists with Excel and other spreadsheets.
Carolyn Newman: Do any of these matters sound familiar to you? Questions or comments so far?
Megan Majewski: What is IDEA?
Andy Armando: A couple people in our firm are familiar with IDEA, but you're right, not many.
Carolyn Newman: IDEA stands for Interactive Data Extraction and Analysis. It's used to import data from any system to foot, recalculate, sample and analyze the information electronically, without having to learn programming.
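To make the foot/recalculate/sample workflow concrete, here is a minimal Python sketch of the same kinds of checks (the invoice fields and figures are hypothetical; IDEA itself performs these through menus and an equation editor, not code):

```python
import random

# Hypothetical invoice records, standing in for a client file imported into IDEA.
invoices = [
    {"inv": "1001", "qty": 10, "price": 2.50, "extended": 25.00},
    {"inv": "1002", "qty": 3,  "price": 9.99, "extended": 29.97},
    {"inv": "1003", "qty": 5,  "price": 4.00, "extended": 21.00},  # extension error
]

# "Foot" the file: total the extended amounts.
footed_total = round(sum(r["extended"] for r in invoices), 2)

# Recalculate each extension (qty x price) and flag exceptions.
exceptions = [r["inv"] for r in invoices
              if round(r["qty"] * r["price"], 2) != round(r["extended"], 2)]

# Pull a random sample of records for testing.
random.seed(0)  # fixed seed so the sample is reproducible
sample = random.sample(invoices, k=2)

print(footed_total)   # 75.97
print(exceptions)     # ['1003']
```

The point is not the code but the pattern: every record gets footed, recalculated, and eligible for sampling, electronically, rather than ticking a handful of items by hand.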
Carolyn Newman: Andy, does your firm have IDEA?
hernst1: People don't want to leave their comfort level to explore because of time constraints and chargeability.
Andy Armando: No, we don’t.
Carolyn Newman: hernst1 - I agree. You have to have a plan. We'll be covering that soon. This workshop explores the best approaches to training the automated auditor. We'll cover ways to eliminate the disconnect that occurs between the training and successful use of software in the field. A common example is when you find out the staff had a problem importing a trial balance into the work paper software. The "work-around" was to key in the balances.
Did it get the job done? You bet. Did the staff person effectively learn how to deal with data access and import issues? Not that time. What's needed is a set of policies and procedures that ensures the effective use of tools in the field. One firm takes away their crutches, while another firm documents the different file types that will be encountered, and makes available a help desk service to get the users through any rough spots.
We'll be using a little psychology and discussing some of our own suggestions, as well as divulging some best practices we've observed around the country at firms that are getting the most out of technology.
Psychology? Of course! After all, every human being really hates change. If you get a chance to read "Who Moved My Cheese," do! You'll see what I mean. A study discussed in "High Performance Training" by psychologist Bob Davies notes there is a missing link between training provided and actual work performance. Davies states that "the common denominator among high achievers is that they have pre-set outcomes set in advance with some method of support and reinforcement." THEREFORE, successful implementation of audit automation tools should include a process that not only provides the training needed, but helps to create an environment that will turn all users into stars! Here's what we recommend:
Planning the implementation. Some firms start with only a few copies of work paper software and even fewer copies of data extraction software. I call this the "toe in the water" approach to automated auditing.
Firms call it safe: if for some reason the pilot doesn't yield the results they expect, they've minimized their expenditure. But let's look at the pure psychology of that approach. Yes, they've avoided the big "C" (commitment).
Yes, they've spent less than the cost for a full roll-out. And yes, unfortunately, they've sent a message to the personnel that we can always go back if this doesn't work out. People hate change. They will find any excuse to go back to what's known and comfortable. Part of the implementation of new software and technologies MUST include communication of the buy-in to this process by the management group.
Have any participants experienced an attempted software solution, then ended up going back to the old way?
hernst1: And modeling is sometimes more important than communicating.
Carolyn Newman: I agree with you, hernst1 -- when we get to the coaching part, I'd like more of your ideas. There should be policies and procedures outlining the standards and expectations for proper use of the software. The firm I mentioned earlier that "takes away the crutches" actually removed ATB from computers after CaseWare paperless work papers was adopted. It required all personnel to have the same training in all the PC-based software they owned, AND put policies and procedures in place for the housekeeping and other issues related to client files, etc.
Another firm publishes a chart in their firm audit manual that indicates when it's best to use Excel, when to use IDEA and when to use Access, and another firm requires all client prepared reports that are more than three pages be provided electronically. These are policy and procedure examples that get the message across: this technology is here to stay. It will be used.
Remember, too, that what gets measured gets done. Include "effectiveness in using technology" on employee evaluation forms as a key performance indicator. Require summarization in the planning section of each set of work papers that documents the tools to be used, the extent of automation and expected timesavings for the audit. The policies adopted in connection with implementing new software should include pre-set outcomes and expectations. Part of this process could provide the partners and senior managers with overview training and a brainstorming session so they can agree on the measurable things that will be established within firm policies.
Another approach might be to form a focus group comprised of users who already have a strong understanding of what they can accomplish with the data analysis tool. The focus group would develop the processing standards to be followed and determine how best to incorporate these changes into the organization's process. This is the most critical step in the implementation process. Conformity cannot be achieved without standards to be followed. This is also the most difficult stage of the process because of the time and commitment required from individuals with the knowledge and ability to make these decisions. We have assisted some of our IDEA clients with this process.
As firms get closer and closer to a paperless audit approach, setting the standards and guidelines for what gets imported versus what gets scanned, naming conventions for files, etc. will be critical.
Carolyn Newman: Have any of you gone paperless yet?
Megan Majewski: Yes
Carolyn Newman: Will you share your experience?
Megan Majewski: We are using CaseWare..since January...all audit and tax workpapers are paperless.
Carolyn Newman: Has your paperless process included data analysis?
Megan Majewski: Our experiences have been positive...we just started using CaseWare in January. No, only our workpapers are paperless. We have not conformed to a data analysis system.
Carolyn Newman: Probably the most effective program we've seen a firm implement is this: (they use CaseWare, too) ALL employees are required to attend CUSTOM training classes for ALL software products used in the organization. This includes products such as Word and Excel, even if the employee believes they already know the product.
Megan Majewski: We did that too.
Michael Awad: That would be ideal.
Carolyn Newman: The purpose is to make sure that all employees are at the same knowledge level and that standardized company procedures are incorporated into the class.
hernst1: what if the employee really does know the product?
Michael Awad: That's the ideal scenario, but in cases where directors are reluctant to commit all that staff time, we've found that it's a good idea to at least get the project team to attend the custom training and then TAKE OWNERSHIP of the class that teaches the rest of the users.
Megan Majewski: We thought ours did, until everyone was required to take a standard test to determine their ability. Most were well below where they thought they were.
Carolyn Newman: hernst1, there are many different ways to use programs. The procedures and standards are the main thing here. I like that take ownership suggestion. It should be rewarded well, don't you think?
Michael Awad: Not sure what you mean by rewarded.
hernst1: Megan, was the test question or task based?
Megan Majewski: test question.
Carolyn Newman: Megan, your comment is right on. Personnel firms use computer-based tests for this purpose. I have an example or two in this presentation. Let's continue. The more sophisticated the software, the more important training becomes. You can get product training from most vendors, and there are a few consultants who are qualified to conduct this training.
The best instructors are those who have experience in hands-on training plus experience in real world applications of the product. It also helps if they have a sense of humor. On-site training is almost always better than public training, because the group can be open about the particular approaches, challenges and clients that will be encountered within the firm. Public training has its place because it allows people who missed the in-house training to get the same level of knowledge. It also lets participants meet and compare notes with auditors and analysts from other firms.
This really happened at a semipublic training class we conducted a while back: A group of auditors showed up for training from a small CPA firm. They did not know Windows. They had never used a mouse. They did not even know how to copy and paste the training files to their hard drive. It turned out they had been given the computers two weeks before the IDEA training and were told they needed to start using them. No one had informed them about our stated prerequisite that a working knowledge of Windows and Excel or other spreadsheet software was necessary.
The partner and senior manager training should be conducted first, followed by a workshop that will take a typical engagement and incorporate the use of the tools during each appropriate phase of the audit. Once management understands how data analysis can be used to reduce time while improving coverage on an audit, decisions must be made about revisions that should be made to the audit programs. PPC's Guide to Risk-Based Audits suggests that data extraction and analysis can result in more efficient audits. But the references to use of software to perform tests and analysis are "by the way" references found in the notes of the program guides.
What's needed is the actual incorporation of these tests into the audit program. PPC's Guide to Data Extraction Software has examples of such programs. If you don't know about IDEA, that's a good place to start. After this is done, hands-on training for personnel who will be using the software should be provided. First, they need to know they're expected to use it on every audit with a budget of X (whatever the firm management decides), where the client has computerized accounting.
This part of the training should be conducted by someone within the firm who can communicate this and other policies that were developed after the initial training of the management group. The enthusiasm and clear evidence of commitment to the new technology and all changes in procedures must come through loud and clear. Next, they need to know how to get the files or that someone will be available to help them get data electronically. For trial balance and paperless work paper software, the formats the software will accept and the ways you can convert what you get into those formats should be addressed. For data extraction software, a decision on when to get raw data files vs. print reports should be thoroughly discussed and understood. If it is determined that personnel also need training on the basics of data processing or they need other IT-audit knowledge, it should also be provided. Because this will be hands-on training, the student-to-instructor ratio should be no more than ten to one. We do annual "super-user" training for a large firm.
This firm provides assistant instructors from their base of experienced users to ensure that enough eyes are watching the students and everyone masters class exercises and learning objectives. The case studies are real world based, and the firm also posts a follow up case study for intermediate learning (in a self-study mode) on its intranet. Training on how to use software must cover the basic functionality of the program. But mastering the basics doesn't make you an effective user in the field. That takes practice. It takes having an understanding of what to do, not just how to do it. It takes a level of confidence that, if something doesn't work "like in the class," there will be enough help available to figure out the difference and keep on meeting the objective.
There absolutely MUST be a system of support and reinforcement. During the initial training class, it might be a good idea to have the participants identify a "buddy" who they can call for reinforcement. This person would be more of an encourager than a trainer. You should avoid what Gary Boomer calls the "random walk" method of follow up training. That's where employee A walks over to B's desk and trains him or her in a specific solution for a few minutes. Then B can train C and so on. The risk here is that good and bad habits, rather than firm standards, are taught.
I'll never forget the time I received by email a staff person's time summary (done in Excel). The totals and cross-totals had been entered as amounts!! Doesn't the basic level of training cover using a formula in Excel? And, yes, their resume said they had knowledge of Excel. The A's and B's who like to help out should be identified and encouraged to "mentor" other users. Mentoring or coaching is an effective way to ensure that the gains in productivity, the primary benefit of technology, will continue. As a formal program, it can really benefit the efficient use of software. Later we'll review some things to consider when developing in-house specialists.
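The hazard in that time-summary story is that a hard-keyed total hides errors silently, while a computed total exposes them. A few lines of Python (with hypothetical hours) show the recomputation check a reviewer, or a formula, would have performed:

```python
# Hypothetical time summary: detail hours plus a total that was keyed by hand.
detail_hours = [8.0, 7.5, 8.0, 6.5, 8.0]
keyed_total = 40.0  # typed in as a number, not derived from the detail

# Recompute the total the way a spreadsheet formula would.
recomputed = sum(detail_hours)  # 38.0

# Flag any out-of-balance condition instead of trusting the keyed figure.
if abs(recomputed - keyed_total) > 0.005:
    print(f"Total out of balance: keyed {keyed_total}, recomputed {recomputed}")
```

A keyed amount and its detail can drift apart with no visible symptom; a recomputation check surfaces the difference immediately, which is exactly what a formula-based total gives you for free.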
Carolyn Newman: Does anyone have in-house specialists used this way?
Megan Majewski: Yes, we have a group of Power Users...we meet monthly to go over issues and address them.
Michael Awad: Yes!
Carolyn Newman: Megan, that's great. They aren't referred to as PU's are they?
Megan Majewski: Actually....PUGs...Power User Group....Not much better though.
Carolyn Newman: I hear you. On to other methods of training. Instructor-led training is more expensive, but also more effective. Some firms have "train-the-trainer" programs, and designate an individual to do training and light technical support.
Remember that not all instructors are created equal. The individual must be able to communicate effectively, and be able to illustrate the points with real world examples, plus follow the body language and clues from participants who might not be getting it. After hands-on training, a good instructor will be able to tell you who is most likely to take the bull by the horns on this technology and who will struggle or give up in the field. You should ask for that information.
Computer-based and web-based training are available. I think we'll see more and more of that. Has anyone participating today taken computer-based training for use of software? Come on. Y'all are web prowlers! Surely, you've tried computer-based training!
hernst: Yes we have. It seems that it is difficult to keep up with it because it is so easy to quit.
Carolyn Newman: hernst. I know what you mean.
Carolyn Newman: Jill Davies in my office learned Visual Basic that way. It gave her enough knowledge about programming and VB in particular to be able to write some powerful scripts to further automate the use of IDEA. This is a funny story. Two years ago, we decided our tech support/help desk service should be available during busy season. Jill was one of the ones who came in every Saturday in January, February and March to help people with their IDEA questions and any problem files they might have.
In January, Jill had the perfect environment for computer-based learning - no interruptions. However, this is also a sad story, because I know there were people who should have called in January or before when the time pressure was not so serious. Calling for help is so important when you're using software for the first several times.
Carolyn Newman: When we conduct the basic IDEA training, we always emphasize the importance of calling for help. The only dumb question, you know, is the one that isn't asked! On-line seminars are becoming popular. ePACE has been marketing alternative training that way. I tried an on-line seminar for the Intuit Advantage program. Here's the risk I see for that type of training. The participant must control his or her environment during the training so there will be no distractions.
There is no effective way to get feedback from the instructor, and the instructor tends to be feature-knowledgeable, but not practical use knowledgeable. Even so, I think we'll see a major increase in this type of training. If bandwidth continues to improve, it could even become more interactive and participatory. Developing an in-house specialist. The larger the firm, the more in-house specialists will be needed.
Some of the larger firms actually do a disservice to staff-level auditors by making the use of CAATs (computer assisted audit techniques) a specialist-only activity. Data extraction and analysis is fun. Yes, there should be specialists, but those individuals are not the ones with in-depth knowledge about that engagement and the client's control and reporting systems. They didn't do the risk assessment or design the tests and determine audit scopes. In-house specialists should be technology-fluent. They should be those who are not intimidated by client claims that it's not possible to get the data. They should be able to get the data, define it and pass it back to the audit team.
An in-house specialist can help bring together common file and data types encountered into a sort of knowledge base that can be made available to all users.
hernst: Carolyn, you mentioned CBT tests.
hernst: thank you.
Carolyn Newman: hernst -- yes. I don't know the name, but I have a client in the temp business that I can get the information from. In-house specialists can be proactive with the users in the field to ensure that they follow through on the plan. Hopefully, they will catch users in time to avoid nonproductive actions and any "wheel-spinning." In effect, these people can be CAAT and audit efficiency coaches.
If you want to increase productivity, training is the absolute answer. But please make sure the training is more than just features training.
Michael Awad: I agree that computer-based training carries with it the risk that the trainee won't properly control for distractions. We, as a software company, encourage people to go through the CBT together, so that they can keep each other accountable not to let distractions interfere.
Carolyn Newman: Michael, you mean like a guided tour? That's how the Intuit seminar I took was. It was not good for me because of too little interaction from the leader.
Michael Awad: I do mean a guided tour, yes. When we do web-based training, we try to ask questions to spur discussion among the participants so that they can decide what standards they want to implement for the rest of the end-users. The key is to have a skilled moderator.
Carolyn Newman: Bridge that gap between the knowledge of the product and the efficient use of it in the field by continuing to train, include pre-set outcomes and expectations, and make your environment one that offers support and reinforcement to each individual who uses the new technology. I think this will really grow as the bandwidth improves. And YES - moderating successfully is an acquired skill.
Session Moderator: Are there any questions for Carolyn before we finish up?
Kerry Thomas: No thanks - it's been great.
Session Moderator: I would like to thank you all for joining us today. I would also like to thank Carolyn Newman for this informative session. Great job Carolyn!
Ted Newhouse: This has been very helpful - thank you
Michael Awad: Thanks a lot, Carolyn. Very helpful for us.
Carolyn Newman has a public accounting practice in Houston, Texas, established in 1990. In 1992, she founded Audimation Services, Inc., to distribute IDEA and other audit automation software to internal auditors and government. She is a former Regional Technical Director of EDP Auditing for an international accounting firm, where her duties included developing EDP Auditing training courses. She is a frequent speaker on the subject of computer assisted audit techniques.
Ms. Newman holds a Bachelor of Arts degree from Texas Lutheran College, and a Master of Accountancy from the University of Houston.
In June 2001 the Texas Society of CPAs honored her as a CPA Pathfinder. She is a member of the AICPA, Texas Society of CPAs, Information Systems Audit & Control Association (ISACA), The Institute of Internal Auditors, National Association of Corporate Directors, and The Association of Government Accountants. She chairs the Government accounting and auditing committee of the Houston Chapter of TSCPA and is treasurer of St. Cuthbert Episcopal Church.