Year 3 – 1990 – Data analytics established

Welcome to the 1990’s, the age of personal computers, distributed processing and the promise of great things from technology – sound familiar?

About six months ago, my manager questioned me about the fact that I had provided reports to, and performed analysis for, other audit teams. He said, “You were hired as an IT auditor, not to support other audit teams.” At the same time, he recognized that the audits I helped were producing excellent results and that the data analysis and reports I provided were a contributing factor. He told me he wanted the weekend to think about what to do with me. On Monday, when I was summoned to his office, I wasn’t sure whether I would still have a job. I was fortunate: for a trial period, I was given responsibility for supporting analytics. We had another discussion a year later when he wanted to know why the audit teams were not self-sufficient after being supported by me for a year. I was teaching them a great deal about the data and ACL, but they still needed help with more complex analysis. I explained that as long as I kept learning – and doing analysis 12 hours a day – the auditors would never catch up to me and would always need help as we did more and more complex analysis. The trial period ended and a permanent data analysis function was created. A year later, the team of analytics experts had grown from one (me) to two, and we were supporting 100 auditors and 30 evaluators (mainly the auditors), who were performing about 20 audits and seven evaluations each year.

Our audit shop had purchased Wang microcomputers connected via token ring. As part of my analysis duties, I was responsible for a project that replaced this with an Ethernet network of IBM-compatible PCs. Fortunately, I had a colleague who was great with technology, and the money to hire a consultant to help us. The updated network had 10 Mbps to the desktop, and the PCs ran a DOS operating system that expanded the range of software we could use. Our first purchase was a three-user license of ACL/MVS (the mainframe version) and 10 standalone licenses for the DOS version of ACL. Using the MVS version, I had direct access to our inventory system (an IMS database with over 1M items at the top-level node, 32 layers deep); to an extract file containing all transactions from the legacy finance system; and direct access to the HR application’s data files.

I had several options for obtaining mainframe data. I could run standard reports and download them to the PC; use IBM utilities (such as IEBGENER); or use the MVS version of ACL to extract data from the application systems’ data files or to access already-extracted data files. The resulting files were downloaded to the PC, where I ran my analysis with the DOS version of ACL (version 2.1, if I remember correctly). Using an IRMA card (3270 emulation) and Kermit (file transfer software), I could download 1Mb per hour – if I was lucky. Usually, with what were then considered to be large files (5Mb+), it would take a couple of attempts (over several days) because of CRC (cyclic redundancy check) errors resulting from problems with the telecommunications connection and software. This may sound terribly slow given that the same download could be accomplished in seconds today, but I was in analytics heaven. I finally had access to data and the tools to perform any analysis I wanted.

We were supporting about 80% of the audits – primarily during the conduct phase. However, we were already discussing among the CAATTs team (of two) how we could be doing more to identify control weaknesses during the planning phase. Audit, in general, was also starting to consider risk when determining which audits to perform and what to audit – not just control weaknesses. The following are some examples of analyses performed in those early days.

For a travel audit, I analyzed travel agency charges. The company had changed travel agents about 16 months earlier, and the business case projected savings of $300K/year over the previous travel booking system. However, when the travel agency invoice arrived for fiscal 1989-90, it was $200K more than expected. Internal audit was asked to take a look and I was assigned to help. I convinced the financial auditor to get the detailed transactions supporting the agency’s invoice. The agency refused to provide the backup until we highlighted a “right to audit” clause in the contract. They then tried to provide the details in printed form, but again we were able to refer to the contract, which stated that we were entitled to review the system and data files.

It took three weeks of negotiation and fighting, but we finally got the data. A quick total of the detailed transactions agreed with the invoice. But with ACL I was able to perform additional analysis. According to the contract, the agency would bill us $2.35 per travel booking and $1.75 for any changes to existing bookings. The data included the booking date, the booking change date, and the transaction charge, as well as the travel date, origin, and destination information. I performed various analyses to check the validity of the booking/change dates and ensure they were all in the previous fiscal year; to look for possible duplicates; and to identify inappropriate/inaccurate charges.

Although obtaining the data took three weeks, the analysis itself took only two hours, and the total savings came to $160K. Sorting the file on the booking date revealed 30,000 transactions which were either from the previous fiscal year (1988-89) or the current fiscal year (1990-91) and should not have been included on the invoice. The total overcharge was $63K. The duplicates command found a complete month’s worth of transactions that had been recorded twice – the overcharge was $85K. Isolating the booking change records, I noticed that 20,000 of them had a $2.35 charge instead of $1.75. The overcharge was $12K.
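The three invoice tests above can be sketched in modern terms. This is an illustrative Python sketch, not the original ACL scripts; the field names, fiscal-year dates, and the tiny sample file are hypothetical, while the $2.35/$1.75 contractual rates come from the article.

```python
from datetime import date

# Hypothetical transaction file; one clean record plus one example
# of each billing problem described in the article.
transactions = [
    {"ref": 1, "type": "booking", "date": date(1989, 6, 1),  "charge": 2.35},  # clean
    {"ref": 2, "type": "change",  "date": date(1989, 8, 15), "charge": 2.35},  # wrong rate
    {"ref": 3, "type": "booking", "date": date(1989, 9, 3),  "charge": 2.35},
    {"ref": 3, "type": "booking", "date": date(1989, 9, 3),  "charge": 2.35},  # duplicate
    {"ref": 4, "type": "booking", "date": date(1991, 2, 1),  "charge": 2.35},  # outside FY
]

FY_START, FY_END = date(1989, 4, 1), date(1990, 3, 31)  # fiscal 1989-90 (assumed dates)
RATES = {"booking": 2.35, "change": 1.75}               # contractual rates

# Test 1: transactions dated outside the fiscal year being invoiced
out_of_period = [t for t in transactions
                 if not (FY_START <= t["date"] <= FY_END)]

# Test 2: identical records billed more than once
seen, duplicates = set(), []
for t in transactions:
    key = (t["ref"], t["type"], t["date"], t["charge"])
    if key in seen:
        duplicates.append(t)
    seen.add(key)

# Test 3: records billed at other than the contractual rate
wrong_rate = [t for t in transactions if t["charge"] != RATES[t["type"]]]

# Total overcharge across the three tests
overcharge = (sum(t["charge"] for t in out_of_period)
              + sum(t["charge"] for t in duplicates)
              + sum(t["charge"] - RATES[t["type"]] for t in wrong_rate))
```

Each test is a simple filter over the full transaction population – the same idea, at 100% coverage, that made the two-hour analysis possible.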

For an audit of employee moves, we examined the system used by the moving company to bill us for approximately 2,000 moves each year. We used parallel simulation to re-perform the processing of the moving company’s system using the input data. There was a large discrepancy in the total amount of insurance charges. The insurance charge was based on the total weight of the household goods being moved (roughly $0.16 per pound). Comparing the results, we determined that the company’s calculation of the weight of household goods erroneously included the weight of the car. We determined the amount of overcharge to be almost $320K for the current year. The company agreed to refund $950K for three years’ worth of billing.
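The essence of parallel simulation is to independently recompute the system’s output from its inputs and compare. A minimal Python sketch, assuming hypothetical records and the roughly $0.16/lb rate mentioned above:

```python
RATE = 0.16  # insurance rate, dollars per pound of household goods

# Hypothetical move records; billed_insurance is what the mover's system
# charged, which (erroneously) included the weight of the car.
moves = [
    {"household_lbs": 8000, "car_lbs": 3000, "billed_insurance": 1760.00},
    {"household_lbs": 6500, "car_lbs": 0,    "billed_insurance": 1040.00},
]

# Re-perform the calculation on household goods only and total the differences
overcharge = 0.0
for m in moves:
    expected = m["household_lbs"] * RATE   # what should have been billed
    overcharge += m["billed_insurance"] - expected
```

Run over the full population of moves, a comparison like this surfaces both the size of the discrepancy and the pattern behind it – here, every overcharge equals the car’s weight times the rate.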


Lessons learned: Even limited capabilities (a team of two) and poor data transfer speeds (1Mb/hr) can produce significant savings. Also, don’t trust the information contained in information systems (e.g., billing for transactions not in the fiscal year, duplicates, and inaccurate processing). I have often encountered standard reports that were either inaccurate (incorrect criteria applied) or did not provide the information they claimed to provide – sometimes both. I have also found errors in user-developed systems and spreadsheets. IT controls over input, processing and output need to be checked and re-checked.

During the first year of internal audit having an analytics team, we were able to obtain access to the data – first to the financial system, then inventory and finally HR – by being prepared. Over the years I have heard the following arguments almost every time I tried to get access to a new system or data:

  •  You do not have authority to access the data;
  •  You do not have the ability to understand or analyze the data;
  •  You can’t provide adequate security to protect the data from unauthorized use;
  •  The data contains personal information – so you can’t have it.

Unless you are prepared to respond to these (and other) arguments, and to discuss alternatives to direct access to applications – such as the use of extracts or backup data – you will not be successful in getting the data you need. Ensure that you have an audit charter that explicitly gives you access to any and all information, including electronic information.

Lastly, nothing succeeds like success. Because of early successes, my management supported the use of analytics and many of the auditors were on board. Therefore, plan your first attempts carefully and choose audits where you have the highest chance of success. And don’t be afraid to toot your own horn – you have to let management (and the auditors) know what you are doing, when you succeed, and even when you fail. Non-technical users tend to think things are hard (or impossible) when they are not, and/or to think things are easy when they are not. You need to help them correct these misconceptions.

This article has 5 Comments

  1. Thanks, Dave, for the next interesting part of an amazing story about your working life and analytics. I have finished your books and started reading your blog. I’m sure you could publish your history as a book and it would be a bestseller.
    I’m waiting for the next post.

  2. Thank you, David, for sharing your CAATTs “learning curve” with us. Some of us experienced the same arguments against accessing all of the data. 🙂 The results of CAATs using analytic tools on 100% of transactional data are not always pleasant for auditees, so I always tried to understand their “rationalizations” against accessing all the data, and that always made me wonder if their “rationalizations” were actually part of the well-known “triangle”…

    1. Kreso – your comment is insightful and highlights the importance of understanding operational realities as well as other possible rationalizations for not providing the data. Some of these may be valid and need to be addressed with auditor flexibility and ingenuity. As for the fraud triangle, your comment reminded me of a couple of times where the refusal to provide the data was exactly that – fear that the fraud would be (and was) revealed by analysis.

  3. Your list of excuses took me back to those we heard at the hospital. As a variation on “you can’t have personal data”, we also heard “These data have patient information protected by HIPAA and you can’t have it”. Another excuse we heard was “this system is old. If we try to get these data, it will crash and we will lose everything, so you can’t have it”. Apart from being an exaggeration, this latter excuse tends to overlook the existence of system backups. In one case, rather than argue, we requested the most recent backup of the data instead – a harder request to fight, although they did try anyway. It was my experience that refusals to supply data stemmed from one or more of three things:
    * misunderstanding of audit’s purpose,
    * management resentment of audit oversight, and/or
    * fear of reprisals if something was found.
    Ours was a very political environment which made this kind of resistance commonplace and very difficult to overcome as rational responses to data request roadblocks were belittled and/or ignored.

    1. Steve – thanks for your comments and additions to an age-old problem: justifying audit’s need and authority to have the data.
