Year 4 – 1991 – Up and running

By 1991 the idea of using data analytics to support internal audit was firmly in place in the organization. I was producing monthly reports describing how analytics was used by various audit teams to improve efficiency, expand the scope, arrive at better findings, and fully test controls (i.e. examining entire populations rather than samples). The analytics team (still only two people) had developed CAATTs (Computer-Assisted Audit Tools and Techniques) manuals describing the financial and inventory data to which we had access, and we were working on a manual for the HR system. These manuals included a series of standard tests that auditors could request, as well as a description of the available fields so that ad hoc requests could be performed. We were accessing approximately 25-30 information systems a year; 7-8 were accessed on a regular basis and the others were used occasionally or on a one-time basis. For the regular systems, we had arranged for standard extracts to be produced monthly, and we were beginning the process of creating multi-year summaries (e.g. a summary by General Ledger account by year for the past three years). This allowed us to start looking at trends in the data, such as the usage of overtime or professional services compared to regular salary dollars. In the future, we would be able to use this information to contribute to the annual risk-based audit plan (but I am getting ahead of myself). For now, it supported the planning phase of the audit – expanding the analytics input beyond the conduct phase.
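To give a sense of what those multi-year summaries looked like, here is a minimal modern sketch in Python of the same idea – totalling a flat extract by account and year and then comparing overtime to regular salary. The file name, field names (gl_account, year, amount) and account codes are illustrative assumptions, not the actual system's layout; the original work was done with mainframe extracts and ACL, not Python.

```python
# Sketch of a multi-year summary: total by GL account by year, then a
# simple trend comparison. All names here are assumptions for the example.
import csv
from collections import defaultdict

totals = defaultdict(float)                  # (gl_account, year) -> total
with open("gl_extract.csv", newline="") as f:
    for row in csv.DictReader(f):
        key = (row["gl_account"], int(row["year"]))
        totals[key] += float(row["amount"])

# Compare overtime against regular salary, year over year.
for year in sorted({y for _, y in totals}):
    overtime = totals.get(("OVERTIME", year), 0.0)
    salary = totals.get(("REG_SALARY", year), 0.0)
    ratio = overtime / salary if salary else float("nan")
    print(f"{year}: overtime/salary = {ratio:.2%}")
```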

The analytics team tried to meet with team leaders early in the planning phase to determine their data requirements and to encourage the use of analytics during planning, conduct and even reporting. It was still very much a “push” rather than a “pull,” so we had to understand their requirements and sell them on the use of analytics – but it was getting easier as we racked up success stories, which we published every month.

The analytics team still had other duties. I was still responsible for leading audits in IT-related areas, and the other data analyst was responsible for our LAN and the development of applications to support the internal audit function (e.g. a time reporting system). But we gained lots of valuable experience and expertise, and we were each supporting 3-4 concurrent audits.

I was assigned to support one of my first operational audits. We had a fleet of trucks that moved inventory and supplies from place to place. The trucks ranged in size from 40ft 18-wheelers to vans, and the audit was looking at the efficient allocation of trucks based on requirements (i.e. did we need an 18-wheeler for this route, or would a 20ft truck suffice?). Most routes had multiple stops – loading and unloading at each stop. The data contained both the type of truck (capacity in terms of volume and weight) and the amount loaded and unloaded (volume and weight) at each stop. Using ACL I was able to calculate the volume and weight on each truck at every stop and determine the maximum required for the route. By calculating the maximum load for each trip for the entire year, I was able to determine the size of truck required for each route. I was even able to further refine the requirements by month or even day of week – however, the process of assigning different-sized trucks to routes on a daily or even monthly basis was beyond operations' current capability, so we settled on quarterly assignments using the maximum size of truck required at any point in the quarter. Since we did not own any of the trucks, we had a fair bit of flexibility, and any reduction in the size of a truck resulted in immediate savings in both rental costs and gas.
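For readers who want the mechanics, the logic was essentially a running total per route: add what was loaded, subtract what was unloaded, track the peak, then pick the smallest truck that covers the peak. Below is a minimal Python sketch of that calculation – the original was an ACL script, and the field names, truck classes and capacities here are invented for illustration.

```python
# Route-sizing sketch: running load per route, peak per route, smallest
# truck that fits the peak. Truck capacities are made-up numbers.
from collections import defaultdict

# Truck classes ordered smallest to largest: (name, max volume, max weight).
TRUCKS = [("van", 300, 2000), ("20ft", 1000, 10000), ("40ft", 2700, 44000)]

def size_routes(stops):
    """stops: dicts with route, seq, vol_on, vol_off, wt_on, wt_off."""
    load = defaultdict(lambda: (0.0, 0.0))   # route -> running (volume, weight)
    peak = defaultdict(lambda: (0.0, 0.0))   # route -> peak (volume, weight)
    for s in sorted(stops, key=lambda s: (s["route"], s["seq"])):
        v, w = load[s["route"]]
        v += s["vol_on"] - s["vol_off"]      # net change at this stop
        w += s["wt_on"] - s["wt_off"]
        load[s["route"]] = (v, w)
        pv, pw = peak[s["route"]]
        peak[s["route"]] = (max(pv, v), max(pw, w))
    # Smallest truck whose capacity covers the route's peak load
    # (raises StopIteration if a route exceeds the largest truck).
    return {route: next(name for name, mv, mw in TRUCKS if v <= mv and w <= mw)
            for route, (v, w) in peak.items()}

stops = [
    {"route": "R1", "seq": 1, "vol_on": 250, "vol_off": 0, "wt_on": 1500, "wt_off": 0},
    {"route": "R1", "seq": 2, "vol_on": 100, "vol_off": 200, "wt_on": 600, "wt_off": 1200},
]
print(size_routes(stops))                    # {'R1': 'van'}
```

Running a year's worth of stop records through a calculation like this, and taking the maximum per quarter, gives the quarterly assignments described above.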

The analysis determined that 30% of the routes could reduce the size of truck down to the next size, and 10% of the routes could reduce by two sizes, for savings of about $2M a year. At first management was unconvinced, but we tested the truck sizes calculated by the analysis on 10 routes for a month and, aside from what everyone agreed was an anomaly, it proved to be correct and was implemented across the board. The savings for a full year were calculated at $2.4M ($7.2M over three years).

I should mention that the operations management section was working on a COBOL program to perform a similar calculation. The program was over 5,000 lines long and was not working correctly, whereas the ACL program I wrote was less than 70 lines and was proven to be correct.

Another audit I supported looked at the unit price paid for various standard items. A simple analysis identified the minimum, maximum and average price paid for each item. For the most part, there was a great deal of consistency (i.e. the ratio of the maximum to the minimum price was close to 1.0), and where it wasn’t, there were good reasons (e.g. an increase in quality, or the urgency of the purchase). However, it also identified my first case of fraud, wherein a contracting officer was paying more for standard items in exchange for kickbacks. It was all very exciting until I realized that it wasn’t simply an analytical anomaly, but an instance where someone would be fired and perhaps even go to jail. We called the police and I was asked to work directly with them. The police officer in charge noticed that I was feeling responsible (I had identified the fraudster) and asked me if she should feel bad every time she arrested someone for a crime. That helped me understand that it wasn’t my fault the person was arrested, but I still knew that if the controls had been better, the opportunity to commit the fraud would not have been there. Note: Fraud increases when opportunity, pressure and rationalization are present (Donald R. Cressey, Other People’s Money; Montclair: Patterson Smith; 1973, p. 30); in this case the weakness in the controls had provided the opportunity.
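The price test itself was straightforward – a summary by item with minimum, maximum and average, then a filter on the max/min ratio. Here is a hedged Python sketch of that logic; the file and field names (purchases.csv, item, unit_price) and the 1.5 threshold are assumptions for illustration, not the values we actually used.

```python
# Unit-price consistency test: min/max/avg per item, flag items where the
# max/min ratio is well above 1.0. Names and threshold are illustrative.
import csv
from collections import defaultdict

prices = defaultdict(list)                 # item -> list of unit prices
with open("purchases.csv", newline="") as f:
    for row in csv.DictReader(f):
        prices[row["item"]].append(float(row["unit_price"]))

for item, p in prices.items():
    lo, hi, avg = min(p), max(p), sum(p) / len(p)
    if lo > 0 and hi / lo > 1.5:           # ratio far from 1.0 -> review
        print(f"{item}: min={lo:.2f} max={hi:.2f} avg={avg:.2f} "
              f"ratio={hi / lo:.2f}")
```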

Even though I was not a fraud investigator or a forensic auditor, this would not be the last time I discovered a fraud through data analysis.

Analysis: ACL Commands: SUMMARIZE; EXPRESSION; GROUP; CLASSIFY (with Statistics for Min and Max)

Lessons-learned: It is important to use the right tool for the job. COBOL was good in its day and was still very good for many things, but it couldn’t compete with an analysis tool like ACL that comes pre-loaded with routines to make the analysis easier. The operations folks were better programmers than I was, but they lacked decent tools for the job. They were also anxious to learn, so the second lesson learned was that audit could contribute more than a report with recommendations. We described the logic behind our analysis so that the functional area could build its own monitoring program. This was the beginning of something we called “transfer of audit technology”. Moving forward 15 years, this would be similar to Continuous Auditing tests being handed over to management as part of their Continuous Monitoring.

I also realized that control weaknesses can lead good people to make bad choices, and that audit had a responsibility to protect people from the opportunity to make bad choices by ensuring that controls are working properly.

Lastly, data analytics must not only constantly produce results, but must also be promoted and sold. In later years I would find this extremely frustrating – constantly having to push the use of analytics despite all our successes. Even worse, when management changed, we had to justify the meagre expenditures on analytics all over again. Therefore, a word to the wise: track your ROI and never rest on your laurels. You are only as good as your next analysis.
