Year 7 – 1994 – Transfer of Audit Analysis to Mgt

Having been a member of the IIA since 1990, I always looked forward to the Internal Auditor magazine.  However, it rarely included articles on computer-assisted audit tools and techniques (CAATTs).  I wrote the first of several articles on data analytics, “Computer-Assisted Audit Tools and Techniques: The Power of CAATT is turning up the ‘can-do’ potential of some audit shops”.  It was published in February 1993 and, to my knowledge, was the first time the acronym “CAATTs” had been used anywhere.  Previously there was only one “T”, as in “CAATs”, but it was never clear to me whether the “T” stood for tools or techniques – having two “Ts” solved that issue.  I also proposed the establishment of a regular column called “Computers and Auditing”, and James Kaplan and I were the first co-editors.  The column, replacing “PC Exchange”, debuted in February 1994 with my first column, “Auditmation”, which included four different audits where analysis had been used.  James and I were co-editors for several years, and I wrote many columns on data analytics and audit.  These became the basis for a book I wrote several years later.

At my regular job, I worked on a repair and overhaul audit.  Our company had contracted out the maintenance of specialized equipment to several vendors.  As per the contract terms, the vendors were required to maintain an inventory of critical parts, for which we paid the storage costs and the purchase price when used.  The vendors were also loaned specialized test equipment to be used for repairs and testing of our equipment.  The audit objectives included verifying that the vendors were complying with the terms and conditions of the contracts.

First, we obtained the detailed financial transactions related to repair and overhaul expenditures.  These were summarized by vendor to determine which vendors had provided repair and overhaul services and for what amount.  We had contracts with a total of eight vendors; however, the financial summary revealed that we had obtained repair and overhaul services from only six of those companies.  In fact, looking at previous years’ data revealed that the other two vendors had done no work for us for three years.  During this time frame, they still held millions of dollars’ worth of our test equipment and had been charging us for the storage of inventory (spare parts).
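The vendor-activity test described above can be sketched in a few lines – shown here in Python for illustration, not the audit software we actually used at the time, and with invented vendor labels and amounts:

```python
# Sketch of the vendor-activity check: summarize repair-and-overhaul
# spending by vendor, then flag contracted vendors with no activity.
# All vendor labels and amounts below are hypothetical.
from collections import defaultdict

# Vendors under contract (hypothetical: eight in total).
contracted_vendors = {"A", "B", "C", "D", "E", "F", "G", "H"}

# Detailed repair-and-overhaul transactions: (vendor, amount).
transactions = [
    ("A", 12500.00), ("B", 8300.50), ("A", 4100.00),
    ("C", 950.25), ("D", 15000.00), ("E", 720.00), ("F", 3300.00),
]

# Summarize expenditures by vendor.
totals = defaultdict(float)
for vendor, amount in transactions:
    totals[vendor] += amount

# Contracted vendors with no repair-and-overhaul activity.
inactive = sorted(contracted_vendors - set(totals))
print("Spending by vendor:", dict(sorted(totals.items())))
print("Contracted but inactive:", inactive)  # ['G', 'H']
```

The same summarize-and-compare pattern extends naturally to multiple years of data, which is how the three-year gap in activity surfaced.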

The auditors went to the two firms in question and determined that the test equipment was being used for repairs to other companies’ equipment and that the quantity of spare parts in inventory was less than half of what we were being charged for storage.  A review of the other six companies found similar issues – our test equipment and parts were being used for other purposes, and there was insufficient inventory to support the storage charges.  The audit resulted in a refund of storage charges, the return of test equipment, and payment for the use of our equipment – a total of $2.5M.
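The storage-charge test amounts to a simple billed-versus-counted comparison for each vendor.  A minimal sketch, with all vendor labels and quantities invented for illustration:

```python
# Hypothetical sketch of the storage-charge test: compare the spare-parts
# quantity billed for storage against the quantity physically counted
# during the site visits.
billed_units = {"G": 1000, "H": 800}    # units of spare parts we paid to store
counted_units = {"G": 420, "H": 350}    # units actually found on site

overbilled = {}
for vendor, billed in billed_units.items():
    counted = counted_units.get(vendor, 0)
    if counted < billed:
        overbilled[vendor] = billed - counted
        print(f"Vendor {vendor}: billed for {billed} units, "
              f"counted {counted} ({counted / billed:.0%} of billed)")
```

The shortfall per vendor then supports quantifying the storage-charge refund.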

The analysis also pointed to an emerging risk – the number of firms that could perform repair and overhaul on our equipment was dwindling.  Three years earlier there were 10-12 firms; currently there were only six.  Further, two of the firms were in the process of mergers, which would bring the total down to four.  We decided that we should alert management to the reduced number of vendors capable of supporting our requirements, and we added this to the list of external risks we were tracking.


Lessons-learned:   Having performed or supported many audits, I knew first-hand that the data was available and the analysis was useful.  But this was not sufficient.  Often we would make value-added recommendations which were not implemented – despite management’s “agreement” with the findings and even the submission of a management action plan which was included in the audit report.  Numerous follow-up audits found that recommendations were not always implemented.  We tried hard to ensure that management agreed with, and even had input into, the recommendations, and that the recommendations took into account the operational environment and constraints in terms of people, time and money.  We needed to provide management with the ability to perform better monitoring and to ensure our recommendations considered operational constraints.

I was also keenly aware that management did not always have the information they needed to make informed decisions.  We would perform analysis and present the results to management, and their reply – after asking where we got the information (it was from their own data/system) – was often “if I had known this, I would be able to manage better”.  This led me to the realization that the audit report (and recommendations) was only part of the value-add that audit could provide.  We coined the term “transfer of audit technology” to describe the transfer of our audit analysis to management for their ongoing use.  Sometimes this entailed IT developing a standard report which duplicated our analysis, or management running ACL scripts we had developed as part of their monitoring process.  Little did I know at the time that our efforts to help management improve their ability to perform analysis would be the beginning of what was to be called Continuous Auditing/Continuous Monitoring many years later.

Providing management with tools and technology helped to improve the testing of controls and the efficiency of operations.  It increased the likelihood that our recommendations would be acted upon.  It was also recognized as a value-add contribution from internal audit.  The issue now was that we were in demand.  Other areas wanted us to assist them with various types of analytics.  Finance was even calling us to help them answer questions they were getting from senior management and external regulatory bodies.  We had to be careful to maintain our independence and not become part of the control or management monitoring process.

Lastly, I was beginning to see the value of data analysis in the area of risk assessment – not just existing risks, but also emerging risks.

This article has 2 Comments

  1. Dave, thanks for your great blog. You touched on something in this post that I struggle with – the transfer of audit analysis to management. What does it look like when management runs ACL scripts that we in IA develop? Does management purchase their own licenses and train someone to run the scripts on their own machines? Do you grant management users access to specific projects in AX which have the scripts (assuming you have AX)? How do you manage the relationship when changes have to be made to the scripts? I would assume management would not have the expertise to edit the scripts – how do you keep independence while also meeting management’s requirements? Thanks.

    1. Thomas – when I have (successfully) transferred scripts to management – which is not always the case – management takes on responsibility for the analysis. This includes determining the analysis to be performed, running the analysis, and reviewing the results. I do not have AX, so management would purchase their own copy of ACL, and I would train them on how to use ACL and edit the scripts. This could simply be “how to run the scripts” or could include a 3-day course on ACL. They are fully responsible for maintaining the integrity of the analysis. Sometimes they have come back with requests to add more analyses or modify existing ones – but it is clear that they are responsible for the logic, the testing, etc.

      What are others doing?
