Why Audit Analytics Fail

Despite years of Chief Audit Executives stating that data analytics are a necessity for an efficient and agile audit function, and requiring auditors to have analytic skills, analytics still often fail to deliver. I have often heard: “We didn’t have time to investigate the results of the analytics”, “We didn’t have time to get the data”, “The data doesn’t have integrity”, “We didn’t understand the results”, “We didn’t know what to recommend based on the results”, and a myriad of other reasons (i.e., excuses) for not fully employing analytics or their results. But when you look at these statements closely, you can see that they are not the cause of analytics not being used; they are symptoms of the problem.

For 35 years, my job was to develop analytics and support their use by audit teams, so I have hundreds of examples of failed analytics. Here are four:

  • The CAE or Audit Manager says, “Let’s include some analytics”, and so a series of analytics was executed. None were used.
  • A junior member of the audit team (just back from a three-day course on audit analytics) was asked to run “some analytics.” They produced over one hundred results, so many that the auditor no longer understood what the results meant or which risks were being addressed (e.g., invoices over $25,000; invoices over $50,000; invoices over $100,000).
  • Partway through the development of the audit program, the team lead decides to include analytics: “Get me the pay data and run some analytics like duplicate pay.” Insufficient time was allowed to work with the Pay manager to obtain access to the data, it was not clear which fields were required, and the analytics did not support the assessment of identified risks. (A minimal sketch of what a duplicate-pay test looks like follows this list.)
  • The data analyst (me) gave the audit team twenty results that were related to the audit objective two days before they left for the onsite work. The analytics were not used because the auditors did not have time to discuss them with local management, could not explain the results to the manager, and did not know what the results meant (e.g., what risk was being addressed and the impact).
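For readers who have not seen one, here is a minimal sketch of the kind of “duplicate pay” test mentioned above, written in Python with pandas. The column names (vendor_id, invoice_no, amount) and the matching key are hypothetical; in practice they would be confirmed with the Pay manager, which is exactly the planning step that was skipped in the example.

    import pandas as pd

    # Hypothetical payments extract; real field names would come from the Pay manager.
    payments = pd.DataFrame({
        "vendor_id":  ["V001", "V001", "V002", "V003", "V003"],
        "invoice_no": ["1001", "1001", "2001", "3001", "3002"],
        "amount":     [5000.00, 5000.00, 1200.00, 750.00, 750.00],
    })

    # Flag every payment that shares vendor, invoice number, and amount with another.
    key = ["vendor_id", "invoice_no", "amount"]
    duplicates = payments[payments.duplicated(subset=key, keep=False)]
    print(duplicates.sort_values(key))

Even a test this simple only adds value if the auditor can explain what a hit means (a potential duplicate payment, not a confirmed one) and which control it is testing.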

In each case, the analytics do not support the audit objectives, they do not have defined and approved criteria, they are not part of the audit program, they are not testing controls linked to identified risks, the data integrity has not been assessed and considered, the analytics are not planned for, and the results are not clearly explained.

The problem is that the analytics are being run in a vacuum and/or treated as an end in themselves. To combat this problem and improve the value-add of your analytics:

  • Data analytics should be integrated into the audit practice, based on the audit objectives, the risks, and the mitigating controls. (If your Accounts Payable audit objective does not include timeliness as a criterion, analytics that identify late or early payments are not relevant. If your analysis of timeliness assumes 30-day terms when the organization actually pays in 90 days, your analytics will flag payments the organization considers perfectly timely.)
  • Their use should be planned, and time included for accessing the data, assessing its integrity (and having a strategy to deal with instances of poor data quality), cleansing the data, and developing and evaluating the analytics.
  • The steps to use the analytics (validating the results and understanding what they mean) must be included in the audit program and directly tied to sub-objectives.
  • Analytics must be focused not only on specific risks and the associated controls but should also identify anomalies. You should apply a combination of tests: specific tests designed to look for known control weaknesses and frauds (i.e., the ‘known’), plus tests that employ AI to identify anomalies you are not even aware might occur (i.e., the ‘unknown’). (A sketch combining both kinds of test follows this list.)
  • The results should identify the impact: the control weakness or failure and the magnitude of the associated risk.
  • The audit team should fully understand how the results address the risks and support the development of recommendations to address control weaknesses. The results must be understood, properly interpreted, and turned into specific, action-oriented recommendations.
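To make the ‘known’/‘unknown’ distinction concrete, here is a minimal sketch, again in Python. The fields (invoice_date, pay_date, amount) and the 90-day terms are hypothetical, and IsolationForest stands in as just one example of an anomaly-detection technique, not a recommendation of a specific tool. The rule-based test encodes an approved criterion; the anomaly test makes no assumption about what ‘wrong’ looks like.

    import pandas as pd
    from sklearn.ensemble import IsolationForest

    # Hypothetical invoice extract; use your organization's real data and terms.
    invoices = pd.DataFrame({
        "invoice_date": pd.to_datetime(["2024-01-01", "2024-01-03", "2024-01-05", "2024-01-06"]),
        "pay_date":     pd.to_datetime(["2024-03-28", "2024-01-20", "2024-07-30", "2024-02-01"]),
        "amount":       [10000.00, 1500.00, 98000.00, 2200.00],
    })
    terms_days = 90  # the approved criterion: the organization's actual terms, not an assumed 30

    # The 'known': a rule-based test against the approved criterion.
    invoices["days_to_pay"] = (invoices["pay_date"] - invoices["invoice_date"]).dt.days
    late = invoices[invoices["days_to_pay"] > terms_days]

    # The 'unknown': flag statistical outliers across amount and timing, with no rule presumed.
    model = IsolationForest(contamination=0.25, random_state=0)
    invoices["outlier"] = model.fit_predict(invoices[["amount", "days_to_pay"]])

    print("Late against terms:")
    print(late)
    print("Anomalies (-1 = outlier):")
    print(invoices[invoices["outlier"] == -1])

Either way, a hit is only the start: the auditor still has to trace it back to the control, the risk, and the objective before it becomes a finding or a recommendation.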

In short, analytics must be planned, integrated into the audit process, and linked to objectives, risks, and controls. In addition, the auditors validating the results must understand how to work back from results to controls, risks, and objectives. This will ensure that your analytics are not run in a vacuum and will improve their efficiency, effectiveness, and value-add.

Dave Coderre

CAATS (www.caats.ca)

This article has 4 Comments

  1. Great post, Dave! (as usual) Your suggestions about how to improve the value-add are spot-on and very, very well articulated. In my experience with audit groups, the biggest obstacles to valuable analytics start with lack of understanding/vision about why analytics should be used, and if the “why” is clear, how analytics should be used to fulfill the “why”. Relatively few audit groups have good “why” vision and effective gameplans on how to use analytics, which results in unclear objectives/goals, poor results/ROI, and low budgets to invest in getting any better. You hit the nail on the head with so many examples of futility, and we know there are so many more.

    The only thing I’d add to the thought is that continuous auditing as an “ongoing audit” paradigm even further requires what you suggested. In that case, there should be a socialized constant attention to what will be set up for continuous auditing, how risk will be identified (regularly updated with business info), and how/if results will be addressed (in terms of communication, recommendation, reporting, etc.), since ongoing scheduled analytics do not usually have those process milestones of audit plan/program creation, fieldwork, reporting, followup, etc.

    1. Paul, I appreciate your comments and insights, particularly those on continuous auditing. The notion of continuous auditing has always bothered me, but I could not determine why. Your insights on it are a revelation. My view was always limited to the fact that audit should be identifying the root cause and making appropriate recommendations to mitigate the risk. If audit did this, why would they have to do another (continuous) audit in the same area? Your comments add another dimension: assessing and managing the continuous audit activity. Thanks for that.

      I would add, just because one can do analytics does not mean that the results are valuable. “Build it and they will come” is not true.

  2. A lot of failures in internal controls in the public sector are making the news lately, Dave. Would love to hear your thoughts on those, if anything is jumping out at you.

    1. “Making the news lately”? I would argue that there are no more instances of control failures now than at any time in the past. Internal audit needs to continually identify and assess risks. This means that the risk-based audit plan must be dynamic and not something that is only updated every other year. Also, audit must be risk-ready and agile so that it can react in a timely manner to increasing levels of risk.
