Despite years of Chief Audit Executives stating that data analytics are a necessity for an efficient and agile audit function, and requiring auditors to have analytic skills, analytics still often fail to deliver. I have often heard: “We didn’t have time to investigate the results of the analytics”, “We didn’t have time to get the data”, “The data doesn’t have integrity”, “We didn’t understand the results”, “We didn’t know what to recommend based on the results”, and a myriad of other reasons (i.e., excuses) for not fully employing analytics or their results. But when you look at these statements closely, you can see that they are not the cause of analytics going unused; they are symptoms of the problem.
For 35 years, my job was to develop analytics and support their use by audit teams, so I have hundreds of examples of failed analytics. Here are four:
- The CAE or Audit Manager says, “Let’s include some analytics,” and a series of analytics is executed. None are used.
- A junior member of the audit team (just back from a 3-day course on audit analytics) was asked to run “some analytics.” They produced over one hundred results, so many that the auditor no longer understood what the results were or which risks were being addressed (e.g., invoices over $25,000; invoices over $50,000; invoices over $100,000).
- Part way through the development of the audit program, the team lead decides to include analytics. “Get me the pay data and run some analytics like duplicate pay.” Insufficient time was provided to work with the Pay manager to obtain access to the data, it was not clear which fields were required, and the analytics did not support the assessment of identified risks.
- The data analyst (me) gave the audit team twenty results that were related to the audit objective two days before they left for the onsite work. The analytics were not used because the auditors did not have time to discuss them with local management, could not explain the results to the manager, and did not know what the results meant (e.g., what risk was being addressed and the impact).
In every case, the analytics do not support the audit objectives, they do not have defined and approved criteria, they are not part of the audit program, they are not testing controls linked to identified risks, the data integrity has not been assessed and considered, the analytics are not planned for, and the results are not clearly explained.
The problem is that the analytics are being run in a vacuum and/or treated as an end unto themselves. To combat this problem and improve the value-add of your analytics:
- Data analytics should be integrated into the audit practice, based on the audit objectives, the risks, and the mitigating controls. (If your Accounts Payable audit objective does not include timeliness as a criterion, the analytics that identify late or early payments are not relevant. If your analysis of timeliness is based on 30 days and the organization pays in 90 days, then your analytics will not be relevant).
- Their use should be planned, and time included for accessing the data, assessing its integrity (and having a strategy to deal with instances of poor data quality), cleansing the data, and developing and evaluating the analytics.
- The steps to use the analytics (validating and understanding) must be included in the audit program and directly tied to sub-objectives.
- Analytics must focus not only on specific risks and the associated controls but should also identify anomalies. You should apply a combination of tests: specific tests designed to look for known control weaknesses and frauds (i.e., the ‘known’), and tests that employ AI to identify anomalies you are not even aware might occur (i.e., the ‘unknown’).
- The results should identify the impact: the control weaknesses or failures, and the magnitude of the associated risk.
- The audit team should fully understand how the results address the risks and support the development of recommendations to address control weaknesses. The results must be understood, properly interpreted, and translated into specific, action-oriented recommendations.
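To make the combination of ‘known’ and ‘unknown’ tests concrete, here is a minimal Python sketch. It is an illustration, not production audit code: the invoice records, field names, and 30-day timeliness criterion are assumptions for the example, and the simple z-score outlier flag is a basic stand-in for the AI-based anomaly detection described above.

```python
from datetime import date
from statistics import mean, stdev

# Hypothetical Accounts Payable records; field names are assumptions.
invoices = [
    {"id": "INV-001", "vendor": "V1", "amount": 1200.0,
     "invoice_date": date(2024, 1, 5), "paid_date": date(2024, 1, 20)},
    {"id": "INV-002", "vendor": "V1", "amount": 1200.0,
     "invoice_date": date(2024, 1, 5), "paid_date": date(2024, 1, 22)},
    {"id": "INV-003", "vendor": "V2", "amount": 840.0,
     "invoice_date": date(2024, 1, 10), "paid_date": date(2024, 4, 25)},
    {"id": "INV-004", "vendor": "V3", "amount": 98000.0,
     "invoice_date": date(2024, 2, 1), "paid_date": date(2024, 2, 15)},
]

def duplicate_payments(records):
    """'Known' test: same vendor, amount, and invoice date paid twice."""
    seen, dupes = {}, []
    for r in records:
        key = (r["vendor"], r["amount"], r["invoice_date"])
        if key in seen:
            dupes.append((seen[key], r["id"]))
        else:
            seen[key] = r["id"]
    return dupes

def late_payments(records, criterion_days=30):
    """'Known' test: payments later than the approved audit criterion."""
    return [r["id"] for r in records
            if (r["paid_date"] - r["invoice_date"]).days > criterion_days]

def amount_outliers(records, threshold=1.0):
    """'Unknown' test: flag amounts far from the mean (simple z-score)."""
    amounts = [r["amount"] for r in records]
    mu, sigma = mean(amounts), stdev(amounts)
    return [r["id"] for r in records
            if abs(r["amount"] - mu) / sigma > threshold]

print(duplicate_payments(invoices))  # [('INV-001', 'INV-002')]
print(late_payments(invoices))       # ['INV-003']
print(amount_outliers(invoices))     # ['INV-004']
```

Note that each test maps directly to an audit criterion: the duplicate test to a payment-controls objective, the late-payment test to an approved timeliness criterion (here assumed to be 30 days), and the outlier test to surfacing anomalies no specific control test would catch.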
In short, analytics must be planned, integrated into the audit process, and linked to objectives, risks, and controls. In addition, the auditors validating the results must understand how to work back from results to controls, risks, and objectives. This will ensure that your analytics are not run in a vacuum and will improve their efficiency, effectiveness, and value-add.