I have been using ACL software for more than 25 years – back before the Windows version even existed. It wasn’t the first audit software I had used, and I was pretty sure it wouldn’t be the last. Previously I had used AutoAudit, Audit Analyzer, Decision Analyzer, IDEA, ReportWriter, Easytrieve and various spreadsheet packages (Multiplan, VisiCalc, and Lotus 1-2-3). I told myself that I would give ACL a try until it couldn’t do what I needed and I found another package that did. As I mentioned, that was more than 25 years ago – and I am still putting ACL through its paces and looking at other software.
Sticking with a particular software product for this length of time has its advantages. You build up a certain level of expertise (mainly by trying, failing and trying again). I must have about 10,000 scripts for the various analyses I have performed over the years. However, it also has a distinct disadvantage: you have to keep up with the new functionality being released. This has become more of an issue of late, as major releases seem to occur every six months. I still have not spent the time and effort to understand REGEX() and the other REG-type functions. The new interfaces and ODBC connectors, Python and R integration, even EXECUTE (a return of the RUN functionality) challenge me to keep abreast of the additional power and flexibility of the AN product.
In addition, if you throw in AX and GRC – with working papers, exception reporting and monitoring, storyboarding, etc. – I am hopelessly behind in my use of ACL. Fortunately, I have retired – well, at least semi-retired – and can rest on my laurels. I only take jobs that require knowledge of AN, and in areas like REGEX() I can post to the Peer Community and get almost instantaneous help from around the world.
In 2001, I published a book called the “Fraud Toolkit”, which I updated in 2009 under the title “Fraud Analysis Techniques Using ACL”. The books included more than 50 scripts to search for fraud and a self-study course on scripting, with data and exercises. At the time, certain functionality did not exist in the ACL command set, so I built scripts to perform the desired analysis. My scripts included: Crosstab, Benford Analysis, Frequently Occurring Numeric Values, and Min/Max Ratio analysis. Since then, ACL has added these functionalities: Crosstab – added 2010; Benford analysis – added 2010; Frequency of numeric values (Classify on a Number) – added 2015; and Min/Max ratio – added 2015. (Note: some of the dates may be incorrect – it is hard to remember exactly which versions added which commands, but it was years after the toolkit was developed.) Many of you are aware of the Crosstab and Benford commands in ACL, but you may not have noticed the addition of the Min/Max ratio.
The Min/Max ratio was accomplished by a script that determined the minimum, maximum and average value for each unique value of a character field – for example, the minimum, maximum and average value of credit card transactions for each credit card number. The script was not particularly complicated, but it did involve the use of GROUP-END logic. Recently (2015), ACL added the ability to find the minimum, maximum and average values when you perform Classify, Summarize and Stratify.
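For readers working outside ACL, the per-key logic that the GROUP-END script implemented can be sketched in a few lines of Python. The field names and structure here are illustrative only, not the original script:

```python
from collections import defaultdict

def min_max_avg(transactions):
    """Compute the minimum, maximum and average amount per key.

    `transactions` is an iterable of (key, amount) pairs – e.g.
    (credit card number, transaction amount). Names are illustrative,
    not tied to any particular ACL table layout.
    """
    amounts = defaultdict(list)
    for key, amount in transactions:
        amounts[key].append(amount)
    return {
        key: {
            "min": min(vals),
            "max": max(vals),
            "avg": sum(vals) / len(vals),
        }
        for key, vals in amounts.items()
    }
```

The same result is what the Statistics option on Classify now produces directly, without any scripting.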
This is the functionality I would like to discuss, with examples, today. More than 15 years ago I thought it was a useful analysis and devoted hours to developing my Min/Max script (I also had a Max/Max2 script). Now it is available to all users, but, like me, you might not be aware of the functionality of every enhancement.
In my first example, I will look at invoices paid to vendors. In my case, the file I was working with was fairly small – less than 18,000 invoices paid to 6,000 different vendors. As in the past, I performed various analyses such as STATISTICS to find the minimum, maximum and average invoice amounts; STRATIFY to look at the ranges of invoice amounts; SUMMARIZE to identify the vendors with the largest number and total dollars of invoices; and of course DUPLICATES. All of these commands have provided enormous value and helped me to understand our business and the A/P process. For example, STRATIFY showed me that 63% of the invoices were for amounts less than $500.00 and totaled less than 0.5% of the total amount.
Stratify on Amount
This caused me to question the efficiency of our p-card program for small dollar purchases (e.g. under $500). It also showed me that a small number (59) of invoices were for large amounts (>$1M) and accounted for 83.47% of the total dollars – guess which invoices we were going to review?
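The STRATIFY analysis described above is essentially a bucketing of amounts by range. A rough Python equivalent might look like the sketch below; the boundaries and sample values are illustrative, not the actual A/P data:

```python
from bisect import bisect_right

def stratify(amounts, boundaries):
    """Bucket amounts into strata by upper boundary and report, per
    stratum, the count, the total, and the share of the grand total.
    Boundaries are illustrative (e.g. the $500 and $1M cut-offs
    discussed in the text)."""
    counts = [0] * (len(boundaries) + 1)
    totals = [0.0] * (len(boundaries) + 1)
    for amt in amounts:
        i = bisect_right(boundaries, amt)  # index of the stratum
        counts[i] += 1
        totals[i] += amt
    grand = sum(totals)
    return [
        {"count": c, "total": t, "share": t / grand if grand else 0.0}
        for c, t in zip(counts, totals)
    ]
```

Comparing each stratum's count against its share of the total dollars is what surfaced both the many tiny invoices and the few very large ones.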
But the addition of the Statistics option in the CLASSIFY command provided insight into the 6,000 vendors. This was something that was not available before, unless you had my Min/Max ratio analysis script.
Classify on Vendor with Statistics Option
There were 13 vendors where the total number of invoices was greater than 10 and the amount of the maximum invoice was more than 75% of the total amount of invoices. In other words, one invoice was significantly larger than the other invoices for the same vendor.
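That flagging rule – more than 10 invoices, with the maximum invoice exceeding 75% of the vendor's total – can be expressed as a simple filter. The sketch below assumes the Classify output has been reduced to a mapping of vendor number to invoice amounts; apart from the two thresholds stated above, the names are illustrative:

```python
def flag_dominant_invoice_vendors(invoices, min_count=10, max_share=0.75):
    """Flag vendors where a single invoice dominates the vendor's total.

    `invoices` maps vendor number -> list of invoice amounts (an
    illustrative structure, not a real ACL output format). Thresholds
    mirror the text: more than `min_count` invoices, and a maximum
    invoice greater than `max_share` of the vendor's total.
    """
    flagged = []
    for vendor, amounts in invoices.items():
        total = sum(amounts)
        if (len(amounts) > min_count
                and total > 0
                and max(amounts) > max_share * total):
            flagged.append(vendor)
    return flagged
```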
Looking at the second vendor (#241380), the largest invoice amount is $76,550.00 and the average amount is $4,759.63; the maximum invoice is 89% of the total for that vendor and more than 16 times the average amount. Upon review, we determined that the actual invoice amount was $765.50 and the amount had been entered incorrectly. The same was true for vendor #301549, with a maximum amount of $16,975.00 (78% of the total) which should have been $1,697.50.
Vendor #333270 was a different problem – the $94,333.09 was actually owed to vendor #333207. When this vendor complained that they had not been paid, we paid them with a late payment interest penalty. The system did not recognize it as a duplicate because the vendor numbers were different.
The three vendor results below represent another type of problem. Either we paid the wrong vendor or paid the wrong amount – but the vendor returned the check.
In each of the above cases, you can see that ABS(minimum amount) = maximum amount; e.g. vendor #276053 had a minimum of ($103,000.00) and a maximum of $103,000.00. The vendor was not owed $103,000.00 and returned the check.
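This returned-check pattern can also be picked out of the per-vendor statistics with a one-line test. The sketch assumes the statistics have already been reduced to (minimum, maximum) pairs per vendor; the structure is illustrative:

```python
def returned_check_candidates(stats):
    """Flag keys where ABS(minimum) equals the maximum – the pattern
    left behind when a payment is issued and then returned.

    `stats` maps vendor number -> (min_amount, max_amount); this is an
    illustrative structure, not a literal ACL output.
    """
    return [
        vendor
        for vendor, (lo, hi) in stats.items()
        if hi > 0 and abs(lo) == hi
    ]
```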
I hope that you can see the value of the Min/Max functionality. But just to make sure, let me give you some additional examples of where I have used it:
- Payroll – Classify on Job Category Subtotal Amount – identified people in job categories that were being paid incorrect (inflated) amounts (payroll fraud)
- Medical Claims – Classify on Procedure Code Subtotal Amount – identified doctors who were charging 2-3 times the normal amount for medical procedures (medical claim fraud)
- P-Card – Classify on Card Number Subtotal Amount – identified cardholders with a single large transaction (cards had been compromised)
- P-Card – Classify on cardholder position Subtotal Amount – identified unusual transactions for admin clerks (a clerk was using the p-card for personal purchases)
- Payroll Overtime – Classify on Job Category Subtotal Amount – identified categories where employees were not entitled to, but were receiving, overtime, as well as individuals who were making false overtime claims
- Contracting – Classify on Contract Type Subtotal Amount – identified sole-source contracts that should have been competitive
- Contracting – Classify on Supplier Subtotal Amount – identified cases where certain suppliers received large contracts at year-end – bypassing procurement controls
- Journal Vouchers – Classify on Period Subtotal JV Amount – identified abnormal JVs just prior to the production of quarterly financial statements.
- Vacation – Classify on Employee Type Subtotal Vacation Days – identified an employee who coded 40 hours of vacation as 5 (i.e. days), and the system subtracted only 5 hours from his vacation balance. Also identified people who were claiming large amounts of vacation. System controls were not working effectively, and employees could take more vacation days than they had in their balance (i.e. ended with a negative balance).
So, even though I had not kept up with the enhancements offered by the more recent versions of ACL, I encourage you to do so; and I hope you can see the benefit of the Statistics option on Classify and/or Summarize. I also hope that you will post examples of where you have used this command parameter.