Simple Fraud Analytics in Accounts Payable

There is a lot of talk about Artificial Intelligence (AI) and unstructured data and how they will identify fraud and make life easier for financial monitoring, internal control and audit.  This all sounds great, but we are not even making use of the structured data we have and some basic analysis techniques.  How then can we jump to AI and obtain reliable, understandable, actionable results?

So, let me give you some simple examples of how you can use data analysis techniques to understand your accounts payable business process and find fraud indicators and control weaknesses.  I usually start with the data that supports the A/P business process, in this case: invoice number, vendor name and number, payment terms, invoice date, payment date, approver, entry clerk and amount.  Next, consider the risks associated with each data element (e.g. what would happen if the invoice number was changed – duplicate payments; what if a clerk made changes to the payment terms – early or late payments).  Then determine the analysis that will reveal whether the data is being manipulated.  So let’s look at some tests that can easily be performed using ACL software.

Improper entry of invoice numbers can lead to duplicate payments.  A simple addition of a period at the end of the invoice number will allow invoice numbers “A123”, “a123”, “A 123” and “A123.” to be considered “not duplicates”.  Creating a cleaned invoice number that strips off all special characters and spaces goes a long way towards addressing this problem: Clean_InvNo = INCLUDE(UPPER(Invoice_No), “ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789”) would change all the above invoice numbers to “A123”.
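The same cleaning idea can be sketched outside of ACL.  Here is a minimal Python equivalent of the INCLUDE(UPPER(...)) expression above (the function name and sample values are just for illustration):

```python
import re

def clean_inv_no(invoice_no: str) -> str:
    """Uppercase the invoice number, then keep only letters and digits,
    so punctuation and spacing variants collapse to one key."""
    return re.sub(r"[^A-Z0-9]", "", invoice_no.upper())

# The four variants from the text all normalize to the same key:
variants = ["A123", "a123", "A 123", "A123."]
assert {clean_inv_no(v) for v in variants} == {"A123"}
```

A duplicates test run on the cleaned value, rather than the raw invoice number, will then catch these disguised re-entries.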

Improper entry of payment terms can lead to early or late payments.  Typically, the payment terms are stored in the vendor master data, but they can be changed when the invoice is entered.  Comparing the payment terms on the master record with the payment terms applied to the invoice will identify this problem.  Summarizing the mismatched invoices by entry user will show who is making the changes.
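As a rough sketch of the comparison, here is the same two-step test in Python – flag invoices whose terms differ from the vendor master, then count the mismatches by entry user.  The vendor numbers, terms codes and user IDs are made-up sample data:

```python
from collections import Counter

# Hypothetical sample data: vendor master terms vs. terms on each invoice.
master_terms = {"V100": "NET30", "V200": "NET45"}
invoices = [
    {"vendor": "V100", "terms": "NET30", "entered_by": "jsmith"},
    {"vendor": "V100", "terms": "NET10", "entered_by": "dlee"},
    {"vendor": "V200", "terms": "NET10", "entered_by": "dlee"},
]

# Flag invoices whose terms differ from the master record ...
mismatches = [inv for inv in invoices
              if inv["terms"] != master_terms[inv["vendor"]]]

# ... and summarize the mismatches by entry user.
by_user = Counter(inv["entered_by"] for inv in mismatches)
```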

Duplicate vendor numbers can result in duplicate payments and payments made to fictitious vendors.  A simple test is to look for duplicates in the vendor master file.  Once again, it is important to use an intelligent duplicate test so that “AbC Corp” and “ABC Corporation” are considered duplicates.  With ACL you can normalize the data, even addressing variations such as “The Corporate Group Ltd” and “Corporate Group Limited”.
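One way to approximate that normalization – a minimal sketch, with an assumed abbreviation table you would extend to match your own data – is to uppercase, strip punctuation and noise words, and map long suffixes to short ones before testing for duplicates:

```python
import re
from collections import defaultdict

# Assumed expansion table; extend with the abbreviations seen in your data.
SUFFIXES = {"CORPORATION": "CORP", "LIMITED": "LTD", "INCORPORATED": "INC"}

def normalize_vendor(name: str) -> str:
    """Uppercase, drop punctuation and 'THE', and shorten common suffixes."""
    words = re.sub(r"[^A-Z0-9 ]", "", name.upper()).split()
    words = [SUFFIXES.get(w, w) for w in words if w != "THE"]
    return " ".join(words)

vendors = ["AbC Corp", "ABC Corporation",
           "The Corporate Group Ltd", "Corporate Group Limited"]

groups = defaultdict(list)
for v in vendors:
    groups[normalize_vendor(v)].append(v)

# Any normalized name with more than one raw spelling is a candidate duplicate.
dupes = {k: v for k, v in groups.items() if len(v) > 1}
```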

Entry user’s name is the same as the vendor name – possible fraud.  By relating the user table to the detailed vendor payments, you can apply a filter that searches for instances where the name of the clerk doing the entry is similar to the vendor name (e.g. user “DCoderre” entered an invoice for “The Coderre Group Ltd”).
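A crude but effective version of that filter, sketched in Python on made-up records (assuming user IDs follow an initial-plus-surname convention, so the surname is everything after the first character):

```python
# Hypothetical joined records: entry user plus the vendor they paid.
payments = [
    {"user": "DCoderre", "vendor": "The Coderre Group Ltd"},
    {"user": "JSmith",   "vendor": "Acme Supplies"},
]

# Flag payments where the user's surname appears inside the vendor name.
flagged = [p for p in payments
           if p["user"][1:].lower() in p["vendor"].lower()]
```

In practice you would combine this with a fuzzy-match score (e.g. Python’s difflib, or ACL’s fuzzy duplicate features) to catch near-misses as well as exact substrings.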

Vendor address matches an employee address – possible fictitious vendor.  Compare the vendor master addresses against employee addresses.  First normalize the data (e.g. “St”, “St.”, “Street”, etc. all become “St”).
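A minimal sketch of the address normalization, with an assumed abbreviation map you would expand to suit local addressing conventions:

```python
# Assumed abbreviation map; extend with the forms seen in your data.
ABBREV = {"STREET": "ST", "ST.": "ST", "AVENUE": "AVE", "AVE.": "AVE",
          "ROAD": "RD", "RD.": "RD"}

def normalize_address(addr: str) -> str:
    """Uppercase, drop commas, standardize street-type abbreviations."""
    words = addr.upper().replace(",", " ").split()
    words = [ABBREV.get(w, w.rstrip(".")) for w in words]
    return " ".join(words)

# An employee address and a vendor address entered differently now match:
emp = normalize_address("123 Main Street, Apt 4")
vnd = normalize_address("123 MAIN ST APT 4")
assert emp == vnd
```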

Vendor used by only one user – possible fraud.  You can quickly summarize by vendor and user to find cases where all payments to a vendor were processed by a single user.
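The summarize-by-vendor-and-user test reduces to a simple grouping; here is a sketch on made-up (vendor, user) pairs:

```python
from collections import defaultdict

# Hypothetical payment detail: (vendor, entry user) per invoice.
payments = [("V100", "jsmith"), ("V100", "dlee"),
            ("V200", "dlee"), ("V200", "dlee")]

# Collect the set of users who processed payments for each vendor.
users_by_vendor = defaultdict(set)
for vendor, user in payments:
    users_by_vendor[vendor].add(user)

# Vendors whose every payment was processed by a single user.
single_user_vendors = {v for v, users in users_by_vendor.items()
                       if len(users) == 1}
```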

You can also examine vendor payments to look for fictitious vendors.  Have you ever thought, “If only I could use analytics to find anomalies within subsets of my data – without having to isolate each subset individually”?  For example, identifying payments to a vendor that don’t match that vendor’s normal payment profile.  Take the example of 500,000 invoices and 100,000 vendors.  You are not going to filter on each of the vendors and examine their payments one at a time.

If you have ever wished you could perform this type of analysis, whether it is to provide assurance or to find possible fraud, waste and abuse, it is now as easy as a click of the mouse.  A simple CLASSIFY or SUMMARIZE using ACL software provides you with a file that shows the total amount of payments, the average, minimum (lowest) and maximum (highest) payment for each vendor.

Vend # Total Average Min Max Std Dev
94905 50,803,060.81 3,175,191.30 452.08 50,000,000.00 12,090,709.28
90481 5,526,793.81 425,137.99 5,033.60 1,603,609.08 399,603.44
38105 2,711,402.20 225,950.18 108,486.38 1,084,863.78 263,694.87
34326 2,835,667.83 24,657.98 152.62 2,000,000.00 185,222.23
94902 1,900,638.11 45,253.29 476.54 1,091,548.00 164,716.32
94899 999,449.05 43,454.31 425.52 700,000.00 141,744.58


Given the above example, this would still result in a file with 100,000 rows – but we are not done yet.  You can sort the summarized file in descending order on the standard deviation and then filter for counts (number of invoices) greater than 10.  This will present the vendor payments that are anomalies – for that vendor – and identify the 10–20 vendors worth further review.  The review can start with pulling up the detailed payment file (500,000 records) and filtering for the vendor you want to examine.  Now you have a small number of transactions for a vendor, one of which is significantly larger than all the others.  Check to see if the vendor invoice matches the vendor’s invoicing sequence; look at who certified the invoice; the fund or responsibility center; etc.  Typically you will find one of the following: a fictitious payment; a payment to the wrong vendor; an incorrect amount; or an unusual, but appropriate, transaction.
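The profile-then-rank steps above can be sketched in a few lines of Python – build per-vendor statistics (count, total, average, min, max, standard deviation), then sort descending by standard deviation and keep vendors with more than 10 invoices.  The invoice data here is fabricated for illustration:

```python
import statistics
from collections import defaultdict

# Hypothetical invoice detail: (vendor, amount).  V1 has one huge outlier.
invoices = [("V1", 100.0)] * 12 + [("V1", 50_000.0)] + [("V2", 200.0)] * 15

amounts = defaultdict(list)
for vendor, amt in invoices:
    amounts[vendor].append(amt)

# Equivalent of CLASSIFY/SUMMARIZE with statistics per vendor.
profile = [{
    "vendor": vendor,
    "count": len(vals),
    "total": sum(vals),
    "avg": statistics.mean(vals),
    "min": min(vals),
    "max": max(vals),
    "std_dev": statistics.stdev(vals),
} for vendor, vals in amounts.items()]

# Sort descending by standard deviation; keep vendors with > 10 invoices.
anomalies = sorted((p for p in profile if p["count"] > 10),
                   key=lambda p: p["std_dev"], reverse=True)
```

The vendors at the top of this list – high standard deviation relative to their own payment history – are the ones whose detailed transactions merit a look.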

The same type of analysis – examining transactions for slices of the data – can be applied to a wide variety of analyses.   Other example areas include:

  • Payroll – Classify on Job Category Subtotal Amount – identifies people in job categories that were being paid incorrect (inflated) amounts (payroll fraud)
  • Medical Claims – Classify on Procedure Code Subtotal Amount – identifies doctors who were charging 2-3 times the normal amount for medical procedures (medical claim fraud)
  • P-Card – Classify on Card Number Subtotal Amount – identifies cardholders with a single large transaction (cards had been compromised)
  • P-Card – Classify on position Subtotal Amount – identifies unusual transactions for admin clerks (a clerk was using the p-card for personal purchases)
  • Payroll Overtime – Classify on Job Category Subtotal Amount – identifies categories where employees were not entitled to, but were receiving, overtime pay, and individuals who were making false overtime claims
  • Contracting – Classify on Contract Type Subtotal Amount – identifies sole-source contracts that should have been competitive
  • Contracting – Classify on Supplier Subtotal Amount – identifies cases where certain suppliers received large contracts at year-end – bypassing procurement controls
  • Journal Vouchers – Classify on Period Subtotal JV Amount – identifies abnormal JVs just prior to production of quarterly financial statements
  • Vacation – Classify on Employee Type Subtotal Vacation Days – identifies an employee who coded 40 hours of vacation as 5 (i.e. days) when the system only subtracted 5 hours from his vacation balance.  Also identified people who were claiming large amounts of vacation: system controls were not working effectively, and employees could take more vacation days than they had in their balance (i.e. ending with a negative balance)
  • Travel – expenses claims by traveller; expense claims by hotel to highlight fraudulent claims
  • Grants – payments by recipient to identify payments that are beyond the maximum allowable

Basically, this approach to identifying values beyond the usual can be applied in any area where there are multiple categories and multiple receivers.

I hope this gives you a taste of what can be done with ACL and simple analytics.  For more ideas, check out my blog www.caats.ca.

Dave Coderre
