Year 20 – 2007 – Inventory

It was hard to believe, but I had now been at this (data analytics to support audit) for 20 years.  And I still found it interesting, challenging, frustrating, rewarding and aggravating – all at once.

I was constantly being asked to access new systems and perform analysis for different types of audits.  At the same time, I had my routine monthly tasks of extracting, downloading and cleansing the data we used on a regular basis.  For example, the SAP extract – the full year-to-date, extracted and downloaded every period – would take most of the day by the time I got to period 8.  I could only download one period at a time because of CPU limitations – so I would start a background extract of period 1 and work on other things.  When it finished, I would extract period 2 and download period 1; and so on until I reached the current period (AX and DirectLink would have made things much simpler).  In addition, every quarter I had to extract and download the 12 master tables (vendor, customer, cost centre, GL, etc.) that I needed.

Once all the year-to-date extracts had been performed, I had a script that combined the periods and transformed the detailed transactions (BSEG table) and headers (BKPF table) into a more useful data set where the customer and vendor information was on every line of a document.  The script also produced a snapshot of the controls and summary files (by GL; by Cost Centre; by Vendor; etc.).  Next, I would combine the data from the previous “X” years to produce multi-year summaries (by GL by year; by Cost Centre by year; etc.).
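
The heart of that script was a join of the line items to their headers, followed by the summaries.  Something like this (a simplified sketch – the table and output names are made up for illustration; BUKRS, BELNR and GJAHR are the standard SAP document key fields):

    COMMENT Assumes the period files have already been appended into BSEG_YTD and BKPF_YTD
    OPEN BSEG_YTD
    OPEN BKPF_YTD SECONDARY
    COMMENT Put the header fields (posting date, user, transaction code) on every line item
    JOIN PKEY BUKRS BELNR GJAHR FIELDS BUKRS BELNR GJAHR BUZEI HKONT KOSTL LIFNR KUNNR DMBTR SKEY BUKRS BELNR GJAHR WITH BUDAT USNAM TCODE TO "FI_Detail" PRESORT SECSORT
    OPEN FI_Detail
    COMMENT Snapshot summaries
    CLASSIFY ON HKONT SUBTOTAL DMBTR TO "Summ_by_GL"
    CLASSIFY ON KOSTL SUBTOTAL DMBTR TO "Summ_by_CostCentre"
    CLASSIFY ON LIFNR SUBTOTAL DMBTR TO "Summ_by_Vendor"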

So my analytics were a combination of fairly routine, boring but essential stuff, and new (sometimes exciting) work.  I figure the work breaks down into the following percentages: routine extract, download and data prep work (20%); discussions and work with audit teams (30%); quality assurance on work done by teams (10%); complex analysis to support audits or senior management questions (35%); accessing new data sources (20%); and research into new approaches (10%).  The auditors in the group will notice that this adds up to more than 100% – and it doesn’t include: managing staff; meetings; training (of self and others); administration; and the occasional vacation.

Audit example – Inventory.  We were examining the inventory management system – primarily looking for ways to improve its efficiency.  The company had numerous warehouses supplying parts for vital, major pieces of equipment (e.g. carburetors for bulldozers).  Management wanted assurance that the process was efficient and effective.

We accessed the data for the current parts, but also wanted to look at trends such as inventory turnaround.  This meant we had to access other data such as issues of inventory, write-offs, strike-offs, and disposals.  By combining the data from the different files we were able to get a better picture of the purchases, receipts, issues, and returns, as well as the life expectancy of specific parts.  This highlighted one company in particular whose parts had a much shorter life expectancy.  (A subsequent audit determined that the parts were of lower quality than we were paying for.)  There was a standard report that performed a similar analysis, but no one was using it.
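
The life-expectancy calculation itself was straightforward once the files were combined.  Roughly (table and field names invented for illustration):

    OPEN Part_History
    COMMENT Days in service: ACL date arithmetic returns the number of days
    DEFINE FIELD Life_Days COMPUTED
    Return_Date - Issue_Date
    COMMENT Total days and record counts by supplier; average life = subtotal / count
    CLASSIFY ON Supplier SUBTOTAL Life_Days TO "Life_by_Supplier"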

Since we were combining items from various data sources, and parts could flow through the system from receipt to issue, to return, repair and re-issue before final disposal, we wanted to make sure that we were not counting items two or three times.  Part of the completeness and integrity testing included a search for duplicate serial numbers.  We had expected to find some – and we did.  However, we did not expect to find duplicate “purchase” records for the same piece of equipment.
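
The duplicates test was simple (again, names invented for illustration):

    OPEN Combined_Trans
    COMMENT Every serial number appearing more than once, with enough context
    COMMENT to judge whether it is a normal re-issue or something else
    DUPLICATES ON Serial_No OTHER Part_No Trans_Type Trans_Date PRESORT TO "Dup_Serials"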

For the items that had been purchased twice (with many months between the original and the second purchase), we planned to trace the equipment through the process: purchase; receipt and entry into inventory; issue to a manager for use in the equipment; return for repair; and either re-issue or disposal.  Many of the parts went directly from purchase and receipt to repair without ever being issued.  This didn’t make sense, since we should have returned a defective part to the supplier if there were problems.  Further, the parts sent to repair were then deemed unrepairable and disposed of.  Several weeks later, they were re-purchased.
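
Finding the parts that went to repair without ever being issued was an unmatched join (a sketch only; names invented):

    OPEN Repair_Trans
    OPEN Issue_Trans SECONDARY
    COMMENT Keep repair records whose serial number never appears in the issues file
    JOIN PKEY Serial_No FIELDS Serial_No Part_No Repair_Date SKEY Serial_No UNMATCHED TO "Repaired_Never_Issued" PRESORT SECSORT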

We examined the users at each stage and found that the “repair” and “unrepairable” transactions were all being entered by two people: “A” would send the part for repair; “B” would declare it unrepairable; and “A” would send it to scrap.  The two were working together to bypass a separation of duties control that prevented a single user from sending an item for repair, declaring it unrepairable, and sending it to scrap.
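
A quick way to see the pattern was to pair the “repair” user with the “unrepairable” user for each serial number and count the combinations (a sketch; names invented):

    OPEN Repair_Trans
    OPEN Unrepairable_Trans SECONDARY
    JOIN PKEY Serial_No FIELDS Serial_No Repair_User SKEY Serial_No WITH Declare_User TO "Repair_Pairs" PRESORT SECSORT
    OPEN Repair_Pairs
    COMMENT One dominant user pairing ("A" / "B") is the red flag
    SUMMARIZE ON Repair_User Declare_User TO "User_Pairs" PRESORT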

ACL Commands: FILTER, EXPRESSIONS, AGE, STATISTICS, CLASSIFY, DUPLICATES, and SCRIPTS.

Lessons Learned – The obvious lesson learned is that separation of duties controls can be bypassed by collusion.  Given that this is a risk, management should have considered additional detective controls.  We recommended several simple tests, such as checking for duplicate serial numbers being purchased, and the time between receipt and repair transactions.  But it goes beyond the specific problems we discovered.  Fraud risk related to a compromised separation of duties control should be considered for every audit.  Separation of duties is a preventive control – audit should look for the existence of (and test) the detective controls.  Controls are more effective when there is a combination of preventive, detective and corrective controls for a given risk.
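
Both of the recommended detective tests are easy to script.  A sketch (table and field names invented; the 30-day threshold is arbitrary):

    COMMENT Test 1: the same serial number purchased more than once
    OPEN Inventory_Trans
    SET FILTER TO Trans_Type = "PURCHASE"
    DUPLICATES ON Serial_No OTHER Trans_Date Amount PRESORT TO "Dup_Purchases"
    SET FILTER
    COMMENT Test 2: parts sent to repair suspiciously soon after receipt
    OPEN Receipt_Repair
    DEFINE FIELD Days_To_Repair COMPUTED
    Repair_Date - Receipt_Date
    EXTRACT FIELDS Serial_No Receipt_Date Repair_Date Days_To_Repair IF Days_To_Repair < 30 TO "Quick_Repairs"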

Secondly, just because there is a standard report that addresses a potential risk (e.g. poor quality parts), it does not mean that management is reviewing or acting upon the information contained in the report.

Third, data analysis is more than conducting challenging analyses.  It requires you to be vigilant in obtaining, cleansing and preparing the data, and in ensuring that it is available when it is required.  This includes the routine extracts and downloads.

Lastly, when performing analysis it is very important to review and understand what the results are telling you.  In this case we expected duplicates, and were really only trying to identify them so we could remove them from further analysis.  If all we had done was remove the duplicates, we never would have found the fraud.  Even when you are expecting certain results, you should examine them to determine what is really happening.  Duplicates occurred because of timing issues and the way the process worked.  But our analysis of the duplicates also revealed duplicates that were not a normal part of the process.  Only by reviewing the duplicate test results did we become aware of the fraud.  As you become more proficient with ACL and write scripts that use temporary (intermediate) files, do not be so quick to “DELETE temp.fil OK” before actually looking at the results – you may be surprised.
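
In script terms: look at the intermediate file before you delete it (using the hypothetical duplicates file from the test above):

    OPEN Dup_Serials
    COMMENT A quick profile of the kinds of duplicates in the file
    CLASSIFY ON Trans_Type TO SCREEN
    COMMENT Only after the review:
    CLOSE
    DELETE Dup_Serials.FIL OK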
