Year 21 – 2008 – part 1 – Ensuring Integrity

 Accessing different systems, trying to address auditor requirements, and performing complex analysis – they all present risks.  And while I have had a great deal of success, there also have been many mistakes.  I once heard it said, “learn from others’ mistakes – you don’t have enough time to make them all yourself” or something similar.  This is why I always try to post a lesson learned, and this post is no different.

Here are three audits where my analysis was less than perfect – but where I learned valuable lessons.

  1. Expense advances – I was supporting an audit of advances and extracted all transactions related to travel expenses from the SAP system. The filter was a combination of a document type and a GL code.  When I presented the auditor with the extracted data, I told her, “Here are all of the travel advance transactions; the total is $23M – be sure to verify this with the client.”  Six months later, after additional analysis and other audit procedures, the draft report was given to senior management.  They replied, “$23M? It should be much closer to $61M.”  It turns out there were two types of advances (excluding salary advances): travel advances and sensitive expenditures.  We had only extracted the travel advances.  Now I could (and probably did) argue that this was not my fault – I had told the auditor to verify the data.  But as the SAP and data expert, I should have done more to ensure that I was providing a complete set of data to support the audit.  Part of doing this would have been to ask the auditor to supply the audit objective.  In this case, the objective was not “to verify the controls over travel advances” but “to verify controls over expense advances”.  After this mistake, I made sure to get the audit objectives and to ensure that my understanding – and the data that I would be extracting – agreed with the auditor’s understanding.  From then on, I also checked to see whether the auditor had verified the accuracy and completeness of the data.
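If I were doing that completeness check today, it might look something like the sketch below. This is Python with pandas rather than ACL, and the file name, column names, and control figure are all made up for illustration; the point is simply to reconcile the extract against a total the client has confirmed, and to break the extract down so that any missing advance types stand out, before the analysis goes any further.

```python
import pandas as pd

# Hypothetical extract of advance transactions pulled from SAP.
# The file name, column names, and figures are illustrative only.
advances = pd.read_csv("advance_transactions.csv")

# Total of the extract, to be compared against the figure the client
# expects for ALL expense advances, not just one advance type.
extract_total = advances["amount"].sum()
client_control_total = 61_000_000  # total confirmed with the client

print(f"Extract total:        {extract_total:,.2f}")
print(f"Client control total: {client_control_total:,.2f}")
print(f"Difference:           {client_control_total - extract_total:,.2f}")

# Break the extract down by document type and GL code so that any
# advance categories missing from the filter stand out.
print(advances.groupby(["document_type", "gl_code"])["amount"]
              .agg(["count", "sum"]))
```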

  2. P-Card transactions – I was asked to perform a review of p-card transactions. During the course of the analysis, I highlighted three transactions that were of interest.  The first was for $1,200 from a liquor store; the second was $6,000 at a golf and country club; and the third was $250 at a strip club.  The first two were processed against regional p-cards and the third was a headquarters p-card.   I sent the two regions an email stating that, as a result of a p-card audit, a transaction had been identified for follow-up.  I asked them to check into the validity of the expense.  The proverbial hit the fan and the manager in each of the regions cancelled the credit card in question and told the local manager to arrange for an interview with the cardholder.  It turns out the transactions were totally valid.  The first was to buy alcohol for use in a breathalyzer course – half the class would drink varying amounts of alcohol and the other half would administer the breathalyzer.   The second was related to a search and rescue helicopter – the pilot had damaged a golf green making an emergency landing and the charge was for the repairs.

I handled the third transaction myself since the cardholder was in a building near the audit office.  The cardholder was 6’4”, had tattoos and a long beard, and was dressed like a biker – very intimidating.  When questioned, he smashed his hands on the desk and yelled at me for being an idiot – he told me he was undercover and had to go to the strip joint.  “You can’t just sit there and drink water – you have to maintain your cover,” he bellowed.  Then he told me to get out and stop wasting his time.  It almost worked – I was just about to leave when I thought, “if you are undercover, why are you using your police department p-card?”  He laughed and said “busted” – which meant that I had caught him making personal purchases on his p-card.

This taught me to be extremely careful, not only in terms of validating my analysis, but also when sharing the analysis results with others.  The more I can do to verify the results prior to releasing them, the better.

  3. Airline travel – this audit was early in my career (probably around 1991-92). I was asked to check whether employees were collecting frequent flyer points when traveling on company business – at the time this was not permitted.  I obtained access to the Sabre system (an airline reservation system) for our company and extracted all the airline transactions.  Each trip had multiple records of different types.  One of the records included a field which was being used to store the frequent flyer card number.  It was a simple matter of obtaining the passenger name and then checking to see if the passenger had provided a frequent flyer number – or so I thought.  It turns out I was accessing the “booking” data, not the “flown” data, so even if the booking record contained a frequent flyer number there was no way to be sure that the points had actually been collected, because the passenger could remove the number at the gate.  In addition, passengers could add a frequent flyer number at the gate – or even after the flight had taken place.

From this point on, I always tried to verify with the client that the data I was going to use could support the audit objective.  In this case, booking data would not support the objective of determining whether employees were collecting points on their business flights.

I made many other mistakes – not being clear when requesting data, failing to ensure the completeness of the data extract, using incorrect table definitions, etc.  So, over the course of the years – particularly after making mistakes – I put together a list of steps that I call a Generic Approach to the Application of Data Analysis to Audit.  I present them to you in the hope that you will learn from my mistakes.

  1. Meet with the audit team leader and ensure that you understand the goals and objectives of the audit.
  2. Based upon the audit objectives, identify the potential data sources.
  3. Meet with the client and the programmer for the client applications. Identify all available databases, both internal to the client organization (the main application systems) and external to the client organization.
  4. List fields in all available databases and the standard reports that are available.
  5. Select the data source and obtain client agreement that the data can be used to address the audit objectives (agreement on data source, fields, timing, integrity, etc.)
  6. Request the required data – trying to ensure that all necessary fields are included. Prepare a formal request for the required data, specifying:
  • the data source(s) and key fields,
  • the timing of the data (for example: as of Sept 30, 2016),
  • the data transfer format (LAN, CD ROM, USB, FTP, etc.),
  • the data format (Unicode, ASCII, EBCDIC, etc.)
  • the file format (DBF, Delimited, Excel, Access, flat file, ASCII print file, etc.),
  • control totals (number of records, totals for key numeric fields),
  • record layout (field name, start position, length, type, format, description),
  • a print of the first 100 records
  7. Create or build the ACL table layout and verify the data integrity (a sketch of this step follows the list):
  • use the Verify command to check data integrity,
  • check ACL totals against control totals,
  • check the timing of the data to ensure proper file has been sent,
  • compare ACL view with a printout of first 100 records
  8. Understand the data and the business process – use ACL commands such as COUNT, STATISTICS, STRATIFY, and CLASSIFY to develop an overview of the data and to better understand the business process (also sketched after the list).
  9. For each audit objective (the final sketch after the list illustrates this loop):
  • formulate hypotheses about field and record relationships
  • use ACL to perform analytical tests for each hypothesis
  • run tests
  • evaluate initial results and refine the tests
  • re-run and refine the tests to produce more meaningful results
  • evaluate the results – using record analysis, interview, or other techniques – to examine every item on the refined results
  • form an audit opinion on every item in your results. For each, you should be able to say either that the record is OK (there is a valid explanation) or that it is a probable improper transaction and more review is needed
  10. Quality Assurance and Documentation – verify integrity of the analysis and the interpretation of the results. Identify exceptions; confirm analysis and nature of the exceptions; and identify reasons for the exceptions.
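To make step 7 a little more concrete, here is a rough sketch of the same integrity checks done in Python with pandas instead of ACL. The file name, field names, and control figures are invented for illustration; the idea is to prove the extract is complete and readable before any real analysis starts.

```python
import pandas as pd

# Hypothetical extract and the control figures supplied with the data
# request (file name, field names, and figures are illustrative only).
data = pd.read_csv("extract.csv", parse_dates=["posting_date"])
control = {"record_count": 125_000, "amount_total": 23_456_789.01}

# Check the record count and a key numeric field against the control totals.
assert len(data) == control["record_count"], "record count does not match"
assert abs(data["amount"].sum() - control["amount_total"]) < 0.01, \
    "amount total does not match"

# Check the timing of the data to ensure the proper period was sent.
print(data["posting_date"].min(), data["posting_date"].max())

# A rough stand-in for ACL's Verify command: look for blank or
# unreadable values and confirm how each field was interpreted.
print(data.isna().sum())
print(data.dtypes)

# Compare the first 100 records against the printout from the client.
print(data.head(100))
```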
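Step 8 mentions COUNT, STATISTICS, STRATIFY, and CLASSIFY. The sketch below shows roughly equivalent summaries in pandas, again with made-up file and field names, to give the kind of overview that helps you understand the data and the business process.

```python
import pandas as pd

# Hypothetical transaction extract (column names are illustrative).
data = pd.read_csv("extract.csv")

# COUNT / STATISTICS: record count and summary statistics on the amount.
print(len(data))
print(data["amount"].describe())  # count, mean, min, max, quartiles

# STRATIFY: band the amounts into ranges, then count and total each band.
bands = pd.cut(data["amount"],
               bins=[0, 100, 1_000, 10_000, 100_000, float("inf")])
print(data.groupby(bands, observed=True)["amount"].agg(["count", "sum"]))

# CLASSIFY: summarise by a character field such as the vendor.
print(data.groupby("vendor")["amount"].agg(["count", "sum"])
          .sort_values("sum", ascending=False)
          .head(20))
```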
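Finally, here is one pass through the step 9 loop for a single hypothesis. The hypothesis, the file, the field names, and the purchase limit are all hypothetical; the important part is that the test only produces items to examine, and every flagged item still needs record analysis or an interview before any opinion is formed.

```python
import pandas as pd

# Hypothetical p-card extract and single-purchase limit (illustrative).
data = pd.read_csv("pcard.csv", parse_dates=["trans_date"])
limit = 5_000

# Hypothesis: purchases are being split across several same-day
# transactions to the same vendor to stay under the single-purchase limit.
groups = (data.groupby(["cardholder", "vendor", "trans_date"])["amount"]
              .agg(["count", "sum"])
              .reset_index())

suspected_splits = groups[(groups["count"] > 1) & (groups["sum"] > limit)]

# These are items to follow up on, not conclusions; each one needs a
# valid explanation or further review before an opinion is formed.
print(suspected_splits.sort_values("sum", ascending=False))
```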

I hope these points help you to perform better analysis and that you will add your own steps.
