Year 21 – 2008 – part 1 – Ensuring Integrity

 Accessing different systems, trying to address auditor requirements, and performing complex analysis – they all present risks. And while I have had a great deal of success, there have also been many mistakes. I once heard it said, “learn from others’ mistakes – you don’t have enough time to make them all yourself”, or something similar. This is why I always try to post a lesson learned, and this post is no different.

Here are three audits where my analysis was less than perfect – but where I learned valuable lessons.

  1. Expense advances – I was supporting an audit of advances and extracted all transactions related to travel expenses from the SAP system. The filter was a combination of a document type and a GL code. When I presented the auditor with the extracted data, I told her, “Here are all of the travel advance transactions; the total is $23M – be sure to verify this with the client.” Six months later, after additional analysis and other audit procedures, the draft report was given to senior management. They replied, “$23M? It should be much closer to $61M.” It turned out there were two types of advances (excluding salary advances): travel advances and sensitive expenditures. We had only extracted the travel advances. Now I could (and probably did) argue that this was not my fault – I had told the auditor to verify the data. But as the SAP and data expert, I should have done more to ensure that I was providing a complete set of data to support the audit. Part of doing this would have been to ask the auditor to supply the “audit objective”. In this case, the objective was not “to verify the controls over travel advances” but “to verify controls over expense advances”. After this mistake, I made sure to get the audit objectives and to ensure that my understanding – and the data that I would be extracting – agreed with the auditor’s understanding. From then on, I also checked to see whether the auditor had verified the accuracy and completeness of the data.
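
Looking back, a simple completeness check against an independent control total would have caught the gap before the data ever reached the auditor. A minimal sketch of that idea in Python/pandas (the file name, column names and tolerance are invented for illustration; this was not part of my actual scripts):

    import pandas as pd

    # Hypothetical extract: one row per advance transaction (all advance types, not just travel)
    advances = pd.read_csv("advance_transactions.csv")   # assumed columns: doc_type, gl_account, amount

    extract_total = advances["amount"].sum()

    # Independent control total, e.g. the GL balance for all advance accounts, supplied by the client
    gl_control_total = 61_000_000                        # placeholder value

    difference = gl_control_total - extract_total
    if abs(difference) > 0.01 * gl_control_total:        # arbitrary 1% tolerance
        print(f"Completeness check FAILED: extract {extract_total:,.0f} vs control {gl_control_total:,.0f}")
    else:
        print("Extract reconciles to the GL control total")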

Year 20 – 2007 – Inventory

It was hard to believe, but I had now been at this (data analytics to support audit) for 20 years.  And I still found it interesting, challenging, frustrating, rewarding and aggravating – all at once.

I was constantly being asked to access new systems and perform analysis for different types of audits. At the same time, I had my routine monthly tasks of extracting, downloading and cleansing the data we used on a regular basis. For example, the SAP extract – the full year-to-date, extracted and downloaded every period – would take most of the day by the time I got to period 8. I could only download one period at a time because of CPU limitations, so I would start a background extract of period 1 and work on other things. When it finished, I would extract period 2 and download period 1, and so on until I reached the current period (AX and DirectLink would have made things much simpler). In addition, I had to extract and download the 12 master tables (vendor, customer, cost centre, GL, etc.) that I needed every quarter.
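
For what it is worth, the pipelining pattern looked roughly like the sketch below, written here as Python pseudocode with stub functions standing in for the SAP background job and the file transfer (the real work was done in SAP and ACL, not Python):

    # Stubs standing in for the real SAP background extract and PC download steps
    def start_background_extract(period):
        print(f"Server: extracting period {period} in the background")

    def wait_until_finished(period):
        print(f"Server: period {period} extract finished")

    def download(period):
        print(f"PC: downloading period {period}")

    def refresh_year_to_date(current_period):
        # Overlap the work: extract period N+1 on the server while downloading period N locally
        start_background_extract(1)
        for period in range(1, current_period + 1):
            wait_until_finished(period)
            if period < current_period:
                start_background_extract(period + 1)
            download(period)

    refresh_year_to_date(8)   # by period 8 this took most of a day end to end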

Once all year-to-date extracts had been performed, I had a script that combined the periods and transformed the detailed transactions (BSEG table) and the headers (BKPF table) into a more useful data set where the customer and vendor information appeared on every line of a document. The script also produced a snapshot of the controls and summary files (by GL; by Cost Centre; by Vendor; etc.). Next, I would combine data from the previous “X” years to produce multi-year summaries (by GL by year; by Cost Centre by year; etc.).
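
A minimal sketch of that header/line-item transformation, expressed in pandas rather than the ACL script I actually used (the column names are the standard SAP ones: BUKRS/BELNR/GJAHR document keys, LIFNR vendor, KUNNR customer, HKONT GL account, KOSTL cost centre, DMBTR amount; the real layouts and logic had many more fields):

    import pandas as pd

    bkpf = pd.read_csv("bkpf_ytd.csv")   # document headers: BUKRS, BELNR, GJAHR, BLART, BUDAT, MONAT, ...
    bseg = pd.read_csv("bseg_ytd.csv")   # line items: BUKRS, BELNR, GJAHR, BUZEI, HKONT, KOSTL, LIFNR, KUNNR, DMBTR, ...

    keys = ["BUKRS", "BELNR", "GJAHR"]

    # 1. Attach the header fields (document type, posting date, period) to every line item
    lines = bseg.merge(bkpf, on=keys, how="left")

    # 2. Spread the vendor/customer numbers (normally present only on the payable/receivable line)
    #    to every line of the same document
    lines[["LIFNR", "KUNNR"]] = (
        lines.groupby(keys)[["LIFNR", "KUNNR"]].transform(lambda col: col.ffill().bfill())
    )

    # 3. Snapshot summaries
    by_gl = lines.groupby("HKONT")["DMBTR"].agg(["count", "sum"])
    by_cost_centre = lines.groupby("KOSTL")["DMBTR"].agg(["count", "sum"])
    by_vendor = lines.groupby("LIFNR")["DMBTR"].agg(["count", "sum"])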

Year 19 – 2006 – Health Claims

Note: I hope this is like the ACL forum, where many more people are reading than are posting questions and answers. While I am enjoying my trip down memory lane, it is a lot of work, and it would be a shame if I were the only one reading the posts. My aim was to encourage discussion and sharing – this is not happening, which lessens the value of the blog. So post a comment, describe your experience, etc.

My early introduction to audit included the concept that audit was an early warning for management (this was before “independent assurance”). It had the notion of identifying things that were going wrong and making useful recommendations (this was also before the idea of “risk”). However, my belief was always that audit was there to help, and that the help could and should be offered to all levels of management. Luckily, I did not see these as incompatible ideals; and to a certain extent, neither did my managers.

I remember often having discussions over who was audit’s “client”. We reported to the Board – and they were the main recipients of our reports – so they were a client. Senior management also received the reports and responded to the recommendations, so they were a client. But local management was the group being assessed and had to implement the recommendations, so this made them a client. The issue was that the three groups had very different motivations and needs. A high-level report was of little value to the local manager, who needed to fully understand the “cause” associated with the finding in order to adequately address the issue, whereas senior management and the Board were more concerned with the impact. Hence the ongoing debate of “who is our client?”.

For a number of years, we actually produced three levels of reports: the detailed report for the local manager, with criteria, condition, cause, impact and recommendations; the management report, which focused on “what does it all mean” (impact and recommendations); and the Board report, which presented an overall assessment. In the end, we were spending as much time writing the report(s) as performing the actual audit.

What are your thoughts and experiences on who your client is, and how do you address the needs of your audience?

Auditors are often asked to examine fairly sensitive areas.  This can also mean that you have access to personal information.  Depending on your definition, this could be executive compensation, but in this case (for me) it was health claims.

Year 18 – 2005 – Quantitative Indicators of Risk – part 2

This is Part 2 of an article on developing quantitative indicators of risk to support the annual risk-based audit planning process.

Part 1 presented the concept that risk (Probability and Impact) can be measured quantitatively by looking at Complexity and Change (which increase the probability) and Materiality or Volume (which increases the impact). It also encouraged you to look at more than financial risk. Part 2 presents examples of indicators of risk and an approach that you can use to develop your own quantitative indicators.

The following are examples of data-driven risk indicators for various risk categories:

  • Financial – an entity that has multiple responsibility centers, a large degree of discretionary spending, and a high number of journal entries and suspense account transactions has a higher level of financial risk than one that has a single responsibility center and primarily non-discretionary spending (e.g. regular salary).
  • Operational – a production plant that has multiple production lines that produce both standard and customized products, requiring changes in the product line, has a higher operational risk than one with a single production line producing a standard product.
  • Legal – an entity that is highly regulated, subject to national and international regulations, and facing a higher level of ongoing litigation has a higher legal risk than one that is not regulated.
  • Technological – an entity dependent on rapidly changing technology has a higher technological risk than one that has a stable technological environment.
  • Environmental – an entity that is highly regulated in an area subject to changing environmental regulations, has lower levels of organizational maturity and staff experience, and faces high costs of non-compliance has a higher environmental risk than one that is not regulated or has minimal non-compliance costs.
  • HR – an entity that spans multiple locations and has full-time, part-time and casual employees – many with very little experience – has a higher level of HR risk than one that operates from a single location and only has full-time employees with many years of experience.

The data-driven indicators are relative – comparing the risk level of an audit entity to other entities (e.g. one activity or region to another). The result is a data-driven relative risk ranking of each entity on each risk indicator and risk category. The overall risk for each entity/activity can be assessed by combining the ratings for all risk categories. Thus, audit can identify the entities with the highest financial, operational, or other category risk and the entities with the highest overall risk, or assess the effectiveness of risk mitigation efforts on corporate risks.
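
As a rough illustration of the relative ranking (the entities, indicators and values below are invented for the example), each raw indicator can be converted to a percentile rank across the audit entities so that every measure sits on the same 0-to-1 scale before the categories are combined:

    import pandas as pd

    # One row per audit entity; the indicator columns are illustrative only
    raw = pd.DataFrame({
        "entity":          ["Plant A", "Plant B", "Region C"],
        "journal_entries": [1200, 350, 700],
        "turnover_pct":    [18.0, 6.0, 9.5],
    })

    indicators = ["journal_entries", "turnover_pct"]

    # Percentile rank of each entity on each indicator (1.0 = highest relative risk)
    ranked = raw[indicators].rank(pct=True)
    ranked["entity"] = raw["entity"]

    # A simple relative risk score per entity: the average of its indicator ranks
    ranked["relative_risk"] = ranked[indicators].mean(axis=1)
    print(ranked.sort_values("relative_risk", ascending=False))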

Data-driven indicators make the risk identification and assessment process easier to update and more responsive to changing levels of risk, and they support an analysis of the source of the risk. Transactional quantitative indicators of risk can be viewed at any level or slice of the organization. Auditors can drill down into a corporate risk or risk category to assess and compare every region, plant, division, project, etc. The risk categories can also reveal, for example, what is causing a higher level of legal or strategic risk. In addition, during the development of the annual risk-based plan or the corporate risk profile, the analysis supports more productive interviews with management. It provides insights that allow auditors to ask questions focused on the areas of highest risk for the specific audit entity (e.g. “Why do you have twice the number of journal entries and reversals as other financial managers?” or “What are your plans to address both the high existing HR vacancy rate and the large number of employees who are eligible for retirement within two years?”). This can direct management’s attention to risks that might not have been known previously – making the risk discussion more valuable to both parties.

During the planning phase of an audit, drilling down into the data-driven indicators can focus the audit on specific risk issues (e.g. operational inefficiencies, emerging regulatory changes) or identify best practices. For example, it is easy to examine the risk indicators for an audit entity to determine the factors causing, say, its HR risk to be high. This can help shape the audit scope and objectives, making the audit more effective and efficient.

Data-driven risk indicators can also be used on an ongoing basis to assess the risk associated with specific corporate initiatives (e.g. a proposed merger or acquisition) across all categories of risk, not just financial. For example, a quick assessment of the HR risk factors could identify emerging HR issues (high turnover and retirement-eligibility rates) in a company with which a merger is being proposed. Data-driven risk indicators can also highlight financial risks related to the proposed merger partner’s current financial management control framework, including a framework that differs significantly and may negatively impact the merger. In addition, a potential merger’s risk indicators can be compared to those of previous mergers (successful and unsuccessful) to determine the relative risk and the areas of highest concern. This would better inform management decisions and risk management activities.

To support the risk-based plan, the identification of potential data-driven risk indicators should be considered for each corporate risk and for all risk categories.  Auditors should work with the Chief Risk Officer and subject matter experts to examine the risks; identify drivers that affect the risk; and develop data-driven indicators for each risk driver.   Table 1 is illustrative of the process to identify data-driven risk indicators for HR.  The same process can be used for each risk category (finance, legal and regulatory, etc.).  The first step is to define the sub-categories of risk (e.g. recruitment); then the associated risk drivers (e.g. lack of resources); and finally the data-driven risk indicator (e.g. increasing number of vacant positions).

Table 1 – Development of HR Risk Category Indicators

Recruiting – failure to attract people with the right competencies.
• Risk drivers: lack of resources; lack of skilled employees
• Data-driven risk indicators: vacancies; acting appointments

Resource Allocation – failure to allocate resources in an effective manner to support the achievement of goals and objectives.
• Risk driver: inappropriate resources for tasks
• Data-driven risk indicators: employee type (full-time, part-time, seasonal, contractor, etc.); employee classification; employee status; unions

Retention – failure to retain people with the right competencies and match them to the right jobs.
• Risk drivers: demographics; low experience levels; high turnover
• Data-driven risk indicators: years of pensionable service; average age; average years in position

Work environment – failure to treat people with value and respect.
• Risk drivers: unhappy workforce; high sick leave
• Data-driven risk indicators: average sick leave/vacations; percentage departures
 Once identified, the data-driven risk indicators should be categorized as indicators of volume, variability/change or complexity.  The same process would be performed on the other risk categories.

Since each risk category (finance, HR, legal, etc.) will have several risk indicators related to each of volume, variability/change and complexity, determining the overall risk for each audit entity will be difficult to do manually.  For example, you could have 7-8 risk categories (finance, HR, operations, legal, technological, etc.); with 5-10 risk indicators for each of volume, variability/change and complexity; and 20-50 audit entities for the annual risk-based audit plan totalling 700 – 4,000 risk measures.  However, the details allow you to look at risk from an overall, a risk category or even a risk factor perspective.  For example, you could easily determine that Entity A has the highest overall risk score, which is due to high risk scores in Finance, Operations and HR.  The HR risk is being driven by high variability (employee turnover and percentage eligible for retirement) and the finance risk is due to the complexity of the financial framework.  This will inform the planning phase of the audit of Entity A.  A similar analysis can determine which audit entities are having the largest impact on corporate risks.

While the details provide information to support the planning and conduct of an audit, the risk-based plan needs a higher-level view of risk. The solution is to develop a single composite data-driven risk score for each entity that includes all risk categories. This is a multi-step process: first, develop a single risk factor score for each of volume, variability/change and complexity within each risk category; second, consolidate the risk factor scores into a single score for each risk category (finance, HR, operations, etc.); and third, consolidate the risk category scores into an overall risk rating for each entity.
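
A small sketch of that three-step roll-up (the long-format layout, the example scores and the equal weighting are assumptions for illustration; in practice the scores come from the percentile-ranked indicators and could be weighted):

    import pandas as pd

    # Long format: one (already normalized) score per entity / risk category / risk factor
    scores = pd.DataFrame({
        "entity":   ["A", "A", "A", "A", "B", "B", "B", "B"],
        "category": ["HR", "HR", "Finance", "Finance"] * 2,
        "factor":   ["volume", "complexity"] * 4,
        "score":    [0.8, 0.6, 0.9, 0.7, 0.3, 0.5, 0.4, 0.2],
    })

    # Step 1: one score per entity/category/factor (average of that factor's indicators)
    factor_scores = scores.groupby(["entity", "category", "factor"])["score"].mean()

    # Step 2: one score per entity/category (combine volume, variability/change and complexity)
    category_scores = factor_scores.groupby(["entity", "category"]).mean()

    # Step 3: one overall composite score per entity (combine the risk categories)
    overall = category_scores.groupby("entity").mean().sort_values(ascending=False)
    print(overall)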

The data-driven risk ratings can be used to rank entities based on their overall risk. In addition, qualitative and auditor judgment factors can now be included to arrive at a final risk rating. The final results can be sorted by risk ranking and audits assigned based on the availability of resources.

The identification and assessment of data-driven key risk indicators can be accomplished easily and with minimal investment. A data-focused approach will allow internal audit to identify issues, target risks and allocate resources more effectively. It will support professional auditor judgment and make the annual risk-based audit plan more defensible, easier to update, and backed by both quantitative and qualitative factors. The data-driven risk indicators are useful during the interview process, aid the planning phase of individual audits, and can be used to keep the annual risk-based audit plan current. The risk indicators can also be used to update corporate risk profiles, assess the effectiveness of risk mitigation strategies, and assess the risk associated with new strategic initiatives – providing valuable advice to senior management on all categories of risk. Audit functions that leverage a quantitative, data-driven approach to identifying and assessing risk are more relevant to the business and can provide more efficient and improved risk coverage to senior management and the Board.

Examples of HR data-driven risk indicators

Volume / Size
• Number of employees
• Total dollars of payroll

Variability/Change
• Average age
• Average age of senior managers
• Average years of pensionable service
• % of employees who can retire in less than 2 years
• Experience – years in department / position / classification
• % full-time employees
• % of positions affected by organizational change in the last year
• % of employees in acting assignments
• % new hires (within the last year)
• Total leave taken
• Average sick leave taken
• Average vacation leave taken
• Average unpaid leave taken

Complexity
• # of employee types
• # of employee classifications
• # of geographic locations
• # of unions
• % of employees with non-standard hours

Other
• % by Gender (M/F)
• % by First Official Language (Eng/Fr/Sp/etc.)

Examples of financial data-driven risk indicators

Volume
• Total expenses
• Total revenue
• Total assets

Variability/Change
• Percentage of discretionary spending
• Percentage of expenditures in Period 12, 13+
• Total and number of JVs
• Total and number of suspense account transactions
• Total and number of reversal documents
• Total and number of losses
• Percentage of A/P transactions paid late (> 30 days)
• Percentage of A/R transactions more than 30 days overdue

Complexity
• Number of cost centres
• Number of general ledger accounts
• Number of foreign currencies
• Number of document types
• Use of internal orders
• Use of purchase orders
• Use of fund reservations
• Use of materiel and asset numbers
• Use of real estate blocks
• Use of work breakdown structures
• Number of employees
• Number of P-Cards
ACL Commands: TOTAL, STATISTICS, CLASSIFY, EXPRESSIONS, and RELATE.  While the process used scripts to perform all the analysis, the commands were basic – such as Total Age 1 “Number_Emps” and then calculating the average age (Age / Number_Emps).
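
The same arithmetic in Python terms, as a rough equivalent of the ACL commands (a sketch only; the entities, field names and values are invented, and the real analysis was done entirely in ACL scripts):

    import pandas as pd

    employees = pd.DataFrame({
        "entity": ["A", "A", "B", "B", "B"],
        "age":    [34, 58, 45, 29, 61],
    })

    # Equivalent of totalling Age and counting employees per entity, then Age / Number_Emps
    summary = employees.groupby("entity")["age"].agg(total_age="sum", number_emps="count")
    summary["average_age"] = summary["total_age"] / summary["number_emps"]
    print(summary)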

Lessons learned: the analysis was extremely useful – particularly when discussing risks with managers. We had the risk measures for each audit entity and could ask pointed questions of the managers of projects or activities, or ask senior managers about emerging areas of risk based on a comparison with previous years’ data.

“Build it and they will come” is sometimes true, but I found that I often had to educate the auditors on how to review the results and drill down into the details to better understand the source of the risk. To me it seemed obvious – because I view a business process or activity from the data perspective – but this was not the case for all the auditors. They would have a financial, HR or environmental lens and couldn’t see how the data helped. Fortunately, with assistance, some were able to understand what the data was telling them about the entity/activity/process.

Using data-driven indicators of risk, we were able to update the RBAP on a quarterly basis in hours. This allowed us to ensure that we were dealing with the areas of highest risk and to identify emerging areas of risk early.

Year 18 – 2005 – Quantitative Indicators of Risk – part 1

This was my first attempt at identifying risk to support the development of the annual risk-based audit plan (RBAP). I have been involved in the development of the RBAP – even responsible for it – over the years and always felt that it was more professional opinion than anything else. Some people built a spreadsheet with weighting factors of 1-5 and fooled themselves into believing that there was a logical and quantitative underpinning to the RBAP, but in the end, the auditors were providing the weighted scores based on professional opinion.

My approach was to use data analytics to support the qualitative aspects of the plan (auditor judgement, interviews with managers, previous audit results, etc.). This was for two reasons: first, quantitative indicators are easier to update; and second, they provide assurance that we were also considering emerging risks.

Below is Part 1 of an article I submitted to the IIA magazine. It was not published because they did not consider it to be “relevant to internal auditors” (????????), despite the fact that the IIA standards call for a continuous risk assessment. I think the reviewers didn’t understand the ease and utility of developing data-driven risk indicators. I hope you find the article useful.

Developing data-driven indicators of risk to support the ongoing assessment of risk – Internal auditors face the daunting task of identifying and assessing risk. The results of this activity are critical, as they serve to ensure that scarce audit resources are expended on activities that best address the risks identified by senior management. The initial assessment of risk typically includes reviews of the corporate risk profile, business plans, financial statements and previous audit reports, and interviews with senior managers with questions such as “What keeps you awake at night?”. The process can take weeks, even months, to complete. Contrast this with IIA Standard 2010, which states that the chief audit executive must review and adjust the plan as necessary in response to changes in risk, operations, programs, systems and controls, and you can see where audit has a problem.
