27.05.2021
Author: Ernst-Rudolf Töller

A guest post on the topic of data analysis and auditing: an unequal pair?

Today's blog post is a guest contribution by Mr. Ernst-Rudolf Töller, our long-standing companion in matters of data analysis.

Mr. Töller worked as an IT auditor for audit firms for more than 25 years and was intensively involved in data analysis issues throughout that time. He worked for BDO AG, Hamburg, until his retirement at the end of February 2016 and has since been enjoying an active retirement. In his work as a 'data scientist', he was involved in audits of annual financial statements as well as in fraud investigations and projects supporting internal audit. His focus was, and still is, on mathematical-statistical models for data analysis and their implementation with programs such as ACL and WINIdea.

Asked what he thinks should change in the work of audit firms, a manager of an international company answers: 'back to basics'. On further inquiry, it turns out that this does not, of course, mean auditing the way it was done decades ago. Rather, in the interviewee's view, the major challenge for auditing manifests itself between two opposing poles:

  • Business processes in the company are producing more and more digital data in ever greater detail. On the basis of this data, each individual process (or part of it) can be traced more and more precisely, even in retrospect.
  • This is contrasted with annual financial statements in which these data volumes are aggregated ever more heavily and according to ever more complex rules. Both the sheer volume of data and the complexity of the rules, for example in consolidation, play an important role here.

However high the transparency of individual cases has become thanks to the growing volume of digital data, the 'big figures' of a company or an entire group often appear correspondingly impenetrable. Even classic audit tools such as prior-year comparisons reach their limits when group structures keep changing from one fiscal year to the next. In such cases, extensive additional calculations may be required simply to make the figures comparable with those of the previous year.

At the upper end of the scale, therefore, what is referred to as the 'identification and assessment by the auditor of the risks of material misstatement in the preparer's financial statements' can be correspondingly difficult. Even with digital audit tools, it can be problematic to identify critical parts of financial statements directly from (highly) aggregated data.

In addition to the analysis of aggregated accounting data, instruments such as 'journal entry testing' have also become established in recent years - at the other end of the scale, so to speak. The aim here is to identify critical cases directly in the individual documents. In practice, however, such 'simple' data queries often generate large numbers of 'false positives', and the subsequent manual clarification of the individual cases can be very time-consuming.
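
How such a rule-based test looks can be sketched briefly. The following is a minimal illustration, not Mr. Töller's actual tooling (he names ACL and WINIdea); the file and column names are invented for the example:

```python
import pandas as pd

# Hypothetical journal entry export; the file and column names are
# assumptions for this sketch - every ERP export looks slightly different.
entries = pd.read_csv("journal_entries.csv", parse_dates=["posting_date"])

# Two classic rule-based tests:
# 1. conspicuously round amounts (e.g. exact multiples of 10,000)
round_amounts = entries[entries["amount"] % 10_000 == 0]

# 2. postings made on weekends
weekends = entries[entries["posting_date"].dt.dayofweek >= 5]

# Union of all flagged entries. In practice this set is often large and
# dominated by harmless hits ('false positives'), e.g. automated
# period-end postings that legitimately run on weekends.
flagged = pd.concat([round_amounts, weekends]).drop_duplicates()
print(f"{len(flagged)} of {len(entries)} entries flagged for manual review")
```

Each rule is trivial on its own; the practical difficulty lies in the size and composition of the flagged set, exactly as described above.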

So what exactly could 'back to basics' mean? The manager interviewed made his thesis more precise: the audit should focus on units or objects for which an auditor knows, on the basis of knowledge and experience or intuition, what plausible figures look like. Examples:

  • Assessing the entire inventory of one or more high-bay warehouses in terms of quantity and value is difficult, if only because such orders of magnitude simply escape our imagination; people do not usually think in such dimensions. The situation is quite different with a single storage compartment in such a warehouse. Expert knowledge coupled with everyday experience is very well suited to plausibility considerations for the contents of a single compartment and the corresponding figures, such as quantities and values (dimensions, volume, load-bearing capacity, type of material, etc.); see the sketch after this list.
  • The totality of all stores in a large retail chain is likewise difficult to assess in terms of the associated economic variables (inventory, sales, profit, seasonality, etc.). The situation is quite different with a single store. Here, too, there are important plausibility considerations for the figures of an individual store, which can certainly be assessed with a mixture of business knowledge and 'common sense'.
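
The kind of plausibility reasoning described for a single storage compartment can even be written down directly. The following is a minimal sketch under invented assumptions (all figures and field names are made up for illustration; in practice they would come from warehouse and article master data):

```python
# Plausibility check for a single storage compartment: does the recorded
# quantity physically fit? All figures are invented for this illustration.

compartment = {"volume_m3": 2.0, "max_load_kg": 800.0}      # bin capacity
article = {"unit_volume_m3": 0.015, "unit_weight_kg": 4.5}  # per piece
recorded_quantity = 160  # quantity according to the inventory records

max_by_volume = compartment["volume_m3"] / article["unit_volume_m3"]
max_by_load = compartment["max_load_kg"] / article["unit_weight_kg"]
physical_limit = min(max_by_volume, max_by_load)  # binding constraint

if recorded_quantity > physical_limit:
    print(f"Implausible: {recorded_quantity} pieces recorded, but at most "
          f"{physical_limit:.0f} fit into the compartment")
else:
    print("Recorded quantity is physically plausible")
```

The check itself is elementary; the point of the argument is that it only becomes useful at scale once it can be run over the totality of all compartments, which is taken up below.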

The problem is that in many cases there is no simple bridge that would allow the knowledge available for the individual case to be applied to the assessment of an overall portfolio as well.

The conversion of 'raw data' into representations that can be used for auditing purposes is an elaborate step. Here are just a few details:

  • Current ERP systems are based on differentiated data models. Data analysis must take these models into account comprehensively, not just a few 'standard attributes'. This requires suitable mathematical models.
  • The semantics of accounting differs considerably from the internal semantics of an ERP system. A substantial transformation process is required to convert the ERP view into the external accounting view.
  • Data analysis must produce new types of results, especially graphical representations. One example is the selection of cases based on their location in a data cloud ('lasso technique'); a sketch of this follows the list. A corresponding direct data query (SQL or similar) would be of a complexity that is hardly manageable.
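
As an illustration of the 'lasso technique' mentioned above, here is a minimal sketch using matplotlib's LassoSelector widget. The data cloud and axis labels are invented (think of stores plotted by two key figures); the original text does not prescribe any particular tool:

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.path import Path
from matplotlib.widgets import LassoSelector

# Synthetic 'data cloud'; real coordinates would be key figures
# computed from the ERP data (e.g. revenue vs. inventory per store).
rng = np.random.default_rng(seed=42)
points = rng.normal(loc=100.0, scale=20.0, size=(500, 2))

fig, ax = plt.subplots()
ax.scatter(points[:, 0], points[:, 1], s=10)
ax.set_xlabel("revenue")
ax.set_ylabel("inventory")

def onselect(vertices):
    # Translate the hand-drawn lasso into a case selection: every
    # point inside the polygon becomes a candidate for audit review.
    inside = Path(vertices).contains_points(points)
    print(f"{inside.sum()} cases selected for closer inspection")

selector = LassoSelector(ax, onselect)
plt.show()
```

Expressing the same selection as a direct SQL query would require point-in-polygon logic over an arbitrary hand-drawn shape, which illustrates the complexity argument made in the last bullet point.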

Requirements of this kind can only be implemented using standard software. Anything else would either be uneconomical or fall far short of the requirements.

Findings from individual cases cannot simply be extrapolated, and the number of cases is often far too large for a complete review of each one. The decisive factor that must be added today, in order to use on a large scale the knowledge and experience that was originally useful only for individual-case audits, is digital data analysis. With such means, the totality of storage compartments, stores, and so on can be brought to the auditor's workplace and examined there with appropriate digital audit tools. This is not about the totality of raw data for a fiscal year, which, given today's storage capacities, may well fit on a single workstation. Rather, it is a matter of being able to display the 'landscape' of the individual units (storage compartments, stores, etc.), for example by means of graphical preparations. Existing auditor knowledge can then be used to explore this landscape: identifying clusters, assessing the location and size of those clusters, or spotting extreme cases outside the clusters.
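
To make the idea of exploring such a 'landscape' concrete, here is a minimal sketch. The clustering method (DBSCAN from scikit-learn) and all key figures are assumptions made for illustration; the text itself does not name a specific algorithm:

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

# Hypothetical per-store key figures (revenue, inventory, margin);
# in practice these would be derived from the underlying ERP data.
rng = np.random.default_rng(seed=0)
stores = np.vstack([
    rng.normal([500, 120, 0.25], [30, 10, 0.02], size=(80, 3)),  # cluster 1
    rng.normal([900, 200, 0.30], [40, 15, 0.02], size=(60, 3)),  # cluster 2
    [[2500, 40, 0.60]],                                          # extreme case
])

# DBSCAN groups dense regions into clusters and labels points that fit
# no cluster as -1, i.e. exactly the 'extreme cases outside the clusters'.
scaled = StandardScaler().fit_transform(stores)
labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(scaled)

for cluster_id in sorted(set(labels)):
    tag = "outliers" if cluster_id == -1 else f"cluster {cluster_id}"
    print(f"{tag}: {np.sum(labels == cluster_id)} stores")
```

The auditor's expert knowledge then comes in when interpreting the result: whether a cluster's location is plausible, and whether an outlier is an error, a legitimate special case, or a finding.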

So, in the end, this is what was meant: the auditor should not apply his knowledge directly to the real individual case, but should examine a virtual totality of business entities by means of digital data analysis. This means that auditors must incorporate the now-available options for examining digital data far more comprehensively into their work. That, in turn, can only be done with qualified standard software combined with the expert knowledge of an auditor; only this combination of specialist knowledge and analysis software actually delivers a benefit.

Auditing must keep pace with what is currently possible in data analysis. Especially when, as at present, fundamental questions of auditing such as the authenticity of figures and data are at stake, it is crucial to use every available option for checking the plausibility of figures.

In this sense, then: 'back to basics'.

