There are several ways to find the correct table names. In practice, it is a mix of techniques and know-how that gets you there.
What counts as “relevant” depends on the analytic task or tasks to be performed. If we analyze the Purchase-to-Pay and Order-to-Cash processes in full, for example, we need a set of around 200 tables. If we are only looking at duplicate payments, 8-10 tables may be enough.
You should also be aware that not every aspect of the process is necessarily stored in a table. Take payments that were made manually, triggered outside the system without using the Automated Payment Run in SAP®. Besides the question of how such a payment was authorized, one of the biggest risks is where the money went: the payee's bank data is not recorded for those transactions, so manual payments cause a lot of fogginess and opacity.
For our example “Payments to critical countries”, I will take the most obvious tables for a quick approach. They will be good enough to get the message across. Going into more detail is of course possible; it just requires more data.
| Element / Process step | Table | Comment |
|---|---|---|
| Vendor master data | LFA1, LFBK | LFA1 contains the global vendor master data; LFBK the bank accounts for vendors. |
| Customer master data | KNA1, KNBK | KNA1 contains the global customer master data; KNBK the bank accounts for customers. |
| Payment run transactions | REGUH | REGUH contains the payee's bank information, such as bank account, bank key and bank country, recorded at the level of each payment transaction made to such accounts. |
| Manual payment transactions | ? | You may have tables in place which contain the bank statements. There might be a chance to pull the bank information from those, even for manual payment transactions. |
Picture 7 – Examine data structure: Table level
After identifying the tables, we need to discuss which fields we want to use for our analytic approach. This requires an understanding of how the analytic question can be answered and what exactly should be analyzed.
We are interested in “Payments to critical countries”. We found out that table REGUH lets us look at the payment transactions and the payee's bank information. It is worth mentioning that the table in fact contains three country fields that could be interesting: “LAND1” contains the country of residence of the vendor or customer; “ZLAND” contains the country where the payee resides (keep in mind that vendor or customer and payee can differ); and “ZBNKS” holds the country key of the bank the money was sent to. So which one do we want to compare against our list of critical countries, compiled for example from the CPI (Corruption Perceptions Index) or an embargo list such as HADDEX? Depending on your decision, you would answer one of these questions:
- Which payments were made to vendors who are located in a “critical” country?
- Which payments were made to payees who are located in a “critical” country?
- Which payments were made to a bank account which is located in a “critical” country, no matter where the vendor or payee is located?
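The three questions above can be sketched as three different filters on the same REGUH data. The following is a minimal, hypothetical sketch in plain Python: the field names follow the SAP conventions discussed here, but the records and the critical-country list are invented for illustration.

```python
# Hypothetical extract of a few REGUH payment-run records (values invented).
reguh = [
    {"LIFNR": "100001", "LAND1": "DE", "ZLAND": "DE", "ZBNKS": "DE"},
    {"LIFNR": "100002", "LAND1": "DE", "ZLAND": "AT", "ZBNKS": "XX"},
    {"LIFNR": "100003", "LAND1": "US", "ZLAND": "US", "ZBNKS": "YY"},
]

# Assumed list of "critical" countries, e.g. compiled from the CPI or an embargo list.
critical = {"XX", "YY"}

# Each analytic question corresponds to a different country field:
vendors_in_critical = [r for r in reguh if r["LAND1"] in critical]  # vendor resides there
payees_in_critical  = [r for r in reguh if r["ZLAND"] in critical]  # payee resides there
bank_in_critical    = [r for r in reguh if r["ZBNKS"] in critical]  # money was sent there

for r in bank_in_critical:
    print(r["LIFNR"], r["ZBNKS"])
```

Note how the third filter flags payments even when both vendor and payee reside in uncritical countries; only the bank country differs.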
We could expand that example to the master data tables as well, but I guess you get the idea. Just as we said in step one that we should formulate our analytic questions as precisely as possible, those questions now need to be tied to the data we want to access. Even if we decide to compare field “ZBNKS”, we may still want to list the other country fields for information purposes.
| Element / Process step | Table | Field(s) | Comment |
|---|---|---|---|
| Vendor master data | LFA1 | LIFNR | Vendor number |
| Vendor master data | LFA1 | LAND1 | Vendor country of residence |
| Vendor bank accounts | LFBK | LIFNR | Vendor number |
| Vendor bank accounts | LFBK | BANKS | Vendor bank country |
| Customer master data | KNA1 | KUNNR | Customer number |
| Customer master data | KNA1 | LAND1 | Customer country of residence |
| Customer bank accounts | KNBK | KUNNR | Customer number |
| Customer bank accounts | KNBK | BANKS | Customer bank country |
| Payment run transactions | REGUH | LIFNR | Vendor number |
| Payment run transactions | REGUH | KUNNR | Customer number |
| Payment run transactions | REGUH | LAND1 | Vendor country of residence |
| Payment run transactions | REGUH | ZLAND | Payee's country of residence |
| Payment run transactions | REGUH | ZBNKS | Payee's bank country |
Picture 8 – Examine data structure: Field level
In addition to identifying the necessary tables and fields, it is important to gather know-how about the data model and the relationships between the selected tables. For example, a customer or vendor could have none, one or several bank accounts; this means that for one vendor or customer number there could be several entries in the tables KNBK and LFBK, or none at all if payments are made by check only. Relationships between tables in a relational database are established through primary and foreign key fields. Without going into too much detail, a basic understanding of such table relations and the data model is also important to cover a topic thoroughly, and it belongs to the aspect of “Know your data”.
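To make the zero-to-many relationship concrete, here is a small sketch of joining vendor master data (LFA1) with vendor bank accounts (LFBK) on the key field LIFNR. The data is invented; the point is that one vendor can yield several result rows, or a row with no bank account at all.

```python
# Invented sample data following the LFA1/LFBK field naming discussed above.
lfa1 = [
    {"LIFNR": "100001", "LAND1": "DE"},
    {"LIFNR": "100002", "LAND1": "AT"},  # pays by check only: no bank account on file
]
lfbk = [
    {"LIFNR": "100001", "BANKS": "DE"},
    {"LIFNR": "100001", "BANKS": "CH"},  # second bank account for the same vendor
]

# Left join on LIFNR: keep every vendor and attach each of its bank accounts,
# or a None placeholder when no LFBK entry exists.
joined = []
for v in lfa1:
    accounts = [b for b in lfbk if b["LIFNR"] == v["LIFNR"]]
    if accounts:
        joined.extend({**v, "BANKS": b["BANKS"]} for b in accounts)
    else:
        joined.append({**v, "BANKS": None})

for row in joined:
    print(row)
```

An inner join here would silently drop vendor 100002; whether that is acceptable depends on the analytic question, which is exactly why the data model matters.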
At the end of step three, there should be an as-complete-as-possible list of tables and fields that we will use for our (continuous) analytic approaches. It will not only contain the fields that are absolutely necessary for the analysis, like the country fields; there will also be many fields we need in order to work with the results afterwards: the vendor or customer number, the name of the business partner, the name of the payee, and so on. That list can get pretty voluminous the more you go into detail. Please keep in mind that for data privacy reasons, but also for performance and usability, the principle of “data minimization” should be applied: find out what is necessary, but stay focused when defining the data and avoid a big overhead.
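Data minimization can be applied at extraction time by keeping only the agreed field list. A hedged sketch, assuming a wide REGUH record was extracted (the extra field names ZBNKN and RWBTR are used here for illustration and should be verified against your system):

```python
# One hypothetical, wide REGUH record; values invented for illustration.
record = {
    "LIFNR": "100001", "KUNNR": "", "LAND1": "DE", "ZLAND": "DE", "ZBNKS": "XX",
    "ZBNKN": "123456789",  # payee bank account number: needed for follow-up work
    "RWBTR": "1500.00",    # payment amount: needed to work with the results
    "XVORL": "",           # one of many further fields we deliberately do not keep
}

# The agreed, minimal field list from step three.
NEEDED = ["LIFNR", "KUNNR", "LAND1", "ZLAND", "ZBNKS", "ZBNKN", "RWBTR"]

# Keep only the defined fields; everything else stays behind.
minimized = {k: record[k] for k in NEEDED}
print(minimized)
```

The same projection would be applied to every extracted row, so the data set stays focused and the privacy footprint small.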
If we take our 5 things as a checklist, we have now completed the first three steps: