Author: Robert Galetzka

Comparing ACL™ Scripting to SQL Queries (Part 2)

The following blog post was written by our trainee Robert, who is studying business informatics at the Deggendorf Institute of Technology. During his internship he compared ACL™ scripting technology with queries created in SQL. Since the latest version of our extraction tool dab:Exporter also offers SQL as a target format, we wanted to get an overview of the common features as well as the differences between these analytic solutions. This is the second part of his article; the first part can be found here.

ACL™’s analytic solutions have their own (and pretty cool) programming/scripting language in which reusable analytic routines can be created. Many companies do not only use ACL™ solutions but also other software for data analysis. Often data is stored in SQL databases. This data can easily be imported into ACL™ and analyzed there. But an interesting question is how such analytic routines, which can be created and performed in ACL™, would look if they had to be implemented directly in the database, using SQL statements.

As part of my internship at dab: GmbH, I am working on the comparison of ACL™ with SQL by converting existing ACL™ analytic routines (“scripts”) to SQL queries. In this blog post I will give an insight into the approach in general as well as the implementation/conversion and the differences between SQL queries and ACL™ scripts which I encountered.

For comparing ACL™ scripting and SQL queries I looked at an existing solution which is coded in ACL™ and started porting it to SQL. This is the second part of my project report containing the most important differences I encountered.

II. ACL™ vs. SQL – The key differences I encountered

To give a little spoiler here: I successfully managed to create the SQL prototype and to fulfill all conditions. However, I was not able to find an equivalent SQL command for every feature ACL™ offers, so I had to find workarounds for some challenges. In this chapter I will look at the key differences in detail:

  1. Working with multiple tables: The powerful ACL™ feature “Relations” does not exist in SQL (see part 1 for details)
  2. ACL™ allows you to distinguish between column names and column descriptions, SQL does not (see part 1 for details)
  3. The “Cross tabulate” command in ACL™ does not have an equally convenient equivalent in SQL
  4. Entering user-defined parameters is easier in ACL™ using the built-in “Dialog” feature

After explaining the key differences I will give a short general summary. The first two bullet points of the list above were already covered in the first article; in this blog post I will cover points 3 and 4.

3. The “Cross tabulate” command in ACL™ does not have an equally convenient equivalent in SQL

In the ACL™ script which I ported to SQL, the CROSSTAB command is used to aggregate data, similar to a pivot table. This command allows multiple fields to be selected as criteria describing a row, and one field to be selected as the aggregated column criterion.

To implement the same (or even similar) aggregation in SQL turned out to be tricky. I had to use dynamic SQL to achieve a comparable result.

Let’s have a look at an example:

[Example source data: records with Customer, DocumentType, Amount and Quantity values]
We want to get the total amount for each customer in one line, differentiated by document type. Each result column name should be suffixed with the document type to provide a clear result structure and naming convention:


DocumentType_X, Amount_X and Quantity_X


X is a placeholder and represents the document type A, B or C.

The result should look as follows:


[Result table: one row per customer, containing the DocumentType_X, Amount_X and Quantity_X columns for document types A, B and C]


In ACL™ this can be done easily using the CROSSTAB command, in only one line of ACL™ code:

CROSSTAB ON Customer COLUMNS DocumentType SUBTOTAL Amount Quantity TO tCustomer_DocumentType_Crosstab COUNT OPEN


The equivalent to ACL™’s CROSSTAB is the PIVOT function in SQL; however, this requires some more work to reach the same result. In a first step I need to create appropriate column headers, to which the values are attached in a second step.

The following code shows how I predefined the column headers and added the corresponding document types to them; then I assigned this to a variable named @cols. The result is a single string containing the required column names. Should additional document types be added, this has to be taken into account in the query as well. Therefore, a “dynamic query” must be chosen rather than a static one.

Select @cols = STUFF((SELECT ',' + QUOTENAME(col + '_' + cast(DocumentType as varchar(4)))
    from BasisTabelle
    cross apply
    (
        select 'c_DocumentType', 0 union all
        select 'c_Amount', 1 union all
        select 'c_Quantity', 2
    ) c (col, so)
    group by col, so, DocumentType
    order by DocumentType, so
    for xml path(''), type
).value('.', 'NVARCHAR(MAX)'), 1, 1, '')


The string contains the following elements:

[c_DocumentType_A],[c_Amount_A],[c_Quantity_A],[c_DocumentType_B],[c_Amount_B],[c_Quantity_B],[c_DocumentType_C],[c_Amount_C],[c_Quantity_C]


There is a set of columns now for each document type, with the document type value attached to the column name.

In a second step I have to take care of the aggregated values, like total amount and total quantity. The values are retrieved from the prepared base table (BasisTabelle) and then assigned to the columns. The content of the @cols variable is passed on at the beginning of the SELECT statement so that the required columns are created, and again at the end of the query in the IN clause, where the values are assigned to the respective columns. Because the query is stored in a variable as a string, the command "exec sp_executesql" is needed to execute it; the executable string gets passed to the stored procedure and executed accordingly.

set @query
    = N'select Customer, ' + @cols + N'
from
(
    select d.Customer,
        new_col = c.orig_col + ''_'' + cast(d.DocumentType as varchar(2)),
        c.value
    from
    (
        select Customer, DocumentType,
            c_DocumentType = cast(c_DocumentType as varchar(50)),
            c_Amount = cast(sum(c_Amount) over(partition by Customer, DocumentType) as varchar(50)),
            c_Quantity = cast(sum(c_Quantity) over(partition by Customer, DocumentType) as varchar(50))
        from BasisTabelle
    ) d
    cross apply
    (
        select ''c_DocumentType'', d.c_DocumentType union all
        select ''c_Amount'', d.c_Amount union all
        select ''c_Quantity'', d.c_Quantity
    ) c (orig_col, value)
) x
pivot
(
    max(value)
    for new_col in (' + @cols + N')
) p'

exec sp_executesql @query;

After executing the command we obtain the desired result as a pivot table in SQL:

Figure 1 – Pivot-Table in SQL (ACL™ CROSSTAB equivalent)

4. Entering user-defined parameters is easier in ACL™ using the built-in “Dialog” feature

When creating the dab:AuditObject, parameters are used which help filtering or limiting the data. In ACL™ this is done by providing a user dialog, where the parameters a user enters are taken into account dynamically. My concrete task was to use parameters like the company code, posting date and language key, plus the option to in- or exclude intercompany transactions. An additional requirement was to be able to input an exchange rate type and a reporting currency to allow automated currency conversions. Doing this in ACL™ is very simple, as a dialog box is a built-in feature of ACL™’s scripting language. Providing the dialog and using the input parameters as variables all happens in only a single line of ACL™ code.

Figure 2- Example dialog in ACL™ (built-in feature)

For SQL however, achieving the same is not so easy, as SQL does not contain any commands that enable the creation of graphical elements or GUI parts like a dialog box. Of course, it is possible to set filters in the code by using variables or WHERE clauses. If you want to create a query which only returns records complying with a certain restriction, for example displaying texts from table DD07T only in German (where the field DD07T_DDLANGUAGE has the value “D” for Deutsch) with DOMNAME = VGABE, it can be achieved using the WHERE clause:
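
Such a query might look as follows (a minimal sketch; the SELECT * is an assumption, only the filter fields are taken from the description above):

```sql
-- Show only German texts ('D') for domain VGABE
SELECT *
FROM DD07T
WHERE DD07T_DDLANGUAGE = 'D'
  AND DOMNAME = 'VGABE'
```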



Figure 3 – Result of SELECT within WHERE clause

Yet this is only one query. Now imagine you need to filter several tables in an SQL script for that very same language key; then this value would be hardcoded all over the script code. To adjust it, for example changing the value if you only want to display English texts, each occurrence must be changed separately. A better way of working is to use variables instead. You initialize them at the very beginning, and this is the one single place in the code where the value is set. In SQL this can be done by using DECLARE to declare a variable. The variable name has to begin with “@”, followed by a declaration of data type and length. SET then assigns the desired value to the variable.


DECLARE @setLanguage AS VARCHAR(3);
SET @setLanguage = 'E'


By doing this our variable can be used in several places, but value changes can be done centrally. The same approach I showed for the language key can of course be used to filter by company codes, date intervals and so on.
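
Applied to the language-key example, every query then references the variable instead of a hardcoded literal (a sketch reusing the table and field names from above):

```sql
DECLARE @setLanguage AS VARCHAR(3);
SET @setLanguage = 'E';

-- one of possibly many queries driven by the same central variable
SELECT *
FROM DD07T
WHERE DD07T_DDLANGUAGE = @setLanguage
  AND DOMNAME = 'VGABE';
```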

However, adding and changing parameters directly in the code is not very user-friendly; having a user dialog would be a lot nicer. It allows you to enter the values easily without touching the actual script code. But as SQL does not have a feature like this, I needed to use another programming language. I decided to go for C#.

With C# it is possible to connect to the database where the data resides and to execute SQL queries in that context. For this purpose, the SQL script is stored in a file which gets opened, read and executed.


Connecting to the database with Windows authentication looks as follows:

SqlConnection ConnectionString = new SqlConnection("Data Source=(local)\\SQLEXPRESS;" + "Initial Catalog=dabTest_DB_01;" + "Integrated Security=sspi");


We need to have the path ready where the script file is stored:

FileInfo File_Query1= new FileInfo(@"C:\BeispielOrdner\Query1.sql");


The file is accessed and the script is read and stored in a variable as a string:

string Query1_string = File_Query1.OpenText().ReadToEnd();


With ConnectionString.Open() the connection to the SQL server is established. To be able to run the query, the query code which was read from the file must be handed over to a “SqlCommand” in the context of the currently active connection. After doing this the query can be executed. The required SqlCommand object needs to be created beforehand though.


This part of the code opens the connection and creates the SqlCommand object:

ConnectionString.Open();
SqlCommand Command_Query1 = new SqlCommand(Query1_string, ConnectionString);


The object “Command_Query1” now includes the query as string and has a data connection to the SQL database. Now the only thing left is to actually execute the query:

Command_Query1.ExecuteNonQuery();

After the script is run, we close the connection:

ConnectionString.Close();

Now the only thing left is the creation of the dialog box. Using Windows Forms we can easily design one. I made the layout and all GUI elements similar to the existing ACL™ solution:

Figure 4 – C# dialog of input parameters

If parameters are entered, they need to be assigned to the variables in the SQL script. This is done by using the “SqlParameter” object. Let’s assume our SQL query contains the following part:

WHERE DD07T_DDLANGUAGE = @Language

The variable @Language should have a value according to the user’s input. To achieve this, the value gets stored in a string, which will then be passed on to the variable:

string Language_string = textBoxLanguage.Text;


In order to assign the value to the variable, a new parameter is added to the SqlCommand:

Command_Query1.Parameters.Add("@Language", SqlDbType.VarChar, 3).Value = Language_string;


Now we can run the SqlCommand object to execute our query using "ExecuteNonQuery()" - but this time including our parameter.
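
Putting the individual snippets together, the whole flow might look like this (a condensed sketch; it assumes the code lives in the Windows Forms dialog class where textBoxLanguage is defined, and the handler name buttonOK_Click is illustrative):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Windows.Forms;

// Assumed: event handler inside the dialog's Form class
private void buttonOK_Click(object sender, EventArgs e)
{
    // connect with Windows authentication
    SqlConnection ConnectionString = new SqlConnection(
        "Data Source=(local)\\SQLEXPRESS;" +
        "Initial Catalog=dabTest_DB_01;" +
        "Integrated Security=sspi");

    // read the SQL script from disk
    FileInfo File_Query1 = new FileInfo(@"C:\BeispielOrdner\Query1.sql");
    string Query1_string = File_Query1.OpenText().ReadToEnd();

    // open the connection and prepare the command
    ConnectionString.Open();
    SqlCommand Command_Query1 = new SqlCommand(Query1_string, ConnectionString);

    // pass the user input to the @Language parameter of the script
    string Language_string = textBoxLanguage.Text;
    Command_Query1.Parameters.Add("@Language", SqlDbType.VarChar, 3).Value = Language_string;

    // execute the script and clean up
    Command_Query1.ExecuteNonQuery();
    ConnectionString.Close();
}
```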

III. Conclusion

Yes, there are differences between ACL™ and SQL. In fact, the ACL™ script language is more powerful than it may look at first sight. Most of the features that the ACL™ script language offers I could port to SQL; however, sometimes bigger workarounds were required to achieve comparable results. Still, sometimes it is necessary to compromise when porting ACL™ code to SQL, for example when it comes to user-friendly column labels or the missing RELATION command. Creating dialogs was more flexible in my solution; however, a second programming language (C#) was needed, plus a lot more code was required to achieve the same thing.

The following table provides an overview in a nutshell:



Topic: Working with multiple tables
ACL™ Script: Besides the classic JOIN, ACL™ offers the possibility to work with RELATIONS.
SQL Script: SQL provides only JOINs, which can get slightly confusing across a variety of tables.
Problem: ACL™ has better access to the data through its relations: a relation selects only one matching record, while SQL selects every matching combination.
Solution: Sort the data ascending or descending; with SELECT TOP 1 you get only one matching record.

Topic: Column names
ACL™ Script: ACL™ provides technical column names and column headers.
SQL Script: SQL can only display one column name.
Problem: Technical names like EKPO_EBELP are hard to read and understand in SQL, as the alternate column title, which would make it easier to understand, is missing.
Solution: After creating the AuditObject, the technical name is put in brackets and the title in front of it.

Topic: Dynamic creation of crosstabs
ACL™ Script: In ACL™, the crosstab is created with CROSSTAB on an expression.
SQL Script: The same can be created in SQL with PIVOT, but not as easily and not as dynamically.
Problem: SQL should provide the same as ACL™’s CROSSTAB: only columns for values which exist in the database, adapting dynamically to changes.
Solution: Through dynamic SQL and variables you can avoid static queries.

Topic: Dialog
ACL™ Script: ACL™’s scripting language provides elements to create a dialog.
SQL Script: SQL includes no way to build a dialog box.
Problem: Filter variables filled by user input are not possible in SQL out of the box; you need to code these instead of having ready-made graphical elements.
Solution: A dialog box can be created with C#. It reads the user input and passes it to the variables in the SQL script.


For any comments on this article, feel free to write us at info@dab-gmbh.de 
