
I am using SSIS packages with OLE DB source/destination connections to extract data from SAP database tables into SQL Server tables.

The problem now is that one table in SAP has 5 million records and it's taking around 2 hours to extract this data into my SQL Server table. I have used the trunc-dump method (truncating the table in SQL Server and dumping the data into it from the SAP table), and I have also tried using the Multiple Hash component to bring in only the updated/new records.

The problem with the hash approach is that it still has to scan the entire table to look for changed/new records, and hence takes almost the same time as the trunc-dump method.

I am looking for a new approach, or a change to the existing one, that reduces the time taken to complete this extraction.

  • Do you have access to the SAP table? Is it possible to add a WhenUpdated datetime field to it and then filter/refresh the data using this field? – Vitaly Borisov May 15 '19 at 21:33
  • If you can, have an index on the `WhenUpdated` field, too. – Brian May 16 '19 at 19:04
  • @VitalyBorisov SAP is proprietary software, and if we make changes to SAP standard tables we lose SAP support. I don't believe my company is ready to make that change. – Rahul May 17 '19 at 12:28
  • Does SAP allow you to build any logic inside it, like in Dynamics? A trigger? – Vitaly Borisov May 17 '19 at 19:25
  • @VitalyBorisov This goes back to my answer to your first suggestion. I wish I could make such changes to the tables, but I cannot. I don't believe Change Data Capture is a possibility. Do you have any other ideas? – Rahul May 20 '19 at 15:33
  • Look into the CDHDR and CDPOS tables to see which records have changed – Vitaly Borisov May 20 '19 at 20:05
  • @rahul SAP is a software company, not a piece of software. Which SAP product do you use: SAP ERP, S/4HANA, C/4HANA, BW, or something else? You seem to access the SAP database directly; which RDBMS is it? – Sandra Rossi Jul 01 '23 at 17:51

1 Answer


You mentioned using an OLE DB source connection to access SAP. If that means you are accessing SAP's underlying database directly, you should pause doing so until there is explicit IT approval, for three reasons:

  1. You bypass SAP's application-layer security, which can create an enterprise security compliance issue;
  2. Your company's SAP license may not allow it. If your company only has an SAP indirect-access license, you may have to stay at the application layer;
  3. You will not get SAP's official support if you access the underlying database directly.

You have multiple options for fetching data with SSIS through the SAP application layer:

  1. Use commercial SSIS custom components for this job (disclaimer: AecorSoft is one of the leading vendors offering such connectivity components);
  2. Look into SAP's own OData Gateway interface to consume data;
  3. Ask your SAP ABAP team to write custom ABAP programs that dump SAP data into CSV files, and then use SSIS to fetch them (see the load sketch after this list).
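
For option 3, once the ABAP program has written the file, the SQL Server side of the load can be as simple as a flat-file source in SSIS, or a BULK INSERT fired from an Execute SQL Task. Here is a minimal sketch, assuming a pipe-delimited file on a share the SQL Server service account can read; the staging table and path are hypothetical placeholders:

```sql
-- Hypothetical staging table and file path; adjust the delimiters to
-- whatever the custom ABAP extract program actually writes.
BULK INSERT dbo.VBAK_Staging
FROM '\\etlshare\sap\vbak.csv'
WITH (
    FIELDTERMINATOR = '|',   -- pipe is safer than comma for SAP text fields
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2,     -- skip the header row
    TABLOCK                  -- allows a minimally logged bulk load
);
```

Loading into a staging table first also sets up the MERGE-based delta pattern discussed below.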

Let's now look at the performance side:

SAP ETL performance depends on many factors, but in general, even for SAP transactional tables with 100+ columns, two hours for 5 million rows is considered very slow. For example, we've seen the standard SAP General Ledger header table BKPF (almost 100 columns) extracted at a consistent rate of 1M rows every 1-2 minutes. That performance was achieved with a commercial component and SSIS, but you should expect at least 1M rows per 10 minutes even with option 3 above, going through an intermediate CSV file. Under the hood, all three options go through the SAP application layer and therefore use SAP Open SQL (as opposed to the "Native SQL" the underlying database offers) to access SAP tables; so if you experience an application-layer performance issue, analyze the Open SQL side.

As you also mentioned the updated/new records scenario: this is a typical delta extraction problem. Normally, SAP transactional tables have Created Date and Changed Date fields which can help you capture the delta. To avoid a full table scan, have indexes applied through the SAP application layer on those "delta fields". For example, if you need to extract the Sales Document Header table VBAK, you can filter by ERDAT (Created on) and AEDAT (Changed on).

Delta is a complex subject in SAP. There is no simple statement that describes the delta solution, because SAP data models are complex and differ widely across functional modules; the delta analysis is always a case-by-case effort. Some people may simply recommend "delta extractors", but don't treat those as a silver bullet, because extractors have problems of their own. In short, if you go with table-based extraction, work with your SAP functional team to determine the suitable delta fields. Avoid full table scans and hashing: do an incremental load with some optional overlap of the previous extract (e.g. loading today's and yesterday's records), and MERGE the result to absorb the changes (see the sketch below).
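
To make that concrete, here is a minimal T-SQL sketch of the incremental pattern, assuming the SSIS data flow has already landed the delta rows (filtered on ERDAT/AEDAT with a one-day overlap) into a staging table. `dbo.VBAK_Staging` and `dbo.VBAK_Target` are hypothetical names; VBELN, ERDAT, AEDAT, and NETWR are standard VBAK fields, with the remaining columns elided:

```sql
-- Source-side filter used to fill the staging table (conceptually):
--   SELECT ... FROM VBAK WHERE ERDAT >= :from_date OR AEDAT >= :from_date
MERGE dbo.VBAK_Target AS tgt
USING dbo.VBAK_Staging AS src
    ON tgt.VBELN = src.VBELN            -- sales document number = key
WHEN MATCHED THEN
    UPDATE SET tgt.ERDAT = src.ERDAT,
               tgt.AEDAT = src.AEDAT,
               tgt.NETWR = src.NETWR    -- ...repeat for remaining columns
WHEN NOT MATCHED BY TARGET THEN
    INSERT (VBELN, ERDAT, AEDAT, NETWR)
    VALUES (src.VBELN, src.ERDAT, src.AEDAT, src.NETWR);
```

Because of the one-day overlap, rows extracted yesterday can reappear today; the MERGE absorbs them as updates rather than creating duplicates.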

There are a few cases where you may not find any delta field at all, and a full load every time is not practical. A good example is the address master data table ADRC. If you are required to do a delta load on such a table, you either have to ask your SAP functional team to work out the delta for you (meaning they inject custom logic into every place where address master data can be created, updated, or deleted), or ask your SAP Basis team to create a database trigger on the underlying table and expose the trigger table at the application layer. You can then create an application-layer view over the main table and the trigger table to do the delta. There is still no direct database access in your solution: the DB-layer trigger is fully managed and controlled by your SAP Basis team, who also support the database.
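
For illustration only, here is roughly what the trigger-table setup could look like if the database under SAP happened to be SQL Server (in practice it may be HANA, Oracle, or Db2, and the objects would be created and owned by your Basis team). `ZADRC_CHANGELOG` and `TR_ADRC_DELTA` are hypothetical names; ADDRNUMBER is ADRC's key field:

```sql
-- Hypothetical change-log table populated by the trigger below.
CREATE TABLE ZADRC_CHANGELOG (
    ADDRNUMBER CHAR(10)  NOT NULL,                          -- ADRC key
    CHANGED_AT DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);
GO
CREATE TRIGGER TR_ADRC_DELTA ON ADRC
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- Log the key of every touched row: inserts/updates arrive in the
    -- 'inserted' pseudo-table, deletes in 'deleted'; UNION de-duplicates.
    INSERT INTO ZADRC_CHANGELOG (ADDRNUMBER)
    SELECT ADDRNUMBER FROM inserted
    UNION
    SELECT ADDRNUMBER FROM deleted;
END;
```

The application-layer view would then join ADRC to ZADRC_CHANGELOG and filter on CHANGED_AT, so each extraction only pulls the addresses touched since the last run.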

Hope this helps!

AecorSoft