Theobald Xtract Universal (XU) exposes SAP data over HTTP-JSON endpoints. XU can be used to extract data from SAP into an Ingest instance via a REST data source.

Benefits

  • Schedule executions for all SAP entities (including BAPI, BW Hierarchy, ODP, Query, DeltaQ, etc.) from within TimeXtender. XU supports all SAP entities, unlike the legacy TimeXtender SAP Table data source based on Xtract IS, which only supported TimeXtender-scheduled executions for full table extracts.
  • XU does not require SQL Server Integration Services (SSIS), unlike the legacy TimeXtender SAP Table data source based on Xtract IS.

Prerequisites

Steps

1. Install and Configure XU

Install XU on the TimeXtender application server. Read and follow the steps at 

2. Create Extractions in XU

Read and follow the steps at

You can use all available extraction types that XU offers.

Note: Make sure that all extractions you want to use with the RSD generator tool are configured with the default extraction type http-json if you want to use the generator code without modifications.
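As a quick sanity check, you can call an extraction's HTTP-JSON endpoint directly before generating any RSD files. The sketch below is illustrative only: the `?name=<extraction>` query pattern is an assumption based on XU's default web endpoint, and the extraction name `VBRK_full` is a placeholder; check your XU version's documentation for the exact URL layout.

```python
import json
from urllib.parse import urlencode

def extraction_url(base_url, name, params=None):
    """Build the HTTP-JSON URL for an XU extraction.

    Assumes XU's default '?name=<extraction>' query pattern;
    verify against your XU version's documentation.
    """
    query = {"name": name}
    if params:
        query.update(params)
    return f"{base_url.rstrip('/')}/?{urlencode(query)}"

# Hypothetical extraction name; replace with one defined in XU.
url = extraction_url("http://localhost:8065", "VBRK_full")

# XU returns rows as a JSON array; parsing shown on a sample payload.
sample = '[{"VBELN": "0000000001", "FKART": "F2"}]'
rows = json.loads(sample)
```

Fetching `url` in a browser (or with `urllib.request.urlopen`) should return a JSON array of rows if the extraction is reachable.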

3. Download and unzip the xu-rsd-generator utility

Download the latest version of the xu-rsd-generator utility.
Save and extract the files to a local drive on the TimeXtender application server.

Note: Due to the number of files Python will generate, it is recommended not to place this folder in a OneDrive-synced location.

4. Execute install.bat or follow the install instructions from README.md

This will set up the Python dependencies so that the generator can create RSD files.

5. Edit parameters in xu_rsd_gen\.env

  • Adjust XU_BASE_URL to point to the XU server and port. If you installed XU on the TimeXtender application server, the XU server is simply localhost; the default port for XU is 8065.
  • Point RSD_TARGET_FOLDER to a local folder that you will use for the TimeXtender REST data source.
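For reference, a minimal `.env` might look like the following (the values are placeholders; adjust them to your environment):

```
XU_BASE_URL=http://localhost:8065
RSD_TARGET_FOLDER=C:\TimeXtender\rsd
```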

6. Execute run.bat

Executing run.bat will create the RSD files in the folder defined under RSD_TARGET_FOLDER.

7. Create REST data source

  • Add a data source in the TimeXtender Portal. Select REST from the dropdown.
  • Set the Location property under the Schema section to the RSD Target Folder directory.
  • Map the data source to the Ingest instance. For more info see this guide on how to create and map data sources.

8. Add REST data source to an Ingest Instance in TimeXtender Desktop

Read and follow the instructions here: Configuring a Data Source in TimeXtender Desktop

9. Verify the Tables in the Data Source Explorer and Query Tool

Limitations

1. Incremental loading

If you need to load tables with a large number of records:

  1. Use XU to generate a full load of the large table into a Parquet file. (Make sure the Package size property is set at the extraction level; otherwise the extractions can consume extreme amounts of memory on the SAP source system.)

  2. Configure a timestamp-based sliding window for data extraction, either in the RSD files or on the XU extractions directly, e.g. each load retrieves the last two days' worth of changed data. This is the equivalent of the "subtract from value" feature described in the Incremental Load in an Ingest Instance article.

  3. Combine the static full-load Parquet data source with the sliding-window/incremental data source in the MDW table.
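The combine step above can be sketched in plain Python. Assuming both the full-load data and the sliding-window delta rows carry a primary key (here VBELN, as in the VBRK example discussed below), later delta rows update existing keys and new keys are appended; this mirrors what the MDW table mapping achieves in TimeXtender. All names and sample values are illustrative, not part of XU or TimeXtender.

```python
def merge_full_and_delta(full_rows, delta_rows, key="VBELN"):
    """Combine a static full load with sliding-window delta rows.

    Rows whose key already exists are overwritten by the delta;
    rows with new keys are appended.
    """
    merged = {row[key]: row for row in full_rows}
    for row in delta_rows:  # deltas win over the full load
        merged[row[key]] = row
    return list(merged.values())

# Hypothetical sample data
full = [{"VBELN": "1", "NETWR": 100}, {"VBELN": "2", "NETWR": 200}]
delta = [{"VBELN": "2", "NETWR": 250}, {"VBELN": "3", "NETWR": 300}]
result = merge_full_and_delta(full, delta)
# Key "1" unchanged, key "2" updated to 250, key "3" appended.
```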

2. Pagination

XU does not currently support pagination of long tables. During testing, even tables with millions of records caused no performance problems on either XU or the TimeXtender Ingest Service. Make sure the Package size property is set on the extractions; otherwise they can consume extreme amounts of memory on the SAP source system. If you are concerned about large tables being streamed over HTTP-JSON, consider the incremental-loading approach described in point 1 above.

Question: if you want to use DeltaQ (to extract BSEG + BKPF incrementally, for instance), does this require anything special with respect to the process for using DeltaQ in the 20.10.x ODX Server?


@rory.smith I haven’t had a chance to test this with DeltaQ yet; however, it should work pretty much the same as before.

There is now also CDC implemented by Theobald; I’m looking forward to testing/verifying that as well.


@fwagner Can you explain in a little more detail how to do incremental loading with Theobald? I have my full load running for table VBRK. In XU, I created another extraction called Incremental_VBRK that gets all records from -2 days until today. I’m not sure how to properly combine these two different tables into one where the records are only updated or appended. I feel that the documentation about incremental loading doesn’t fully apply when using XU, but I might be wrong.

