
Transferring data with Data Transfer packages


Christian Hauggaard
Community Manager

In this article, you are going to transfer data from an external web service to your local TimeXtender Orchestration and Data Quality database.

Data Transfer Packages can be used to transfer data from any Data Provider to an SQL Server destination. Common usage scenarios are:

  • Staging data from a web service to be able to limit requests and join between multiple endpoints
  • Creating a historical table for later use

Data Transfer Packages support every Data Provider as the source and can be deployed quickly.

Create the Data Transfer (v. 24.3 and older)

Create a new Data Transfer Package by right-clicking Packages and selecting New Data Transfer

 

Setting up the Source

All Data Providers can be used as a source in a Data Transfer package.

  1. Start by selecting the appropriate source Data Provider. In this case, we are reading from a web service, so we select PowerShell as the Data Provider.
    When selecting SQL Server or Oracle as Data Provider, you are able to select a preexisting table or a view from the Table dropdown.
  2. Custom can be used to write specific queries against the data provider.
  3. After selecting a table or writing a custom query, click Preview to see the data.

In some cases, you might want to run a pre- or post-execution script on the source data provider. To do this, click Pre/post execution to enter those scripts.
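As a rough illustration, assuming a SQL Server source provider (all table and column names below are hypothetical), these scripts are plain SQL statements run on the source before and after the extract:

-- Hypothetical pre-execution script: clean up a helper table before the extract runs
DELETE FROM dbo.SourceOrdersErrors WHERE CreatedOn < DATEADD(DAY, -30, GETDATE());

-- Hypothetical post-execution script: mark the extracted rows so the next run can skip them
UPDATE dbo.SourceOrders SET Exported = 1 WHERE Exported = 0;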

Configuring the destination

All SQL Server Data Providers are supported as destinations for Data Transfer packages.

  1. Select a Data Provider and a destination table.
  2. If you have not already created the destination table, you can use Create table to create the table.
  3. If you are using the TimeXtender Orchestration and Data Quality database as a destination, it's most common to use the client or staging schema.
  4. Here, you can also set a pre/post-execution script to be executed before or after the transfer. This can be useful for merging to final tables, as shown in the sketch below. Finally, click Create.
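As a sketch of the post-execution script mentioned in step 4, assuming a staging table named client.Orders and a final table named dbo.Orders (example names, not defaults), a merge to the final table could look like this:

MERGE dbo.Orders AS target
USING client.Orders AS source
    ON target.OrderId = source.OrderId
WHEN MATCHED THEN
    UPDATE SET target.CustomerName = source.CustomerName,
               target.Amount = source.Amount
WHEN NOT MATCHED BY TARGET THEN
    INSERT (OrderId, CustomerName, Amount)
    VALUES (source.OrderId, source.CustomerName, source.Amount);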

 

Column Mapping

The final step is to set a mapping between the source and destination data. TimeXtender Orchestration and Data Quality will try to autocomplete this mapping if the names are the same.

Create the Data Transfer (v. 25.1 and newer)

With v. 25.1, we are harnessing the power of merging by introducing staging tables in the Data Transfer packages that O&DQ has been offering. This feature makes it possible to update destination data when it has changed in the source.
By default, merging is enabled for all new tables. Existing tables are not affected until the feature is enabled as outlined below.

To create a new Data Transfer package and enable merging, follow the steps below.
If you have already created a package and would like to enable the feature, skip to step 3.

1. Open the Package context menu and create a Data Transfer package.  

Creating the DT package in O&DQ


2. Select the Data Provider and Source Table, or use the Custom option.

Source tab of the DT package


3. Select the Destination tab.
Select the Data Provider and Table. The first new elements detailing the merging options can be found just below.

Destination tab


Enabling use of the staging tables (and therefore merging) is as simple as clicking the checkbox.
Creating a new package or duplicating an old DT package will enable merging by default.


This enables the schema and table name controls below.

Destination tab with merging enabled


Use the two text boxes to adjust the schema used for the staging table and the table's name.

The default naming scheme is as follows:
[dbo].[StagingTable]

Once you save the package, a validation runs to verify that the schema name and table name are not already in use. In addition, if needed, the database user is checked for schema creation permissions.
The staging table will be reused and truncated during every execution. An audit table will be created alongside this staging table; it contains information on the process execution and also records errors.
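The exact checks are internal to O&DQ, but conceptually the validation resembles the following T-SQL, shown here with the default [dbo].[StagingTable] name purely as an illustration:

-- Is the table name already in use, and does the schema exist?
SELECT OBJECT_ID(N'[dbo].[StagingTable]', N'U');   -- non-NULL means the name is already taken
SELECT SCHEMA_ID(N'dbo');                          -- NULL means the schema would have to be created

-- Can the current database user create a schema, should one be needed?
SELECT HAS_PERMS_BY_NAME(DB_NAME(), N'DATABASE', N'CREATE SCHEMA');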

4. Select the Mapping tab
The new table columns added in this feature are Action, Custom Value, and Collation.

The Action column lets the user choose one of the following:
  • Ignore - do not make any changes to this column
  • PrimaryKey - use this column as the Primary Key in the merge process
  • Update - update this column if the Primary Key matched and the source data has changed

The default action on loading and resetting the mapping is Update.

Action column options

The Custom Value column accepts strings, integers, and basic SQL statements such as GETDATE().
The SQL statement is evaluated only during the merge procedure; the result appears in the destination table.
We have also added a Collation column. The collation you set is used when the data is selected from the source table.
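To make these options concrete, below is a rough sketch of the kind of merge the staging process could produce, assuming OrderId is marked as PrimaryKey, CustomerName as Update with a Latin1_General_CI_AS collation, and LoadedOn carries the custom value GETDATE(). All names and the exact statement shape are illustrative, not the code generated by O&DQ:

MERGE dbo.Orders AS target
USING (SELECT OrderId,
              CustomerName COLLATE Latin1_General_CI_AS AS CustomerName  -- Collation applied when selecting the data
       FROM dbo.StagingTable) AS source
    ON target.OrderId = source.OrderId                                   -- Action = PrimaryKey
WHEN MATCHED THEN
    UPDATE SET target.CustomerName = source.CustomerName,                -- Action = Update
               target.LoadedOn = GETDATE()                               -- Custom Value, evaluated during the merge
WHEN NOT MATCHED BY TARGET THEN
    INSERT (OrderId, CustomerName, LoadedOn)
    VALUES (source.OrderId, source.CustomerName, GETDATE());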

Mapping the columns

If merging is not enabled, these three columns are not editable and revert to default values that do not affect the classic Data Transfer behavior (Update, empty, empty).
If you had previously enabled merging and then disabled it again, the Action column reverts to Update, while the Custom Value and Collation columns revert to empty. This can still be undone until you save the package.

5. After setting up the Mapping tab, follow the standard procedure to save, deploy, and execute the Data Transfer package.

What will be the result of running this package?
Original Data Transfer / No Merging: Data from the source is copied directly to the destination in its entirety. No data is updated, replaced, or omitted.

Data Transfer with Merging: Data from the source is merged into the destination via a staging table:
  • First run: all data is copied to the destination.
  • Consecutive runs where only new entries appeared in the source: only the new entries are transferred to the destination.
  • Consecutive runs where existing source entries were updated: the existing entries in the destination table are updated to match the source.
  • Consecutive runs where some source entries were updated and new entries appeared: existing destination entries are updated to match the source, and the new entries are inserted.

Destination data is never removed, and the destination table is never truncated.
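As a small worked example, suppose the source returns rows (1, A) and (2, B) on the first run, while on the second run row 2 has changed to (2, B2) and a new row (3, C) has appeared. With merging enabled, the destination contains (1, A) and (2, B) after the first run, and (1, A), (2, B2), (3, C) after the second run; row 1 is left untouched and nothing is deleted.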

Next steps

The data transfer is now ready.
