This article describes how to set up incremental loading for an Ingest instance. See Incremental load in Prepare Instances for more information on setting up incremental loading in a data area within a Prepare instance.
Set up incremental loading in an Ingest instance
In Ingest instances, the incremental load setup is based on rules that are set up to apply to specific columns. When there is a “hit”, incremental load can be applied, only loading data where the value is larger than the maximum value from the previous load.
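As a simplified illustration of this concept (not TDI's actual implementation; the column name and values below are made up), the filter works along these lines:

```python
# Simplified illustration only - not TDI's implementation.
previous_max = "2024-05-31T23:59:59"  # highest value seen in the previous load

source_rows = [
    {"Id": 1, "ModifiedDate": "2024-05-30T10:00:00"},  # already loaded
    {"Id": 2, "ModifiedDate": "2024-06-01T08:15:00"},  # new since the previous load
]

# Equivalent to adding "WHERE ModifiedDate > <previous max>" on a database source.
rows_to_load = [row for row in source_rows if row["ModifiedDate"] > previous_max]
print(rows_to_load)  # only the row with Id 2 is loaded
```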
Before you set up these rules, we recommend that you set up the primary keys on the tables that will be incrementally loaded. See Set up Primary Key Rules on an Ingest Instance Data Source for more information.
To add an incremental load setup rule, follow the steps below.
- Right-click the data source and click Set Up Incremental Load. The Incremental Load window opens.
- Click Add… to add a new rule.
- In the Add Incremental Load Setup Rule window, enter the Schema, Table, and Column containing the value the incremental load should be based on. For each item, you can choose different comparison options: contains, equals, not contains, not equals, and case-insensitive variants of each. This makes it easy to set up a rule that applies across the data source if all tables contain the same useful incremental column, e.g. “ModifiedDate”.
- Subtract from value: This option allows subtraction from the field to which the rule applies. It is useful for data sources where the modified date is a Date field instead of a DateTime field. It can also be used for data sources that have a created date but no modified date, where incremental loading would improve performance. In such cases, incremental loading based on the created date with an offset allows updates to recently created rows to be picked up, provided the changes happen within the defined interval (see the sketch after these steps). Amount specifies how much to subtract, for example from a timestamp. Time defines the unit (Seconds, Minutes, Hours, Days, Weeks, or Years) that will be subtracted from the last added DateTime.
- If rows can be updated or deleted in the source table(s) you are loading from, the settings under Additional actions can keep the tables up to date without regular full loads. Select Handle primary key updates to update rows in the Ingest storage when a row with the same primary key is loaded from the data source. Select Handle primary key deletes to remove rows from the Ingest storage that have been removed from the data source. If you enable either of these features, a primary key (PK) table or folder will be created where the primary keys are stored for comparison with the source to detect updates or deletes.
- Click OK.
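To illustrate the Subtract from value option mentioned above, here is a minimal sketch with made-up values and a three-day offset; it only shows the effect of the offset, not TDI's internal logic.

```python
from datetime import datetime, timedelta

# Maximum value of the incremental column (e.g. a CreatedDate) from the previous load.
previous_max = datetime(2024, 6, 1, 12, 0, 0)

# Subtract from value: Amount = 3, Time = Days.
load_from = previous_max - timedelta(days=3)

# The next incremental load starts from 2024-05-29 12:00:00, so updates to rows
# created within the last three days are picked up even without a modified date.
print(load_from)
```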
The rules are applied on execution. To see what tables will be affected, click Refresh and review the bottom list in the Incremental Load window.

If you’ve enabled ‘Handle primary key updates/deletes’, TDI will check if the affected tables have a primary key set. If not, the tables that are missing primary keys will be listed in the Incremental Load Primary Key Validation window. Select the primary keys for these tables, then click OK.

Set up incremental load for REST data sources
Performing an incremental load on a REST data source is different from doing so on a traditional database. It is not as simple as tacking on a general WHERE-statement as you could do with a database data source. Each REST data source has its own requirements and behavior.
In order to support incremental load on any REST data source, we need a mechanism to create the HTTP request that the data source requires. For the TimeXtender REST data source, that is accomplished by using dynamic values. That way, we can place the last maximum value at any point in the request where we need it.
The REST API flavor of incremental load does not support handling of updates and deletes, so periodic full loads may be required to keep data completely in sync with the data source.
A quick refresher on dynamic values: Dynamic values are placeholders that are replaced at runtime with values from different sources (another endpoint, a SQL query, etc.). These values can be injected into several places, for instance in the path, headers, query parameters, post body and even the table builder.
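As a simplified sketch of this mechanism (the endpoint, query parameter, and placeholder names below are hypothetical and not tied to any specific data source), a placeholder in a query parameter could be resolved like this:

```python
# Hypothetical endpoint and placeholder names, for illustration only.
request_template = (
    "https://api.example.com/orders?modifiedAfter={TXINC_Orders.ModifiedDate}"
)

# At runtime, the dynamic value is resolved, here to the last maximum value.
dynamic_values = {"{TXINC_Orders.ModifiedDate}": "2024-06-01T12:00:00"}

request_url = request_template
for placeholder, value in dynamic_values.items():
    request_url = request_url.replace(placeholder, value)

print(request_url)
# https://api.example.com/orders?modifiedAfter=2024-06-01T12:00:00
```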
Setting up a REST data source for incremental load is a two-step process. The first step is to select which columns to use for incremental load in TDI, and the second step is to use them as dynamic values in the data source configuration in the TDI Portal.
Step 1: Selecting columns in TDI
To select the columns, follow the steps below.
- Right-click the data source and click Set Up Incremental Load. The Incremental Load window opens.
- Click Add to add a new incremental value column. Unlike the regular functionality, these columns are explicitly selected for each table, not based on rules.
- (1) is the endpoint name.
- (2) is the table name.
- (3) are the available columns. Only columns that can be used for incremental load will be listed.
- Under Incremental load rule, the options for the selected column are set:
- Subtract from value: A predefined constant will be subtracted from the last maximum value. For a column with the ‘DateTime’ data type, this is a time value and the granularity can be changed in the right-hand dropdown.
- Dynamic value name: The dynamic value that has to be used in the data source configuration in order to use the last maximum value. This is the core of how we drive the incremental load for REST data sources. In the TDI Portal, this will be the placeholder for the dynamic value and should be written in curly brackets, e.g. “{TXINC_StandardDataTypeResponse.DateTimeData}”. If you have a subtract value set, the dynamic value will contain the value after subtraction (see the sketch after these steps). The name can be changed, but if it is changed, it must also be changed in the TDI Portal where it is used.
- Full load value: The value that will be used for the dynamic value when a full load, or import metadata, is executed. In an incremental load setup for a database data source, it is easy to do a full load: just remove the WHERE-statement from the query. When doing incremental load on a REST data source, we need a value to include in the query, since sending an empty value wouldn’t work.
- Date time format: How to format the datetime when using it in the data source. Only displayed if the column data type is ‘DateTime’. Choose a format from the list or enter your own. In addition to datetime-like formats, you can select ‘unix_timestamp’ and ‘unix_timestamp_ms’ in case the REST data source requires that.
- Decimal separator: What to use for the decimal separator. Only displayed if the data type supports it.
- Click Use target-based incremental load to enable a form of incremental load for data sources that don’t support real source-based incremental load. With this option enabled, dynamic values are not used. Instead, all data will be loaded from the data source, but only new data will be inserted into the Ingest storage. As a result, the performance improvement over a regular full load is usually limited, and this functionality is only provided as a backup option for when you cannot load just the latest data from the data source. For that reason, Show advanced features must be enabled for the option to be visible.
- Click OK to save your changes.
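To make the interplay between Subtract from value, Date time format, and Full load value concrete, here is a minimal sketch with made-up values; the actual resolution of dynamic values is handled by TDI.

```python
from datetime import datetime, timedelta

# Made-up settings for one incremental column.
subtract = timedelta(hours=1)              # Subtract from value: 1 Hour
date_time_format = "%Y-%m-%dT%H:%M:%S"     # Date time format
full_load_value = "1900-01-01T00:00:00"    # Full load value

last_max = datetime(2024, 6, 1, 12, 0, 0)  # last maximum from the previous load

def resolve_dynamic_value(full_load: bool) -> str:
    """Return the text that replaces the dynamic value placeholder."""
    if full_load:
        return full_load_value
    # If 'unix_timestamp' were selected as the format, this would return epoch seconds instead.
    return (last_max - subtract).strftime(date_time_format)

print(resolve_dynamic_value(full_load=False))  # 2024-06-01T11:00:00
print(resolve_dynamic_value(full_load=True))   # 1900-01-01T00:00:00
```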
Step 2: Setup in the TDI Portal
The changes required in the TDI Portal will depend heavily on the data source you are connecting to. Refer to the data source’s documentation on how to extract only the data you need. In the example below, the data source requires you to add a query parameter that tells the REST data source to start from a specific date and get only data that is newer.
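As an illustration only (the parameter and endpoint names are hypothetical, not taken from a specific data source), the endpoint could be configured with a query parameter named “modifiedAfter” whose value is the dynamic value placeholder, e.g. “{TXINC_Orders.ModifiedDate}”. At runtime, TDI replaces the placeholder with the last maximum value, or with the full load value during a full load, so the request only returns rows changed after that point.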

Enabling incremental load on a task
A transfer task will use incremental load if Use incremental load when available is enabled (and incremental load rules are set up).
