Use Azure Data Factory for Data Movement

Azure Data Factory is ideal for moving large data sets. With Azure Data Factory, data is transferred directly from source to target and therefore does not rely on the limited resources of the application server. 

TimeXtender can use Azure Data Factory (ADF) to move data in the following two ways:

  • Source to ODX
  • ODX to Data Warehouse

The following guide will show you how to: 

  1. Create an Azure Data Factory Resource
  2. Transfer from Source to ODX using Azure Data Factory
  3. Transfer from ODX to Data Warehouse using Azure Data Factory

Create an Azure Data Factory Resource

Note: If you have already created a Data Factory you wish to use, you can skip this step.

  1.  Azure Portal -> Create a new Resource -> Data Factory -> Create 
  2.  Select Version = V2
  3.  Assign Subscription name, Resource Group, and Location
    1. Git configuration is not required, so you can disable it.
  4. Once deployed, please note the following properties of the Data Factory which will be needed later:
    1. Azure Data Factory Name
    2. Subscription ID
    3. Resource Group Name
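These three properties uniquely identify the factory in Azure. Purely as an illustration (not part of the TimeXtender setup itself), they combine into the factory's Azure resource ID, which is also the scope used later for role assignments. The function name and example values below are hypothetical:

```python
def adf_resource_id(subscription_id: str, resource_group: str, factory_name: str) -> str:
    """Build the ARM resource ID for an Azure Data Factory (V2) instance."""
    return (
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.DataFactory/factories/{factory_name}"
    )

# Example with placeholder values:
print(adf_resource_id("00000000-0000-0000-0000-000000000000", "my-rg", "my-adf"))
```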

Create an App Registration

To access the Data Factory resources from TimeXtender, you will need to configure an App Registration in the Azure portal. 

  1. In the Azure Portal menu, click on Azure Active Directory, then click on App Registrations in the menu bar on the left. Then click New Registration.       
  2. Enter a name and select Accounts in this organizational directory only. The value of Redirect URI is the URL at which your application is hosted. Click Register when you are done.
  3. For the newly added App Registration, select Certificates & secrets and create a New Client Secret. The secret value appears after you click Add and cannot be retrieved later, so copy it and store it somewhere safe.
  4. Please note the following properties of the App Registration which will be needed later:
    1. Azure Tenant ID
    2. Application ID
    3. Client Secret (Application Key)
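TimeXtender uses these three values to authenticate with Azure AD via the standard OAuth 2.0 client-credentials flow. The sketch below only builds the token request it corresponds to, without sending it; the endpoint and scope are the standard Azure Resource Manager values, and the function itself is illustrative, not part of TimeXtender:

```python
def build_token_request(tenant_id: str, application_id: str, client_secret: str):
    """Build the OAuth 2.0 client-credentials token request for Azure Resource Manager."""
    endpoint = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    payload = {
        "grant_type": "client_credentials",
        "client_id": application_id,       # Application ID from the App Registration
        "client_secret": client_secret,    # Client Secret (Application Key)
        "scope": "https://management.azure.com/.default",  # ARM scope
    }
    return endpoint, payload
```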

Enable App Registration access to Data Factory

After the App Registration is created, you need to configure access to Data Factory.

Note: The following steps for access control describe the minimum permissions required in most cases. In your production deployment, you may fine-tune those permissions to align with your business rules and compliance requirements. Refer to Microsoft Azure documentation for details.

  1. Go back to the resource group where your data factory resource is located and select the Data Factory resource.
  2. In the menu bar on the left, select Access Control (IAM) and add a role assignment.
  3. Assign the App Registration you just created to the Data Factory Contributor role on the resource.

Note: When you add or remove role assignments, wait about five minutes before executing an ODX task; it can take up to 30 minutes for changes to take effect. For more details, see the article Troubleshoot Azure RBAC.
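If you prefer to script the role assignment instead of using the portal, the equivalent Azure CLI command looks like the following sketch. The values in angle brackets are placeholders you must replace with your own before running:

```shell
# Grant the App Registration the Data Factory Contributor role on the factory.
# Replace the <placeholders> with your own values before running.
az role assignment create \
  --assignee "<Application ID>" \
  --role "Data Factory Contributor" \
  --scope "/subscriptions/<Subscription ID>/resourceGroups/<Resource Group>/providers/Microsoft.DataFactory/factories/<Data Factory Name>"
```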


Transfer Data from Source to ODX using Azure Data Factory

  1. Open your ODX server in a tab
  2. Right click Data Sources and click Add Data Source...
  3. On the first page, enter a Name and (optional) Description for your data source and click Next
  4. On the Provider page, select one of the Azure Data Factory providers and click Next
  5. On the Connection Info page, enter the below information and click Next
    • Azure Data Factory Info
      • Azure AD App Registration created in the above section
        • Application ID
        • Application Key (Client Secret)
      • (Optional)  Azure Data Factory folder name - pipelines and datasets will be placed in this folder in ADF.
      • Azure Data Factory Name
      • Resource Group
      • Subscription ID
      • Azure Tenant ID
    • Database Connection Info
      • Database
      • Username
      • Password
    • Execution Connection
      • Data source:  fully qualified server name of the data source - This is the connection property used by Azure Data Factory when extracting data. 
      • Integration runtime name: This is only required when connecting to on-premises data sources. If using this option, you must also configure a Self-Hosted Integration Runtime
    • Synchronization Connection 
      • Data source: (Required) fully qualified server name of the data source
        • This is the connection property used by the ODX Server when synchronizing metadata (Schemas, Table Names, Field Names & Data Types).
        • Note: Typically, the Execution connection and Synchronization connection data source addresses will be the same. However, they can differ, for example when Azure Data Factory and the ODX Server sit on different sides of a network boundary. The data source must be accessible to both the ODX Server and Azure Data Factory. 

Continue setting up the ODX data source as usual. 
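It can help to think of the values above as three groups of settings: the Azure Data Factory identity, the database connection, and the execution/synchronization connections. A small illustrative check of the required Azure Data Factory fields follows; the field names mirror the list above, but the function itself is hypothetical, not part of TimeXtender:

```python
# Required Azure Data Factory connection fields (the folder name is optional).
REQUIRED_ADF_FIELDS = {
    "application_id", "application_key", "data_factory_name",
    "resource_group", "subscription_id", "tenant_id",
}

def missing_adf_fields(settings: dict) -> set:
    """Return the required Azure Data Factory fields that are absent or empty."""
    return {f for f in REQUIRED_ADF_FIELDS if not settings.get(f)}
```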


Transfer from ODX to Data Warehouse using Azure Data Factory

Note: This option is NOT supported when using Azure Synapse SQL Pool (SQL DW), and you MUST configure a separate Azure Data Factory resource for each environment.

  1. Go to Tools menu -> Environment Properties -> Settings
  2. Set "Transfer from ODX with Azure Data Factory" = True 
  3. Specify required properties for Data Factory


Troubleshooting

Connectivity Error:

System.Data.SqlClient.SqlException: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible

Important: The data source needs to be accessible to both Azure Data Factory and the ODX Server. If you receive an error, it is important to isolate which component is unable to access the data source.

If you make changes to the Data Lake or Data Factory configuration, such as Access Control (IAM) role assignments, work through the following checklist:

  • Wait until the changes take effect in Azure (see Troubleshoot Azure RBAC)
  • Execute Task and verify it succeeds
  • Preview Tables in Data Factory to verify it is working OK
  • Synchronize objects in ODX (if you made changes in Data Source)
  • Preview Tables in ODX to verify the Data Lake setup is working OK
  • Deploy and Execute in TimeXtender
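When isolating which component cannot reach the data source, a quick TCP check run from the machine in question (for example, the ODX Server) can rule out basic network problems before you dig into Azure configuration. A minimal sketch; the host and port in the example comment are assumptions to replace with your own SQL Server endpoint (default port 1433):

```python
import socket

def can_reach(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: can_reach("myserver.database.windows.net", 1433)
```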


Take a look at your Azure Data Factory settings: in the Author pane, under the Datasets section, look for the "Destination" dataset created for this data source. 


Click "Test connection" to verify that the connection works. 


Study this detailed troubleshooting guide:

Troubleshoot self-hosted integration runtime in Azure Data Factory
