
This is a standard reference architecture to implement TimeXtender Data Integration fully in the cloud with Snowflake for data warehouse storage.

To prepare your TimeXtender Data Integration cloud environment, here are the steps we recommend.

1. Create Application Server - Azure VM

To serve the TimeXtender Data Integration application in Azure, we recommend using an Azure Virtual Machine (VM), sized according to your solution's requirements.

Guide: Create Application Server - Azure VM

Considerations: 

  • Recommended Sizing: Standard_DS2_v2 (for moderate workloads). See the Azure VM sizes documentation for more detail; a quick availability check is sketched after this list.
  • This VM will host the following services and must remain running for TimeXtender Data Integration to function:
    • TimeXtender Ingest Service
    • TimeXtender Execution Service
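
If you want to confirm that the recommended SKU is offered in your target region before provisioning, a minimal sketch using the Azure SDK for Python (azure-identity and azure-mgmt-compute) might look like the following. The subscription ID and region are placeholders, not values from this guide:

```python
# Minimal sketch: check that Standard_DS2_v2 is available in a target region
# before creating the application server VM. Assumes azure-identity and
# azure-mgmt-compute are installed; subscription ID and region are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

subscription_id = "<your-subscription-id>"  # placeholder
region = "westeurope"                       # placeholder region

compute = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

# List every VM size offered in the region and look for the recommended SKU.
sizes = {s.name for s in compute.virtual_machine_sizes.list(location=region)}
if "Standard_DS2_v2" in sizes:
    print("Standard_DS2_v2 is available in", region)
else:
    print("SKU not offered here; some alternatives:", sorted(sizes)[:5])
```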

2. Create Storage for Ingest instance - Azure Data Lake Storage Gen2

ADLS Gen2 is a highly performant, economical, scalable, and secure way to store your raw data.

Guide: Create Ingest Storage - Azure Data Lake Storage Gen2

Considerations: 

  • When creating the ADLS Gen2 storage account, you must enable the hierarchical namespace
  • TimeXtender Data Integration writes files to the data lake in the Parquet file format, a highly compressed, columnar storage format
  • It is possible for the Ingest instance to store data in Azure SQL DB (rather than in a data lake), but this adds cost and complexity without adding functionality
  • When using Azure Data Lake for the Ingest instance and SQL DB for the Prepare instance, it is highly recommended to use Data Factory to transfer this data
  • ADLS will require a service principal, called an App Registration in Azure, for TimeXtender Data Integration to access your data lake; a connection sketch follows this list
    • The Data Lake and ADF may share the same App Registration if desired
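
As a quick smoke test of the App Registration and the data lake, a minimal sketch along these lines writes a small Parquet file the way the Ingest instance would. It assumes azure-identity, azure-storage-file-datalake, and pyarrow are installed; all IDs, the secret, and the "ingest" container name are placeholders, and the container is assumed to already exist:

```python
# Minimal sketch: authenticate with an App Registration (service principal)
# and write a Parquet file into an ADLS Gen2 container. Tenant/client IDs,
# the secret, storage account, and container name are placeholders.
import io
import pyarrow as pa
import pyarrow.parquet as pq
from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",          # from the App Registration
    client_id="<client-id>",
    client_secret="<client-secret>",
)

# The storage account must have the hierarchical namespace enabled.
service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=credential,
)

# Build a tiny table and serialize it as Parquet (columnar, compressed).
table = pa.table({"id": [1, 2, 3], "name": ["a", "b", "c"]})
buf = io.BytesIO()
pq.write_table(table, buf, compression="snappy")

# Assumes the "ingest" container already exists in the account.
file_system = service.get_file_system_client("ingest")
file_client = file_system.get_file_client("raw/example.parquet")
file_client.upload_data(buf.getvalue(), overwrite=True)
print("Wrote raw/example.parquet to the ingest container")
```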

3. Prepare for Ingest and Transfer - Azure Data Factory (Optional)

For large data movement tasks, ADF provides excellent performance and ease of use for both ingestion and transfer.

Guide: Prepare for Ingest and Transfer - Azure Data Factory (recommended)

Considerations: 

  • When creating ADF resources, use Gen2, which is the current default
  • A single ADF service can be used for both ingestion and transfer
    • Ingestion from data source to Ingest storage
    • Transfer from Ingest instance to Prepare instance
  • The option to use ADF is not available for all data source types, but many common sources are supported
  • ADF data sources do not support Ingest Query Tables at this time
  • ADF's fault-tolerant performance comes at a price, so monitor usage to keep costs in check
  • ADF will require a service principal, called an App Registration in Azure, for TimeXtender Data Integration to access your ADF service; a sketch for triggering a pipeline run follows this list
    • The Data Lake and ADF may share the same App Registration if desired
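
For reference, triggering and monitoring an ADF pipeline run from Python can be sketched as follows. This assumes the azure-mgmt-datafactory package; the subscription, resource group, factory, and pipeline names are placeholders rather than anything TimeXtender creates for you:

```python
# Minimal sketch: start an ADF pipeline run (e.g. an ingestion from a data
# source into Ingest storage) with the App Registration, then poll until it
# reaches a terminal state. All names and IDs are placeholders.
import time
from azure.identity import ClientSecretCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<client-id>",
    client_secret="<client-secret>",
)
adf = DataFactoryManagementClient(credential, "<subscription-id>")

run = adf.pipelines.create_run(
    resource_group_name="<resource-group>",
    factory_name="<data-factory>",
    pipeline_name="<ingest-pipeline>",
)

# Poll the run until it leaves the Queued/InProgress states.
while True:
    status = adf.pipeline_runs.get(
        "<resource-group>", "<data-factory>", run.run_id
    )
    if status.status not in ("Queued", "InProgress"):
        break
    time.sleep(15)
print("Pipeline finished with status:", status.status)
```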

4. Create Storage for Prepare instance - Snowflake Database

  • Snowflake’s multi-cloud compatibility allows users to deploy on Azure, AWS, or GCP.
  • Automatic usage-based scaling of compute and storage resources provides ideal cost/resource optimization (see the sketch below).
  • Suitable for medium to large data solutions (from 500 GB and up), or in cases where the estimated size of the data solution is uncertain and/or might grow rapidly.

Guide: Use Snowflake for Prepare Instance Storage
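
A minimal sketch of provisioning the Snowflake side with the snowflake-connector-python package follows. The account, credentials, and the TX_PREPARE_WH / TX_PREPARE_DB names are illustrative placeholders; the AUTO_SUSPEND / AUTO_RESUME settings are what make compute billing usage-based:

```python
# Minimal sketch: create a Prepare-instance database and an auto-suspending,
# auto-resuming warehouse so compute only bills while queries actually run.
# Account, user, password, and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account-identifier>",
    user="<user>",
    password="<password>",
    role="SYSADMIN",
)
cur = conn.cursor()

# Warehouse suspends after 60s idle and resumes automatically on the next query.
cur.execute("""
    CREATE WAREHOUSE IF NOT EXISTS TX_PREPARE_WH
      WAREHOUSE_SIZE = 'XSMALL'
      AUTO_SUSPEND = 60
      AUTO_RESUME = TRUE
""")
cur.execute("CREATE DATABASE IF NOT EXISTS TX_PREPARE_DB")
cur.close()
conn.close()
```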

5. Configure Power BI Premium Endpoint (Optional)

If you have Power BI Premium, you can deploy and execute semantic models in Deliver instances directly to the Power BI Premium endpoint.

Guide: Configure Power BI Premium Endpoint (Optional)

Hi,

We have a question: why is Snowflake not also being used as ODX storage? Why is ADLS Gen2 necessary?

Thanks a lot!


Hi @valeriehocepied 

We currently do not support Snowflake "internal stages". We do, however, support Azure Data Lake, which is one type of "external" Snowflake stage. Snowflake's external stages are considered a better fit for our product (i.e. ODX). Please feel free to submit an idea in our ideas section regarding this.
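
To illustrate the difference, here is a minimal sketch (using snowflake-connector-python; the account, SAS token, stage name, and container URL are placeholders) of an external stage that points Snowflake at the ADLS Gen2 ingest container, whereas an internal stage would store the files inside Snowflake itself:

```python
# Minimal sketch: define an external stage over the ingest data lake so the
# Parquet files written by the Ingest instance are loadable from Snowflake.
# Account, credentials, SAS token, and container URL are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account-identifier>", user="<user>", password="<password>",
)
cur = conn.cursor()

# External stage: the data stays in your own ADLS/Blob container.
cur.execute("""
    CREATE STAGE IF NOT EXISTS INGEST_STAGE
      URL = 'azure://<storage-account>.blob.core.windows.net/ingest/'
      CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>')
      FILE_FORMAT = (TYPE = PARQUET)
""")
cur.execute("LIST @INGEST_STAGE")   # enumerate the staged Parquet files
print(cur.fetchall())
conn.close()
```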


Hi @Christian Hauggaard,

Do you have a timeline on when basic features such as lookups, field transformations, etc. will be available in a Snowflake MDW? Also, am I understanding your previous reply correctly that there are no current plans on supporting Snowflake’s “internal stage” in the future?

Thank you!


Hi @pontus.berglund, unfortunately I am not able to provide a timeline for those types of transformations in Snowflake. Regarding Snowflake internal stage, that is correct.


@pontus.berglund lookups and field transformations are now supported in Snowflake DW, among other features. For the full list of supported features, please see:

 

