
Fabric Lakehouse Ingest Instance storage (Public Preview)

  • April 2, 2024
  • 7 replies
  • 1065 views
Thomas Lind
Community Manager

Use Microsoft Fabric Lakehouse as Ingest Instance storage, and ingest data sources into Microsoft Fabric OneLake in Delta Parquet format.

Fabric Ingest instance storage is available as part of the Standard, Premium, or Enterprise package.

A public preview of this feature is currently available. This public preview does not currently support:

  • TimeXtender SAP Table data source
  • Ingest Storage Management tasks
  • Ingest Security Roles
  • Transfer to Prepare instance with Snowflake storage
  • ADF Transfer to Prepare instance

Prerequisites

  1. Create an App Registration in the Azure Portal. It is recommended to use a dedicated app registration so that this account is the only one with access to the client credentials.
  2. Add a user that does not require multi-factor authentication (i.e. a non-MFA account) as Owner of the App Registration, in order to allow for unattended re-authentication
  3. Add a platform, select Mobile and desktop applications, enter https://localhost as the custom redirect URI, click Configure, and then click Save

    This will also cause several default Redirect URIs to be added automatically
  4. Navigate to Authentication settings for the App Registration. Set Allow public client flows to Yes 
  5. In the Fabric/Power BI Admin Portal, enable “Allow service principals to use Power BI APIs” in order to grant the app registration access to the Fabric workspace.
  6. Create a workspace, or navigate to an existing workspace, in the Fabric portal and select Manage access. Grant the App Registration and the non-MFA account Member access to the Fabric workspace.
  7. Set the Runtime version of the Fabric workspace to 1.2. Navigate to your Fabric workspace, click Workspace settings, and under Data Engineering/Science, click Spark settings, select Runtime version 1.2, and click Save.
  8. In the Fabric admin portal, enable the tenant setting that allows “users to access data stored in OneLake with apps external to Fabric”
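The non-MFA and public-client requirements in the steps above stem from the OAuth 2.0 resource owner password credentials (ROPC) flow: it lets a public client obtain tokens unattended with a stored username and password, but it cannot complete an MFA challenge. A minimal sketch of what such a token request looks like (the endpoint and scope follow Microsoft identity platform conventions; the helper itself is illustrative and not TimeXtender's actual implementation):

```python
from urllib.parse import urlencode

def build_ropc_token_request(tenant_id: str, client_id: str,
                             username: str, password: str):
    """Build the token endpoint URL and form body for an ROPC request.

    The request is returned rather than sent; a real client would POST
    the form body to the URL and receive an access token in response.
    """
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "password",   # ROPC: fails if MFA is enforced on the account
        "client_id": client_id,     # public client flow, so no client secret here
        "scope": "https://api.fabric.microsoft.com/.default",  # assumed scope
        "username": username,
        "password": password,
    })
    return url, body
```

This is why step 4 (Allow public client flows = Yes) and the non-MFA account go together: without both, the unattended re-authentication fails.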

     

Configure Ingest Instance for Fabric Lakehouse Storage

  1. Add a new Ingest instance and select the storage type Microsoft Fabric Storage
  2. Enter the workspace name for the existing Fabric workspace
  3. Provide a name for the Lakehouse

    Note: You can connect to an existing Lakehouse that has been created directly in the Fabric Portal, or you can choose to create the Lakehouse within TimeXtender Data Integration (TDI).

     
  4. Enter the user name and password for the non-MFA user that was set up as an Owner in the App Registration

     

  5. Enter the Tenant ID for the tenant associated with Fabric
  6. Enter the Application ID for the App Registration
  7. Enter the Application Key (i.e. the client secret value) associated with the App Registration
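The workspace and Lakehouse names entered above also determine where the ingested data lands in OneLake. A small sketch of how the ABFS paths are composed (the URI scheme follows Microsoft's OneLake documentation; the helper and names are illustrative):

```python
def onelake_paths(workspace: str, lakehouse: str) -> dict:
    """Compose the OneLake ABFS roots for a Lakehouse's Tables and Files areas."""
    root = (f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
            f"{lakehouse}.Lakehouse")
    return {
        "tables": f"{root}/Tables",  # Delta tables created by transfer tasks
        "files": f"{root}/Files",    # temporary Parquet and metadata JSON files
    }
```

These are the locations where the Delta Parquet tables and temporary files described later in this guide appear.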

Create the Fabric Lakehouse Ingest Instance Storage

  1. Open the TimeXtender Data Integration (TDI) application and open the Ingest instance
  2. Right-click the Ingest instance in the Solution Explorer tab and select Authenticate. Log in with the non-MFA user and accept the requested permissions in order to give your App Registration the correct scopes
  3. If you haven't created the Lakehouse already, you can do so now by right-clicking the Ingest instance in the Solution Explorer and selecting Edit Instance, and then Create Storage
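If you want to confirm that the Lakehouse was created, the Fabric REST API exposes a list endpoint for Lakehouses in a workspace. A hedged sketch of the request (the endpoint is per Microsoft Learn's Fabric REST API reference; note it takes the workspace GUID, not the workspace name, and the request is built but not sent here):

```python
def list_lakehouses_request(workspace_id: str, token: str):
    """Build a GET request for the Lakehouses in a Fabric workspace.

    workspace_id is the workspace GUID; token is a bearer token obtained
    for the App Registration, e.g. via the flows shown earlier.
    """
    url = (f"https://api.fabric.microsoft.com/v1/workspaces/"
           f"{workspace_id}/lakehouses")
    headers = {"Authorization": f"Bearer {token}"}
    return url, headers
```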

Objects created in Fabric

A notebook is created and run when a transfer task is executed for a data source in a Fabric Ingest instance. You can see these notebooks being created and run within the Fabric portal under Monitor. The notebook creates a Delta Parquet table in the Lakehouse (named LoadTables_<data source name>_<uuid>) and a temporary Parquet file in the Lakehouse Files folder. The data from the temporary Parquet file is transferred to the Delta Parquet table, and the temporary file is then deleted. JSON files are also created in the Lakehouse Files folder to capture the data source metadata.
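The stage-then-transfer-then-delete pattern described above can be sketched as follows, with plain text files standing in for the Parquet/Delta artifacts (the directory layout and naming mirror the description above; the helper is illustrative, not the actual notebook code):

```python
import uuid
from pathlib import Path

def load_table(lakehouse: Path, source_name: str, rows: list) -> Path:
    """Mimic the notebook's load pattern: stage data as a temporary file
    under Files/, transfer it into the target table under Tables/, then
    delete the temporary file."""
    table_name = f"LoadTables_{source_name}_{uuid.uuid4()}"
    tmp = lakehouse / "Files" / f"{table_name}.tmp"
    table = lakehouse / "Tables" / table_name
    tmp.parent.mkdir(parents=True, exist_ok=True)
    table.parent.mkdir(parents=True, exist_ok=True)
    tmp.write_text("\n".join(rows))    # temporary staging file in Files/
    table.write_text(tmp.read_text())  # transfer into the "table" in Tables/
    tmp.unlink()                       # temporary file is removed afterwards
    return table
```

After a run, only the table remains under Tables/ and the Files/ staging area is empty again, matching the behavior described for the generated notebooks.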

 


7 replies


Is there any workaround for tenants who require MFA?  We have quite a few who absolutely will not create a user without MFA.


rory.smith
TimeXtender Xpert
  • January 7, 2025

Hi @david.zebrowitz,

As far as I am aware, Microsoft has not made that possible yet.


Trine Stuhr
Employee
  • January 31, 2025

And if you haven’t already, please give the Idea to remove the non-MFA requirement a vote here: Microsoft Idea 😊


rory.smith
TimeXtender Xpert
  • January 31, 2025

According to Microsoft this has already shipped (https://learn.microsoft.com/en-gb/fabric/release-plan/shared-experiences#fabric-core-rest-apis-support-service-principal), so it probably needs a more specific feature request. I will certainly be pushing for this at Fabcon '25.


Trine Stuhr
Employee
  • January 31, 2025

Executing notebooks with service principals is still not available in the API (we can only create/deploy them): Job Scheduler - Run On Demand Item Job - REST API (Core) | Microsoft Learn


mello.bogaarts
Starter

We were unable to set this up and had to enable this setting in the Power BI/Fabric admin portal:

 


Christian Hauggaard
Community Manager

@mello.bogaarts thanks for bringing this up, I have added this setting to the guide now



