
Hi,

We are currently migrating cubes from Power BI Report Server (on-premises) to Power BI Premium.
Those cubes are not maintained in TX, and refreshes are triggered through an SSIS package called by TX at the end of the ETL execution process.

In the new implementation (where the tabular models are still not maintained in TX), we will be using XMLA endpoints and would like to trigger partition refreshes from within TX without using an SSIS package.

What would be the best approach to tackle this, taking into account that we’re running on an Azure platform (TX on an Azure VM and Azure SQL DB for the DWH)?

We were thinking of using an ADF pipeline or REST API calls, but these aren’t supported by TX version 20.10.x.

Any ideas on possible alternatives ?

 

Thanks

Peter

Hi Peter,

This guide describes a technique that uses Azure Logic Apps to trigger a refresh of an Azure Analysis Services (AAS) model. Explore whether a similar technique is applicable in your scenario.

Trigger refresh for Azure Analysis Services Models built outside TimeXtender

 


@peter.jensen another alternative could be to execute a PowerShell script. The attached document shows an example of how a notification can be set up within TimeXtender: it creates an entry in the event log upon package execution, which then executes a PowerShell script through a cmd file and a task in Task Scheduler. The document describes how to run multiple execution packages sequentially on failure, but it might be helpful in achieving the use case you are describing as well.


All,
rephrasing this a little bit:
If we can run an ADF pipeline, we can solve this.

So, how can we call an ADF pipeline from within TimeXtender (version 20.10.x running on an Azure VM, using Azure SQL DB - not a Managed Instance! - so no xp_cmdshell, no CLR, ...)?

Thanks,

Peter
 


Perhaps you could set up a REST data source and call the ADF REST API for a pipeline you prepare: https://learn.microsoft.com/en-us/rest/api/datafactory/pipelines/create-run?tabs=HTTP

We have done similar things with Qlik Cloud, triggering reloads from a CData REST call.
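As a rough sketch of that approach, the "Create Run" call is a single authenticated POST against the Azure management endpoint. All names below (subscription, resource group, factory, pipeline) are placeholders, and acquiring the bearer token (e.g. via a service principal) is left out:

```python
# Hedged sketch: trigger an ADF pipeline run via the pipelines/createRun
# REST API. Subscription/resource group/factory/pipeline names are
# placeholders; the bearer token must be obtained separately (e.g. AAD
# client-credentials flow for a service principal).
import json
import urllib.request

API_VERSION = "2018-06-01"

def create_run_url(subscription_id, resource_group, factory, pipeline):
    """Build the management URL for the pipelines/createRun operation."""
    return (
        f"https://management.azure.com/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.DataFactory/factories/{factory}"
        f"/pipelines/{pipeline}/createRun?api-version={API_VERSION}"
    )

def trigger_pipeline(token, subscription_id, resource_group, factory, pipeline):
    """POST an empty parameter set to createRun; ADF replies with a run id."""
    url = create_run_url(subscription_id, resource_group, factory, pipeline)
    req = urllib.request.Request(
        url,
        data=b"{}",  # pipeline parameters would go here as JSON
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["runId"]
```

The returned `runId` can then be polled via the "Pipeline Runs - Get" operation if the caller needs to wait for completion.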


I’ve done this by executing a PowerShell script that uses the Invoke-ProcessASDatabase function to process the PBI dataset. To start it from TX 20.x, I created a dummy table and attached a post-script that executes a stored procedure, which starts a pre-configured SQL Agent job that runs the PowerShell script. Not optimal, but it has been working so far. Also, it’s not just a trigger that starts the process: when you execute the dummy table, it also waits until processing finishes. That way, one PBI dataset can finish its processing before the next one is started.


Hi @andri,
thanks for your idea. However, we’re in an Azure environment, so we don’t have a SQL Agent to run the PowerShell script. We are indeed also looking for a solution where the process waits until model processing is finished.
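The "trigger and wait" part can also be done without SQL Agent by calling the Power BI REST API directly: POST a refresh, then poll the refresh history until the newest entry leaves the in-progress state. A minimal sketch, assuming a valid bearer token and placeholder workspace/dataset ids:

```python
# Sketch: trigger a Power BI dataset refresh and block until it finishes,
# using the Power BI REST API. group_id/dataset_id/token are placeholders;
# error handling and token renewal are omitted.
import json
import time
import urllib.request

BASE = "https://api.powerbi.com/v1.0/myorg"

def is_finished(status):
    """Refresh history status values; 'Unknown' means still in progress."""
    return status in ("Completed", "Failed", "Disabled")

def refresh_and_wait(token, group_id, dataset_id, poll_seconds=30):
    headers = {"Authorization": f"Bearer {token}",
               "Content-Type": "application/json"}
    url = f"{BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes"
    # Start the refresh (the API answers 202 Accepted).
    req = urllib.request.Request(url, data=b"{}", headers=headers, method="POST")
    urllib.request.urlopen(req)
    # Poll the most recent refresh history entry until it finishes.
    while True:
        req = urllib.request.Request(url + "?$top=1", headers=headers)
        with urllib.request.urlopen(req) as resp:
            status = json.load(resp)["value"][0]["status"]
        if is_finished(status):
            return status
        time.sleep(poll_seconds)
```

Because the function only returns once the status is terminal, running it per dataset gives the same "finish before the next one starts" behaviour described above.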
 


We have done something similar where a webhook was triggered after the model processing had finished, that in turn started a runbook. In this case the runbook performed some additional steps in Azure Analysis Services that had not been implemented in TimeXtender yet.
I don't remember exactly what mechanism was used to call the webhook, but maybe this method is something you could investigate.


An alternative may be to set a Post Execution script on the execution package. The post execution reads the ExecutionPackageLog table from the TX repository, filtered on the ID of the previous package.

With an Azure Function script or a Power Automate flow you can check for new records in this table, and that becomes your trigger towards Power BI. In our setup we also created the field [PowerBiDatasetStatus], which is set to True when the new record is processed by the script.

To ensure that the values of [PowerBiDatasetStatus] are preserved, we enabled history on this table. Your setup would look something like this:
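To sketch the polling side of this setup: the function fetches log rows that have not yet triggered a refresh and flips the flag afterwards. Table and column names follow the post; the exact row layout is an assumption for illustration:

```python
# Hedged sketch of the check an Azure Function could run against the TX
# repository. The SELECT below assumes the schema described in the post.
QUERY = """
SELECT [Id], [ExecutionPackageId], [EndTime]
FROM [dbo].[ExecutionPackageLog]
WHERE [PowerBiDatasetStatus] = 0  -- new records: not yet pushed to Power BI
"""

def unprocessed_runs(rows):
    """Filter rows fetched from ExecutionPackageLog whose
    [PowerBiDatasetStatus] flag is still False/0."""
    return [r for r in rows if not r.get("PowerBiDatasetStatus")]

def mark_processed(row):
    """After triggering the Power BI refresh for this run, flip the flag
    so the (history-enabled) table records that it was handled."""
    row["PowerBiDatasetStatus"] = True
```

Each new row found by `unprocessed_runs` becomes one refresh trigger towards Power BI, and `mark_processed` keeps the function idempotent across polling intervals.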

 


Hi @peter.jensen, did any of the ideas above fulfil the use case you are trying to achieve? If so, please help us by marking the best answer.


Hi @peter.jensen just following up - did you manage to try out any of the ideas above?

