
It would be really beneficial if functionality were added to the semantic layer so that SSAS Tabular partitions could be defined within the tool.

Ideally, one would be able to right-click a table within the semantic model, choose partitioning, pick a field (such as a date field), and configure it so that partitions on, say, Year, Quarter, Month, or Week would be created dynamically with each deployment of the tabular model.


Additionally, once TX had visibility into the metadata of the partitions it has created, it would be extremely beneficial if we could then decide which partitions are executed in a given execution package.

A little background for this request: we are in the middle of a major AX and data warehouse project, and we are building out quite a few SSAS tabular models to be consumed by Power BI.  Unfortunately, with the current setup of the semantic layer, we are having to consider managing tabular models outside of TimeXtender, as we don't have a good way to dynamically build partitions, or to process only the most recent partitions once we create them outside of TX.  (Fully loading a tabular model with billions of rows is not really a viable option.)  The functionality described above would eliminate the need for us to manage SSAS Tabular outside of TX.

While we understand this might take some extensive work in the semantic layer, a fairly quick workaround that would help us greatly would be to add script actions on tables/endpoints within the semantic layer.  Everywhere else in the tool we can add post-deployment or pre/post-execution script actions, but that functionality seems to be lacking in the semantic layer for Tabular.  Having the ability to at least create partitions via a script action after deployment of the endpoint would be a step in the right direction.
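For illustration, here is a minimal sketch of what such a post-deployment script could do against a deployed model: send a TMSL createOrReplace command to the Analysis Services instance to (re)define a per-year partition. It assumes the SqlServer PowerShell module is installed; the server, database, table, data source, and query below are placeholders, not anything TX generates.

```powershell
# Minimal sketch: define or replace a per-year partition on a deployed
# tabular model. All names (server, database, table, data source) are
# placeholders to adapt to your environment.
Import-Module SqlServer

$year = (Get-Date).Year
$tmsl = @"
{
  "createOrReplace": {
    "object": {
      "database": "SalesModel",
      "table": "FactSales",
      "partition": "FactSales $year"
    },
    "partition": {
      "name": "FactSales $year",
      "source": {
        "query": "SELECT * FROM dbo.FactSales WHERE [Year] = $year",
        "dataSource": "DWH"
      }
    }
  }
}
"@

# Send the TMSL command to the tabular instance.
Invoke-ASCmd -Server "localhost\TABULAR" -Query $tmsl
```

The same pattern with a TMSL delete command could presumably be used to drop stale partitions after deployment.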

I think the ability to execute only certain partitions is going to be vital to larger deployments, where full execution of a tabular model could be prohibitively slow.  +1


This is greatly needed. I wish I could vote 1 million times. We have to do this outside of the DH tool.




This would be a good addition to the tool, so we can batch-load data into the semantic endpoints.


This idea has been submitted here.


In the meantime, a potential workaround is to implement partitioning via a PowerShell script; the ability to execute PowerShell scripts was released in version 6024.1.
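As a hedged sketch of that approach, the script below uses the SqlServer PowerShell module's Invoke-ASCmd cmdlet to send a TMSL refresh command that processes only the most recent partitions rather than the whole model. The server, database, table, and partition names are placeholders you would adapt, and the partition naming convention shown is an assumption.

```powershell
# Minimal sketch: process only the current and previous year partitions
# instead of full-loading the model. All names are placeholders, and
# "FactSales <year>" is an assumed partition naming convention.
Import-Module SqlServer

$years = ((Get-Date).Year - 1)..((Get-Date).Year)
$objects = ($years | ForEach-Object {
    "{ ""database"": ""SalesModel"", ""table"": ""FactSales"", ""partition"": ""FactSales $_"" }"
}) -join ", "

$tmsl = @"
{
  "refresh": {
    "type": "full",
    "objects": [ $objects ]
  }
}
"@

# Send the TMSL refresh to the tabular instance.
Invoke-ASCmd -Server "localhost\TABULAR" -Query $tmsl
```

A script along these lines could presumably be wired into an execution package via the PowerShell script execution released in 6024.1, so that only recent partitions are processed on each run.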
