I have created a semantic model in our development environment with a global database. In development the model is named Sales DEV, and in production it is named Sales.
When I make changes to the model and do a multiple environment transfer, the data in the production model is deleted, and the users cannot use the model until I have executed it in the production environment.
I expected an offline model to be created, which could be executed at a later time.
Instead, our users cannot access data for a period of time, which is unacceptable to our business.
How can I avoid this, so our users do not experience downtime on the production model?
Best answer by Christian Hauggaard
Did you stop the scheduler service in the destination environment? Doing so should prevent any jobs from starting, and the production model should remain intact until you are ready to deploy and execute.
Thank you for your answer.
I didn’t stop the scheduler earlier, but I have just tried it, and the PROD model is still overwritten if I make changes to the DEV model and do a multiple environment transfer :(
I would have expected an offline model to be created, exactly the same way an SSAS Multidimensional server works.
Thank you for your comment.
My knowledge of Power BI Pipelines is slim to none, and I would like to have full control of deployment and execution inside TimeXtender.
Could you show the settings you have for your Development endpoint? Are the server and database names the same for both?
When you use process offline, TX will make a new tabular database and swap them out once processing is done. This should minimize downtime for your users at the cost of extra memory consumption.
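For reference, the shadow-and-swap pattern described above can also be reproduced manually against SSAS. The processing step is a plain TMSL full refresh of the shadow copy; completing the swap by renaming the databases has to be done in SSMS or via AMO, since TMSL has no rename command. A minimal sketch of the refresh step, assuming a shadow database named Offline_Sales (the name is illustrative, following the Offline_&lt;endpoint name&gt; convention mentioned later in this thread):

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      { "database": "Offline_Sales" }
    ]
  }
}
```

While this refresh runs, the live database keeps serving queries; only the final rename swap causes a brief interruption, which is why this approach costs extra memory but almost no downtime.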
Thanks for your answer.
It is the same server, but two different database names.
My problem is not the processing, but the deployment.
When I make changes to my DEV database and do a multiple environment transfer, the PROD database is overwritten. Then I have to process the PROD database before my users can access it again.
When I do this on an SSAS database, the PROD database is not overwritten until the next process of the database. Is this not possible in a semantic model?
It is a bit annoying to have to make the deployment and processing after normal working hours.
I just tested on a local setup I have (Dev & Prod on one VM with SSAS on the same machine). When I do an environment transfer from Dev to Prod, there is no effect on the Prod SSAS tabular database. Only when I deploy the semantic model are the changes pushed, and they then require processing, i.e. a Deploy & Execute.
Are you only doing an Environment Transfer or also a Deploy through the Environment Transfer window?
I am doing both an environment transfer and a deployment.
If I do the same on SSAS cubes, a new database (tx_Salg) is created and the original database (Salg) is still up and running - no downtime for users.
If I have to execute the SSAS tabular model immediately at deployment, the database is down for about 15-20 minutes. Can this be avoided?
Sorry if my explanation is a bit blurry :)
I have been trying to get evidence of the offline database which should be named Offline_<endpoint name>. I added some extra data to my demo setup and can see that the offline tabular database is created and processed. Could it be that swapping them out also takes a long time (perhaps because your model is very large)?
No, the Offline_<endpoint name> database is not created when I deploy through multiple environment transfer as I would expect.
Instead, the PROD database is truncated.
The Offline_<endpoint name> database is only created when I process the database; therefore the downtime is equal to the processing time :(
It looks like the Deploy is indeed always destructive. When using Azure Analysis Services, there is also typically no memory to spare to allow for offline processing, even for normal reloads. I guess finding a "less bad" timeslot would then be required to reduce issues on the end-user side.
I am not sure whether the new release's Power BI Premium Capacity interfacing has the same behaviour. It might be worth posting this as an idea if no TX people chime in.
Thank you - at least I have not misunderstood it all :)
Which version of TimeXtender are you using?
Are you using SSAS (an on-prem tabular model) or Analysis Services in Azure?
I am using TimeXtender 18.104.22.168 and an on-premises SSAS tabular model.
OK, thanks for providing the info. Please see this video, which shows the current offline processing functionality, whereby a temporary tabular model is created upon execution.
The key to reducing downtime is to limit the time between deployment and execution.
You can transfer the project to another environment without affecting the existing tabular model. However, once you deploy the tabular model (without executing), it will not have data. Therefore, it is important to execute the tabular model as soon as possible after deployment.
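If scripting helps close that gap, the execution step can be kicked off immediately after deployment with a TMSL full refresh, run for example as an XMLA query in SSMS or as a SQL Server Agent job step of type "SQL Server Analysis Services Command". A minimal sketch, using the production model name Sales from this thread (how and where you schedule it is up to your setup):

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      { "database": "Sales" }
    ]
  }
}
```

This does not remove the downtime window itself; it just ensures processing starts the moment the deploy finishes, so the window is no longer than the 15-20 minute processing time.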
Can you please try reducing the time between deployment and execution after a promote?
OK - then I need to deploy and transfer the project at the same time (outside office hours).
The execution time is about 15-20 minutes, and it would be very annoying for our users if the data is not available during this period.
I do find it a little odd that you cannot make an offline deployment as with SSAS cubes.
Thank you - I will do that :)