Ask questions about TimeXtender Semantic Models Instances
- 44 Topics
- 199 Replies
Dear Support,

TimeXtender version: 188.8.131.52

In TimeXtender we have two project variables that we use in a data selection rule in a semantic model. In the attachment "project variables settings" you'll see the settings of the two variables. The dynamic variable script that it has to execute is:

SELECT CAST(MAX(LoadingDate) AS DATE) FROM FinFact.FacturatieControle

to get the highest date. The other script will get the lowest date. In the attachment "semantic model with data selection rule" you'll see the data selection rule in the semantic model. We want to filter the date dimension so that it only shows the dates which are in the fact table (FacturatieControle is the fact table).

The problem now is that the project variable won't update, even though we set the resolve type to 'Every time'. Only a deploy and execute of the model will update the date filter. Can you help me with this issue?

Best regards,
Christian Koeken
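For reference, the pair of dynamic variable scripts described above would look roughly like this; the MIN variant is an assumed mirror of the quoted MAX script, not taken verbatim from the post:

    -- Highest date in the fact table (script quoted in the post)
    SELECT CAST(MAX(LoadingDate) AS DATE) FROM FinFact.FacturatieControle;

    -- Lowest date: assumed counterpart using MIN
    SELECT CAST(MIN(LoadingDate) AS DATE) FROM FinFact.FacturatieControle;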
Shortly we will be implementing Tableau endpoints with TimeXtender. With respect to RLS, I read the following in the 'Semantic Model-Level and Row-Level Security' tutorial:

Is it on the roadmap for TX to implement RLS on Tableau endpoints as well? Maybe you can elaborate on the mechanism behind the .tds endpoints: do these need RLS like you can configure in Tableau (i.e. by configuring something on the semantic model in TX which is currently unavailable)? Or do they use a connection to the DWH (SQL db), thus using the permissions on tables related to the users of the dashboards?
Hi all,

What's best practice regarding organizing/reusing measures in SSLs? In my example the customer uses a Power BI endpoint and has created a set of measures used by users in Power BI. Let's say these are financial measures. A few questions regarding best practice have now popped up:

We want to reuse these measures in a different SSL. For example, we have an SSL for the financial department and one for the board of managers. We don't want to duplicate the measures, because that makes it hard to keep them up to date. Is there a way to "copy" measures from one SSL to another, or to reference measures from a different SSL?

We want to "categorize" the measures. Let's say we have financial measures as well as logistics measures. Once we get to a significant amount of measures, it's hard to keep track of them. Is there a way to prefix them, add categories or something similar? We had the idea of creating empty "measure" tables in the MDW with a separate "Measure" database schema. But this also c
Dear Support,

My customer is having some trouble executing a tabular semantic model. This customer has three semantic models: Finance, Sales and Logistics. Finance and Sales are running fine, but Logistics sometimes gives the error "The stream does not have an active operation!".

I solve this issue by doing a Full Process on the model in SQL Management Studio. But what I want to know is: what causes this error? As I said, Finance and Sales are running fine, but Logistics gives this error now and then. See the file in the attachment for the error.

TimeXtender version: 6346.1
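As a stopgap, the manual Full Process done in Management Studio can also be scripted so it does not have to be clicked through each time. A minimal TMSL refresh command is sketched below; the database name "Logistics" is assumed from the post:

    {
      "refresh": {
        "type": "full",
        "objects": [
          { "database": "Logistics" }
        ]
      }
    }

This can be run from an XMLA query window in Management Studio against the tabular instance.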
When designing a model, in TX we have two options:

- Deploy: deploys the whole model to the server, without data.
- Execute: reads the data from the DW and fills the model on the server.

After we have the model on the server, we can use external tools like Tabular Editor or Power BI in the service to add custom measures that are not possible in TX, for example calculation groups (see video from @fwagner):

Until here, all OK. The problem appears when you Execute the model again from TX, which deletes all the external changes. I think that this is an issue.
I've made a simple semantic model and configured an Analysis Services (Tabular) endpoint in Azure. There is one perspective containing a few fields and measures. The model deploys and executes without any issues.

However, the perspective is not visible when connecting to the model from Power BI Desktop. Also, when I connect to the model from Management Studio, I can only browse the whole cube and again the perspective is not there. I can script the AS database to a new query window and the perspective is not mentioned in the script.

Is this an issue with TimeXtender? Or an AAS config issue perhaps? I'm using a PPU license, but I don't think that's relevant since the perspective cannot be used in Management Studio either.

TimeXtender version is 184.108.40.206.

Thanks!
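One way to confirm whether the perspective exists in the deployed database at all is to query the standard tabular schema rowsets from a Management Studio query window connected to the AS instance (these DMVs are generic Analysis Services tabular metadata, nothing TimeXtender-specific is assumed):

    SELECT * FROM $SYSTEM.TMSCHEMA_PERSPECTIVES;
    SELECT * FROM $SYSTEM.TMSCHEMA_PERSPECTIVE_TABLES;

If the first query returns no rows, the perspective was never created server-side, which would point at the deployment step rather than at the client tools.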
Hello, I have built a data model in TimeXtender with a Power BI endpoint. In the Power BI report I see certain fields summarized by 'Sum'. In the SSL layer I only get the option to change the name, and I am missing the 'Summarize by' and the 'Sort by' options. What am I doing wrong?
Good Day,

I have set up a Perspective in TimeXtender (v220.127.116.11) and it includes 3 tables and a Tabular cube. When I go to deploy this perspective from DEV to QA, the tables are listed in the deploy tasks but not the cube.

I tried the following fix, but it did not resolve the issue:
- Remove Tabular cube from perspective
- Save TX Project
- Re-add Tabular cube to Perspective
- Save TX Project
- Deploy TX Project Multi Environment Transfer

Has anyone ever run into this issue before? I am going to try to add a new perspective to see if this resolves the issue, but it worries me for promotes going forward that objects may be missing on my deployments/promotes.

Kerry
Hello, I use the Semantic layer as an endpoint to analyze the data through Microsoft Excel. In my model I have a Calendar table (auto-generated by TX) which has dates from 2015 up to 2030. The fact in my dataset has values only up to year 2026.

When I open the Analysis Services endpoint through Excel and want to filter values based on Calendar, I can see an empty row in the filter. I assume this is because the Calendar dimension has values up until 2030, while the fact only goes up until 2026:

Is there any way to not show empty values coming from dimensions (that do not have matches within the fact) in Analysis Services from the semantic layer? Thanks!
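One common approach, sketched here under the assumption that a data selection rule (or equivalent filter) can be applied to the Calendar table — the table and column names below are hypothetical — is to restrict the date dimension to the range actually covered by the fact:

    -- Hypothetical filter on the Calendar table: keep only dates
    -- up to the latest date present in the fact table.
    [DW_Date] <= (SELECT MAX([TransactionDate]) FROM [dbo].[Fact])

This removes the trailing 2027–2030 members from the dimension, so Excel no longer offers them (or the blank member they produce) in the filter list.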
Hi Community,

After creating a simple semantic model (one table) and deploying it to Power BI Premium, I get an error for the connection to the source database. It says it cannot connect to the source database because of the credentials (we use SQL Authentication on our Azure SQL database in this case). It would be nice if we could configure that from within TimeXtender (and perhaps there is a way?).

After manually fixing this authentication issue, we are able to execute this endpoint. This is confirmed by checking the Refresh History. Now we try to add the created (and tested) execution package to a job. That is not possible, because the execution package is greyed out in the Job and therefore not able to be added.

What could be the reason why we are not able to add Semantic Model execution packages to a Job? Thanks in advance.

Kind regards,
Eric
Hi,

We have different instances for the development, test and production environments at a customer, both MDW and SSL. After copying the instances from one environment to another, it would be nice if you could easily switch the source for the target SSL to the MDW in the same environment. Under Synchronize with remapping, one must change the instance, data area, schema and table for each table in the model. It would have been smoother if you could switch instances for the entire model with one change. The easiest would have been if you could choose whether these settings in an SSL should be overwritten or not when copying.

BR,
Anders
Dear Support,

I'm having a problem with adding a new column from the MDW area to the semantic model (tabular). I synced the model and I see the new column in Data Movement on the right side, but when I try to add the new column to the table it belongs to in the semantic model, I don't get a 'plus' symbol. It is a column I want to add to a fact table.

I tried to reboot the server, but that didn't solve it. I opened another semantic model and tried to add a new column to a table there, and that worked. But for some reason it won't work in this model. Can you help me?

TimeXtender version 6143.1
In this screen:

It's not possible to delete a member. The only way is to delete the "RLS Setup" and create it again, which is crazy. Another crazy situation is that if you define a Dynamic RLS Setup, press OK and then go back in to see it, it's lost...
Hi, I have an SSAS tabular model that is deployed to Azure. The model executes fine most of the time and takes around 3 minutes to complete, but sometimes it takes a very long time to finish (running for 12 hours but still not completed, so I killed the process instead). Is there any way to fail the execution if it takes over 1 hour? I cannot see any timeout setting for SSAS tabular. I am using version 18.104.22.168. Thanks.
Hi team,

TimeXtender allows adding parameters from a different table to a custom field in a semantic data model (Qlik). The resulting syntax/Qlik script combination is always broken. When adding a custom field parameter from a different table, TimeXtender fully qualifies the Qlik syntax regardless of the settings. The resulting syntax on the Qlik side will no longer match the syntax in the views created by TimeXtender:

Qualified setting:

Fully qualified setting:

The resulting Qlik script:

    "Sales_Targets":
    LOAD
        "KPI",
        "Target",
        "DIM_Boekdatum.DayName" AS "Test";
    SQL SELECT
        "KPI",
        "Target"
    FROM "Test"."dbo"."Test QVD_SLQV";

But the view has the following syntax:

    CREATE VIEW [dbo].[Test QVD_SLQV]
    -- Copyright 2011 timeXtender a/s
    -- All rights reserved
    --
    -- This code is made available exclusively as an integral part of
    -- timeXtender. You may not make any other use of it and
    -- you may not redistribute it without the written permission of
    -- timeXtender a/s.
    AS
    SELECT
        [KPI] AS [KPI],
        [Target] AS [Targ
We are facing an issue with one of our clients. We built out a UAT environment to promote proper development, but we have a fully built TX Semantic Layer on the production environment while it is not built out on the TX UAT environment. The reason for this was that the test Qlik Sense environment was not available initially and we could not successfully execute packages on the TX UAT environment, so we decided to work without it on TX UAT. Now that we have an operational test Qlik Sense environment, we want to export the semantic layer from TX production to the TX UAT environment. Is there a way for us to do this in TimeXtender? We are trying to avoid a large amount of manual work to build this out on the TX UAT environment, and we don't want to export the project from Prod to UAT, as that would override all our work in the UAT environment.

Prod TX Semantic Layer
UAT TX Semantic Layer
When creating a numeric field in an MDW table, TX suggests using numeric(38,6) for the data type. When the field is translated to the semantic model, it is detected in TX:

But in the model:

And one of the best practice rules says "Do not use floating point data types". How can I follow this rule?
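If the wide numeric(38,6) source type ends up mapped to a floating point in the tabular model, one hedged workaround is to cast the field to a narrower fixed-point type in the data warehouse before it reaches the semantic model; tabular's fixed decimal ("Currency") type corresponds to decimal(19,4). The column name below is hypothetical:

    -- Cast to a precision/scale that tabular models can map to a fixed decimal
    CAST([Amount] AS DECIMAL(19, 4)) AS [Amount]

Whether TX then detects the field as a fixed decimal rather than a double depends on the endpoint's type mapping, so this is a sketch to try, not a guaranteed fix.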
Dear Reader,

Dynamic Row Level Security (DRLS) can be implemented by defining a table that contains 1. the values for the column to be secured and 2. the users' email addresses. In this specific case, two DRLS columns are defined, as depicted in the first picture below: the first column (_Retailer_Key). The second column to be secured is also dynamically defined (_CountryAccess_Key):

This works. However, suppose I have 4 country/retailer combinations:

Country  Retailer
UK       YourOwn
NL       YourOwn
UK       Theirs
GE       Unsere

If the _CountryAccess_Key defined by DRLS is "UK" and the Retailer is YourOwn, then the rows in red font are returned. This is just (_CountryAccess_Key OR _Retailer_Key). However, that's not the result required. The result should be (_CountryAccess_Key AND _Retailer_Key), as depicted below. Now the result only returns one row (depicted in red font):

Country  Retailer
UK       YourOwn
NL       YourOwn
UK       Theirs
GE       Unsere
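To make the OR-versus-AND difference concrete, here is a hedged SQL sketch of the two behaviours over the four example rows (the Fact table and column names are hypothetical; the key values are taken from the post):

    -- Observed OR behaviour: two independent security filters,
    -- a row passes if it matches either key.
    SELECT Country, Retailer
    FROM Fact
    WHERE Country = 'UK'          -- _CountryAccess_Key filter
       OR Retailer = 'YourOwn';   -- _Retailer_Key filter  (3 of the 4 rows)

    -- Desired AND behaviour: both conditions must hold on the same row.
    SELECT Country, Retailer
    FROM Fact
    WHERE Country = 'UK'
      AND Retailer = 'YourOwn';   -- only UK / YourOwn (1 row)

The question, in effect, is whether the two DRLS columns can be combined into a single filter expression with AND semantics instead of being applied as independent filters.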
Hi,

I have created a semantic model in our development environment with a global database. In development the model is named Sales DEV and in production it is named Sales. When I make changes to the model and do a multiple environment transfer, the data in the production model is deleted and the users cannot use the model until I have executed the model in the production environment. I expected the creation of an offline model, which could be executed at a later time. Instead our users cannot access data for a period of time, which is unacceptable to our business.

How can I avoid this, so our users do not experience downtime on the production model?

Kind regards,
Rasmus Høholt
We are using multiple semantic layers, and will be adding more in the future. All semantic layers will probably need a calendar dimension, and some other dimensions are also shared between layers. Is there any 10x method for copying between semantic layers? It seems quite ineffective to set up:

- Sort by
- Summarize by
- Category

n times on the same table.
Hi,

The current data warehouse has tables with two different DB schemas. There are already Power BI dashboards using these schemas. Now I want to use the Semantic layer to copy the data (Data Export) to a SQL serverless DB. Everything works fine, but I want to override the schema name. In the Semantic layer I have not found any option to change this. Is that correct, or is there a way to change the schema like in the data warehouse layer?
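If the Data Export itself cannot override the schema, one hedged workaround on the target database is a thin view in the desired schema on top of the exported table, so the dashboards keep their expected names (all object names below are hypothetical):

    CREATE SCHEMA [reporting];
    GO
    CREATE VIEW [reporting].[Sales] AS
    SELECT * FROM [dbo].[Sales];

This adds one object per table to maintain, so a native schema-override option in the Semantic layer would clearly be preferable.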
TX: 20.10.39
SQL Server: 2019

Endpoint settings:
- Windows authentication
- hostname matching case
- https connection over the default proxy

I am trying to push a Qlik Sense Enterprise endpoint from TimeXtender and having trouble getting the deploy to run successfully. Testing the connection to Qlik Sense is successful; when I Deploy, I get an error when TX tries to create a new Data Connection. I see this issue pop up sometimes in other deployments and can usually overcome it by manually making a Data Connection to the MDW database and giving it the same name TX would use: MDW_SLQV. In this case, I get the same error when TX tries to modify the connection. TimeXtender does create the app (and is owner of it) and can successfully reload the app; it just doesn't insert the script. If I copy-paste the TX Qlik Script into the app, it works fine. As neither TX nor Qlik actually log what is going on, I cannot see what about the connection string is causing the error.