Hi! I am using TimeXtender version 6436.1 and TimeXtender Dynamics 365 Business Central - Online version 18.0.0. When reading data from Business Central into TimeXtender, I get the error "Sequence contains no matching element" on certain tables. This is what the error looks like on the Sales Header table, for example:

Executing table [Sales Header]: failed with error:
System.InvalidOperationException: Sequence contains no matching element
   at System.Linq.Enumerable.First[TSource](IEnumerable`1 source, Func`2 predicate)
   at TimeXtender.ODX.Account.BC365CustomDataSource.ExecutionHelper.GetIncrementalLoadPartitionSetups(AccountBC365CustomDataSource accountBC365CustomDataSource, TableModel tableModel)
   at TimeXtender.ODX.Account.BC365CustomDataSource.AccountBC365CustomDataSource.GetIncrementalLoadSetups(TableModel tableModel)
   at DataSourceEngine.Custom.CustomSourceEngine.GetIncrementalLoadSetups(TableModel tableModel)
   at DataStorageEngine.SQL.SQLStorageEngine.<>c__DisplayClass46
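For context on what this kind of error means: the trace ends in Enumerable.First with a predicate, which throws "Sequence contains no matching element" when nothing in the collection matches — here, apparently, no incremental-load partition setup matches the table. As a rough analogy only (this is not TimeXtender's actual code), Python's next() without a default fails the same way:

```python
# Hypothetical illustration: suppose no partition setup matches [Sales Header].
partition_setups = []

try:
    # next() with no default mirrors C#'s Enumerable.First(predicate):
    # it raises when no element satisfies the condition.
    setup = next(s for s in partition_setups if s["table"] == "Sales Header")
except StopIteration:
    # The C# equivalent raises InvalidOperationException:
    # "Sequence contains no matching element"
    setup = None

print(setup)
```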
Hi all,

I recently created a bunch of SQL logins in TimeXtender to use in the Database Roles we set up. After I did that, I found out I had made a mistake, and I removed those SQL logins from the database. However, when I try to edit a database role and add a login, I still see the SQL logins I removed. To make sure there is no caching or something similar involved, I waited a few days before checking if they were still there.

Does someone have a solution to remove those logins from the list? I don't want to create new names for my SQL logins just to avoid having the same name as the previous ones. Although, if that is the only option, I am willing to accept that as well.

Kind regards,
Carlo
I am trying to set up access to some tables in the ODX. When I try to add a role and search for an Azure AD user, I get the error:

Service request failed: Code: Authorization_RequestDenied ...
Module: TimeXtender.ODX.Engine
TimeXtender.ODX.Engine.ODXFaultException
   at TimeXtender.ODX.Engine.ODXEngine.SendServiceRequest[C,T](WcfServerSettings serverSettings, Func`3 func)
   at TimeXtender.DataManager.AddODXSecurityRoleWizard_MemberSelectStep.<>c__DisplayClass11_0.<SearchClicked>b__0()
   at TimeXtender.DataManager.ConnectingThread.ExecuteConnectingThread(Object dummy)

Service request failed: Code: Authorization_RequestDenied ...
Module: timeXtender
TXModelInterface.ExceptionWrapperException
   at TimeXtender.DataManager.ConnectingThread.HandleError()
   at TimeXtender.DataManager.ConnectingThread.Execute(String title, Int32 progressSteps, List`1 actions)
   at TimeXtender.DataManager.AddODXSecurityRoleWizard_MemberSelectStep.SearchClicked(Object sender, EventArgs e)

This setup is a V
Hi all,

What's best practice regarding organizing/reusing measures in SSLs? In my example, the customer uses a Power BI endpoint and has created a set of measures used by users in Power BI. Let's say these are financial measures. A few questions regarding best practice have now popped up:

We want to reuse these measures in a different SSL. For example, we have one SSL for the financial department and one for the board of managers. We don't want to duplicate the measures, because that makes it hard to keep them up to date. Is there a way to "copy" measures from one SSL to another, or to reference measures from a different SSL?

We want to "categorize" the measures. Let's say we have financial measures as well as logistics measures. Once we get to a significant amount of measures, it's hard to keep track of them. Is there a way to prefix them, add categories, or something similar?

We had the idea of creating empty "measure" tables in the MDW with a separate "Measure" database schema. But this also c
This is a CData data source; see the following link for more on what that is: Add a CData data source

Contents:
- CSV Information
- Setup of the provider
- Connecting to one file
- Connecting to more than one file
- Connect to more than one file across sub-directories
- Connect to CSV files on a FTP/SFTP server
- Connect to files on Azure Storage
  - Connect using Storage Account and Access key
  - Data Lake Gen 2 ABFSS
  - AzureBlob
  - AzureFile
  - File Share
- Troubleshooting
  - The file is not using the default delimiter settings
  - Aggregate multiple files into one
  - Select and aggregate files with similar names across folders
  - Controlling the names of the fields
  - Troubleshooting data types
  - Dates are shown in the wrong timezone
  - Convert the character set
  - The files were made with different culture settings

CSV Information

Here is a link to the file I use in this example: http://mysafeinfo.com/api/data?list=moviesbyearnings2005&format=csv

It is set up with elements like this:

Year,Rank,Title,Gross $,Month Opened,Theatres
2005,1,St
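To make the file layout concrete, here is a minimal stdlib sketch that parses a file with this header. The data row is made up for illustration (the article's sample row is truncated above); only the header comes from the example file:

```python
import csv
import io

# Header taken from the example file; the data row below is fabricated.
sample = (
    "Year,Rank,Title,Gross $,Month Opened,Theatres\n"
    "2005,1,Example Movie,100000000,May,3000\n"
)

# DictReader maps each row to the comma-delimited header fields.
rows = list(csv.DictReader(io.StringIO(sample)))
print(rows[0]["Title"])   # Example Movie
print(rows[0]["Gross $"]) # 100000000
```

If a file uses a non-default delimiter (one of the troubleshooting topics above), `csv.DictReader(f, delimiter=";")` is the stdlib equivalent of changing the provider's delimiter setting.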
This article clarifies two different methods of adding a SharePoint site as a data source in the TimeXtender Portal.

Method 1: Basic Authentication

This method specifies a user account that does not have multi-factor authentication (MFA) enabled and has been granted access to the SharePoint site. The configuration in the TimeXtender Portal for the SharePoint data source is as follows:

Provider: Microsoft SharePoint
Auth Scheme: Basic
Share Point Edition: SharePoint Online
URL: <SharePoint Site URL>, e.g. https://contoso.sharepoint.com/sites/PBI
User: <username> of a user that has been granted access to the site.
Password: <password> for the specified user.

Method 2: OAuth

This method uses an Azure App Registration and OAuth to connect to the specified SharePoint site. In the TimeXtender Portal, create a new data source and set the provider to "Microsoft SharePoint".

Auth Scheme: "SharePointOAuth"
Share Point Edition: SharePoint Online
URL: <SharePoint Site URL>, e.g. h
This article clarifies the concept of role-playing dimensions and explains how to implement them in TimeXtender.

Contents:
- What is a Role-Playing Dimension?
- Example
- Using Semantic Models
- Instructions
- Video Tutorial

What is a Role-Playing Dimension?

Role-playing dimensions are a useful approach that can streamline your data warehouse by reusing the same dimension table in different contexts, depending on how it relates to the fact table. When implemented properly, role-playing dimensions create consistency, improve query performance, simplify maintenance, and ultimately enhance the depth and breadth of business insights derived from your data warehouse.

Example

To illustrate this concept, consider the following two tables:

Fact Table: SalesTransactions
OrderID  ProductID  Quantity  OrderDate🗝️  ShipDate🗝️
001      101        2         2023-10-15    2023-10-16
002      103        1         2023-10-16    2023-10-17
003      108        5         2023-10-17    2023-10-17

Dimension Table: Date
DateValue🗝️  Day  Month  Year
2023-10-1
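The two 🗝️ date columns in the fact table both point at the same Date dimension. As a minimal, tool-agnostic sketch (plain Python dicts, not TimeXtender's implementation), the same dimension rows serve both roles — "Order Date" and "Ship Date" — via the two foreign keys:

```python
# One Date dimension, keyed by DateValue.
date_dim = {
    "2023-10-15": {"Day": 15, "Month": 10, "Year": 2023},
    "2023-10-16": {"Day": 16, "Month": 10, "Year": 2023},
    "2023-10-17": {"Day": 17, "Month": 10, "Year": 2023},
}

# One fact row with two date foreign keys.
fact = {"OrderID": "001", "OrderDate": "2023-10-15", "ShipDate": "2023-10-16"}

# The single dimension plays two roles depending on which key joins it:
order_date = date_dim[fact["OrderDate"]]  # role: Order Date
ship_date = date_dim[fact["ShipDate"]]    # role: Ship Date

print(order_date["Day"], ship_date["Day"])  # 15 16
```

In SQL terms this is the same dimension table joined twice under different aliases; the dimension data exists once, so it stays consistent across both roles.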
Hi,

Has anyone used a Power BI dataset as a data source? I have tried using the CData SSAS connector but cannot get it to work. Any advice would be appreciated.

Best,
//Pontus
Exciting News!

We are delighted to share two major updates with you:

TimeXtender Version 6429.1 Release: We're thrilled to announce the release of TimeXtender version 6429.1! This significant update is designed to enhance data integration and management. Explore the details here to learn more about the improvements.

New Features for Snowflake Workflows: In addition to the overall release, we've introduced exciting features tailored for users leveraging Snowflake for data warehousing. Check out the specifics here, focusing on both efficiency and functionality to accelerate data workflows within the Snowflake environment.

As TimeXtender continues to pioneer advancements in data management, our latest release, TimeXtender 6429.1, and the seamless integration with Snowflake underscore our unwavering dedication to innovation.

We invite you to delve into the enhanced features, designed with precision for optimal efficiency, agility, and expanded capabilities. Your exploration into this transform
Hi,

I am trying to connect to a REST API that uses OAuth2 authentication. The API documentation is very clear on what to include in the authentication. I have entered all this information in the REST CData provider, and I can successfully use the provider's "Authorize OAuth" button.

However, when I try to test the connection, the error "401 - Unauthorized" is returned.

In the API documentation, they state that the token needs the prefix "Bearer ". I suspect my error is caused by the REST CData provider sending the authorization header without the Bearer prefix, like so:

Authorization: {ACCESS_TOKEN}

Which the API of course does not accept. Is there any way to include a prefix to the token in the authorization header?
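For anyone unfamiliar with the distinction being described: per the OAuth2 bearer token convention, the scheme name "Bearer" must precede the token in the Authorization header. A tiny sketch of the two header shapes (the token value is a placeholder, and this only illustrates the header format, not the CData provider's internals):

```python
ACCESS_TOKEN = "abc123"  # placeholder token for illustration only

# What the provider appears to send - typically rejected with 401:
rejected = {"Authorization": ACCESS_TOKEN}

# What the bearer-token convention (RFC 6750) expects - note the
# scheme name and the space before the token:
accepted = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

print(accepted["Authorization"])  # Bearer abc123
```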
Hi, lately we are getting a lot of 503 exceptions. The error message is typically:

The task '983693' failed:
System.AggregateException: One or more errors occurred. ---> System.AggregateException: One or more errors occurred. ---> System.Net.Http.HttpRequestException: Response status code does not indicate success: 503 (The server is busy.).
   at System.Net.Http.HttpResponseMessage.EnsureSuccessStatusCode()
   at Azure.Management.REST.RestWebRequestManager.<SendRequest>d__6.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Azure.Management.REST.RestWebRequestManager.<SendRequest>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerSe
We have data sources where we apply dynamic incremental rules, e.g. if column == "Name", then apply incremental load with a 3-day offset. Some tables in the data source are currently empty (in the actual source). They do have the incremental column and therefore the offset rule. We noticed that the ODX transfer on these tables works fine. However, when we execute a table that has such an empty table in its mapping, we get this error:

System.AggregateException: One or more errors occurred. ---> System.AggregateException: One or more errors occurred. ---> System.InvalidOperationException: Sequence contains no elements
   at System.Linq.Enumerable.Max(IEnumerable`1 source)
   at DataStorageEngine.DataLakeGen2.DataLakeGen2DiscoveryHubExecution.<ProvideDataTableForDiscoveryHubSQLTransferAsync>d__14.MoveNext()
   --- End of inner exception stack trace ---
   at System.Threading.Tasks.Task.WaitAll(Task[] tasks, Int32 millisecondsTimeout, CancellationToken cancellationToken)
   at Dat
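For context on why an empty source table triggers this: the trace ends in Enumerable.Max, which throws "Sequence contains no elements" when asked for the maximum of an empty sequence — plausibly the incremental column's high-water mark here. A rough Python analogy (not TimeXtender's code) of the failure and the guarded alternative:

```python
# Hypothetical: the source table is empty, so the incremental
# column yields no values to take a maximum over.
incremental_values = []

# max() on an empty iterable fails, just like Enumerable.Max in the trace:
try:
    high_water_mark = max(incremental_values)
except ValueError:
    # C#'s equivalent raises InvalidOperationException:
    # "Sequence contains no elements"
    pass

# A guarded call with a default sidesteps the exception:
high_water_mark = max(incremental_values, default=None)
print(high_water_mark)  # None
```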
Hi,

Has anyone worked with the API from Energinet? I need some help extracting data from Energinet, which delivers data via multiple endpoints. It starts with https://www.energinet.net/api/unit, which gives me this:

[
  {
    "unit_id": "1234ABCDfolder",
    "name": "My Company",
    "links": {
      "info": { "verb": "GET", "href": "/api/unitinfo/1234ABCDfolder" },
      "drilldown": { "verb": "GET", "href": "/api/unit/1234ABCDfolder" }
    },
    "datasources": []
  }
]

Now, I have to drill down to get some info: https://www.energinet.net/api/unit/1234ACDfolder

[
  {
    "unit_id": "1Afolder",
    "name": "Company1",
    "links": {
      "info": { "verb": "GET", "href": "/api/unitinfo/1Afolder" },
      "drilldown": { "verb": "GET", "href": "/api/unit/1Afolder" }
    },
    "dat
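Outside of any specific connector, the drill-down pattern above can be sketched in plain Python: parse the first response and build the next round of URLs from the relative "drilldown" hrefs. This uses only the sample payload from the post (no live HTTP call, and the base URL is taken from the post's example):

```python
import json

# First response from /api/unit, exactly as shown in the post.
units = json.loads("""
[
  {
    "unit_id": "1234ABCDfolder",
    "name": "My Company",
    "links": {
      "info": {"verb": "GET", "href": "/api/unitinfo/1234ABCDfolder"},
      "drilldown": {"verb": "GET", "href": "/api/unit/1234ABCDfolder"}
    },
    "datasources": []
  }
]
""")

BASE = "https://www.energinet.net"  # base URL from the post's example

# Each unit's relative drill-down href becomes the next URL to request.
drilldown_urls = [BASE + u["links"]["drilldown"]["href"] for u in units]
print(drilldown_urls)
```

Repeating this on each drill-down response walks the whole hierarchy; in a connector that supports pagination/child requests, the equivalent is configuring the follow-up call from the "links.drilldown.href" field.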
Dear Support,

My customer is having some trouble executing a tabular semantic model. This customer has three semantic models: Finance, Sales, and Logistics. Finance and Sales are running fine, but Logistics sometimes gives the error "The stream does not have an active operation!".

I solve this issue by doing a Full Process on the model in SQL Management Studio. But what I want to know is: what causes this error? As I said, Finance and Sales are running fine, but Logistics gives this error now and then.

See the file in the attachment for the error.

TimeXtender version: 6346.1