We are currently using TX to stage and merge multiple source systems to create a master list of items, customers, vendors, Open AR, Open AP, etc. Currently we use the Data Export on the Semantic Layer to dump entity data so that it can be read in by D365 as delimited text files. Does anybody know if there is a way to connect directly to D365, so that TX serves as a source for a D365 system? Thanks, Greg Laymon
Due to the cost of servers, it would be beneficial to have both a development and a production environment on the same server. Is this possible right now? The manual doesn't explicitly mention it. If it is possible to run both environments on the same server, do they need different SQL Server instances, or can they use the same instance?
Is it possible to execute a package from the command line of TX Remote Control?
I am trying to replicate some code from SQL and need to create a custom table insert. My SQL query uses a temp table, #ELL. When I try to deploy and execute, it does not understand the temp table. Can temp tables (i.e. #ELL) be used here? I just need it to hold some info that, later in the query, will populate the new custom table in TimeXtender.
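One workaround often suggested for this kind of error: if the tool runs each statement in its own batch or session, a local temp table created in one statement may no longer exist in the next. Rewriting the logic as a single statement with a CTE avoids the temp table entirely. A minimal sketch, assuming hypothetical table and column names (not the poster's actual schema):

```sql
-- Sketch: replace the #ELL temp table with a CTE so the whole
-- insert runs as one statement (names here are hypothetical).
WITH ELL AS (
    SELECT SourceKey, SomeValue
    FROM dbo.SourceTable
    WHERE SomeFilter = 1
)
INSERT INTO dbo.CustomTable (SourceKey, SomeValue)
SELECT SourceKey, SomeValue
FROM ELL;
```

If the intermediate result really must persist across statements, a regular staging table in a dedicated schema (created and dropped by the script) behaves more predictably than a session-scoped #temp table.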
I know you can lock a project from editing, but is there a way to password-protect a project we install on a client's server so they can't view how we built it? We've spent hundreds of hours building an ERP integration that we want to sell multiple times, and we don't want a client or competitor to simply rebuild it by exporting it, or by looking at the project design and mimicking it.
We had a dev deploy but not execute a table before he left for the day. Someone else loaded the table, and when he closed the project, no one else could get in. We imported a day-old backup and were able to reload all of the tables. We then started creating some new dimensions, and when we tried to sync to the source system (Epic Clarity) SQL Server database, we got the above error. Thanks in advance for any help. Full message below:

Object reference not set to an instance of an object.
Module: timeXtender
System.NullReferenceException
   at TimeXtender.DataManager.BulkedInformationSchema.ReadSchema(Guid datasourceId, String overrideSql, Boolean deleteExistingRecords)
   at TimeXtender.DataManager.DataSource.BulkTransferADO(Boolean readForeignKeys)
   at TimeXtender.DataManager.DataSource.RefreshInformationSchema()
   at TimeXtender.DataManager.ReadDataSourceObjectsCommand.<>c__DisplayClass6_0.<ExecuteCommand>b__0()
   at TimeXtender.DataManager.ConnectingThread.ExecuteConnectingThread(O
Hi, we need to identify the objects modified in the last few days, to deploy as part of a differential deployment to various environments. To do this, we were looking at the Discovery Hub metadata database, which contains tables like DataTables, DataColumns, Transformations, etc. To identify the delta, there are two columns, ValidFrom and ValidTo, which are stored as integers. We need a way to identify the data, either by converting these to dates or based on a range of dates. Please help.
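One possible direction (an assumption to verify against your repository, not a confirmed schema): if the ValidFrom/ValidTo integers are version numbers rather than encoded dates, they can often be joined to a versions table in the same repository that carries a real timestamp. A hedged sketch with hypothetical table and column names:

```sql
-- Hypothetical sketch: map integer ValidFrom version numbers to dates
-- via an assumed Versions table. Verify the actual table/column names
-- in your Discovery Hub repository before relying on this.
SELECT t.Name,
       vf.CreateTime AS ValidFromDate
FROM dbo.DataTables AS t
JOIN dbo.Versions   AS vf
  ON vf.Version = t.ValidFrom
WHERE vf.CreateTime >= DATEADD(DAY, -7, GETDATE());  -- changed in last 7 days
```

If no such versions table exists, the next step would be profiling the ValidFrom values themselves to see whether they are monotonically increasing version counters or some date encoding.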
I have a table and I need a counter that restarts based on another field on the table. In my case I have a JournalBatchNumber field. For each new batch, I need to generate a LineNumber restarting at 1 for each record of the same batch. Is this available in TX natively, or do I need to write a post-processing script using ROW_NUMBER()? Thanks, Greg
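The post-processing approach mentioned above can be sketched with a partitioned ROW_NUMBER(): partitioning by the batch field restarts the counter at 1 for each batch. The table and ordering column below are hypothetical stand-ins:

```sql
-- Sketch: LineNumber restarts at 1 for every JournalBatchNumber.
-- The ORDER BY column (e.g. an identity or load timestamp) is an
-- assumption -- use whatever defines line order within a batch.
SELECT JournalBatchNumber,
       ROW_NUMBER() OVER (
           PARTITION BY JournalBatchNumber
           ORDER BY SomeOrderingColumn
       ) AS LineNumber
FROM dbo.JournalLines;
```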
Hi team, is it possible, using the TX Oracle connector, to export data to a materialized view in Oracle instead of a table?
Hi, our DH setup consists of a multi-layered data warehouse where the BI client (Tableau) only interfaces with the upper tier: the MDW DWH. Since we have a semantic layer per dashboard (we have quite a few), we would like to move from the single core TX project with multiple semantic layers to a project per dashboard (semantic layer). That way, each developer works on his own project without "stepping on each other's toes" (I'm aware of the collaboration features of TX), and our core project stays unharmed. So the challenge is how to reuse objects created in the core project (preferably views, to reduce space/redundancies) in an entirely new project (same server/database). My approach was to create a new project, add the MDW data warehouse, and then use an External SQL Connection to connect to the same DWH (transfer-type connection). I've also created a separate database schema with valid behavior. The problem we ran into was that the designated table was the same as a similar table in the database
When setting up the global database for the DWH, I get this error: "Databricks direct write requires SQL Server Authentication". However, my environment properties are straightforward and I can't find anything Databricks-related in the settings. It's a recent (2019) version. Does anyone know what this error could allude to?
What is the right scenario for using single quotes to wrap project variables? Let's say I have a data constraint added to a field and I am using a dynamic project variable, like so. In this scenario, I am comparing a date to another date, and I have the variable encased in single quotes. I always assumed the single quotes were how Discovery Hub "renders" the underlying variable, but I recently found that my assumption was untrue. Are single quotes only to be used when evaluating a character string, or are there other rules for wrapping in single quotes? Note: the variable above is a dynamic select max({datefield}) from {table} statement, under the covers.
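A general SQL rule of thumb that seems to apply here (hedged — the variable names below are invented for illustration): the quotes are not part of how the variable is rendered; the variable is substituted as plain text, so quotes are needed exactly when the substituted value must form a string or date literal in the final SQL, and must be omitted for numeric values:

```sql
-- If {MaxDate} renders to 2024-01-31, quoting produces a valid date literal:
[PostingDate] <= '{MaxDate}'    -- rendered: [PostingDate] <= '2024-01-31'

-- A numeric variable should not be quoted, or the comparison
-- becomes string-based and may convert implicitly:
[Amount] > {Threshold}          -- rendered: [Amount] > 1000
```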
We are providing clients with a preconfigured data warehouse for an ERP system, and we will continue to make enhancements and improvements. The client will eventually add additional fields and user-defined fields for their own needs. What's the easiest way to merge our improvements into the changes they have already made to an older version?
Hi, to configure the multiple-environment setup I used the PDF (https://legacysupport.timextender.com/hc/en-us/articles/360001136526-How-to-configure-Multiple-Environments), and everything works well when all the firewalls are turned off. We have opened ports 10001 and 5001, but when the firewall is turned on, my DEV environment can't connect to the Production environment. My setup is a Development server and a Production server. When I connect from Development to Production, I see the servers negotiating on ports 10001 and 5001, but after that they continue on a port in the 64xxx range (examples: 64135, 64177, 64281), i.e. a random port. My firewall won't accept this, and I would like to know if it is possible to work with a standard range/port so we can configure our firewall accordingly. Kind regards, Rogier
Hi, in my scenario I rely on a stored procedure in the source (external) database to create the table which will be used in TX. The stored procedure may have various table/view/function dependencies. Is it possible to import stored procedures/scalar functions into the ODX layer? I know it's possible to recreate an (internal) stored procedure, but I have no access to the underlying script of the stored procedure. Kind regards, Dror
Hi, I know it's a silly question, but I can't figure out how to convert a measure value such as 4.97 into a whole number, i.e. 5.
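If the rounding should happen in the data rather than only in the display format, standard T-SQL ROUND covers it (a generic sketch, not TX-specific):

```sql
-- Round to the nearest whole number: 4.97 -> 5
SELECT ROUND(4.97, 0);               -- 5.00 (decimal scale is preserved)
SELECT CAST(ROUND(4.97, 0) AS INT);  -- 5
```

If the value only needs to *display* as a whole number, a format string on the measure (e.g. "#,##0" in an OLAP cube's FormatString property) may be enough without changing the stored value.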
Just after some advice about whether something is possible. I have developed a data warehouse that encompasses all products. However, due to the nature of the business, I want to create two separate cubes that each report on only a selection of the products. I am trying to avoid extra security roles in Targit (which is used to browse the cubes) or extra SQL tables, both of which I believe would achieve the result but with more to manage. Is it possible to add a filter to a cube so that it only includes some dimension members?
Hi, we are trying to read a CSV file and truncate some columns, because our table would be too large otherwise. Unfortunately, deploy and execute fails with this error: "Data conversion failed. The data conversion for column "name" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page."." We are pretty sure it isn't an encoding problem, because when we increase the column size ("Length") in "Edit multiple text files" -> "Columns" we can read the data just fine, so the error must occur because of the truncation. Does anybody know which options we can use to let TimeXtender know that we are OK with the truncation of the data in these columns? Thanks, David
Hello fellow TimeXtender users, we were wondering: is it worth upgrading from SQL Server 2014 to 2016 in the scenario where we use TX traditionally: stage - DW - cubes? Has anyone done it, and have you seen performance improvements? Thanks! Wim CALM - Co