TimeXtender Desktop Q&A
Ask questions and find answers about the TimeXtender Desktop Application
Hi, I am trying to find out how I can schedule deployment from the development server to the integration server at a particular time every day, so that whatever changes I make in development are deployed automatically to the integration server. Please let me know if anyone has an idea about this; happy to clarify if needed.
We started to use the "continue without data in case a transfer fails" configuration option. The feature works fine, and we also see the message that one or more data sources failed. My question is: how can we easily see which data sources failed, and with what error message?
I get the following error after transferring the latest version from OT to the A (acceptance) environment and running a full managed deployment:

Finished executing project 'RobidusDWH' Execution Package 'Laden ACC'
Execution failed
Start Time: 4/3/2020 7:16:52 AM
End Time: 4/3/2020 7:25:01 AM on server: WIN6682
-Execute Execution Package Laden ACC 'Failed'
Invalid object name 'HDA_V_DMA.DM_HDA_HDA_V_DMA_Feit_OpenstaandeTaken_DMA_DMA_Feit_OpenstaandeTaken_T'.
Details:
SQL Server: 'cf15a5af617c.tr6225.westeurope1-a.worker.database.windows.net,11009'
SQL Procedure: 'DMA_TX.usp_CrDM_DMA_TX_Feit_OpenstaandeTaken_HDA_HDA_V_DMA_Feit_OpenstaandeTaken'
SQL Line Number: 12
SQL Error Number: 208
Invalid object name 'HDA_V_DMA.DM_HDA_HDA_V_DMA_Feit_OpenstaandeTaken_DMA_DMA_Feit_OpenstaandeTaken_T'.
Module: .Net SqlClient Data Provider System.Data
Is there a way to use IS NOT NULL as one of the join conditions on a lookup? I want to get the TOP (1) record where a field is not null, ordered by date. Currently I have to edit the _Clean stored procedure as custom code to add it, which is obviously not the best way to do things. Thanks
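The lookup the poster describes can be expressed in plain SQL as a filtered, ordered single-row select. A minimal sketch of the pattern using SQLite syntax (T-SQL would use `SELECT TOP (1) ... ORDER BY ... DESC`, typically inside an `OUTER APPLY`, instead of `LIMIT 1`); the table and column names here are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer_id INT, order_date TEXT, tracking_no TEXT);
INSERT INTO orders VALUES
  (1, '2020-01-01', NULL),
  (1, '2020-02-01', 'TRK-42'),
  (1, '2020-03-01', NULL);
""")

# Newest row per customer where the field is not null.
# T-SQL equivalent: SELECT TOP (1) tracking_no ... ORDER BY order_date DESC
row = conn.execute("""
    SELECT tracking_no
    FROM orders
    WHERE customer_id = 1 AND tracking_no IS NOT NULL
    ORDER BY order_date DESC
    LIMIT 1
""").fetchone()
print(row[0])  # TRK-42
```

This only illustrates the SQL shape; hand-editing the generated _Clean procedure, as the poster notes, is fragile because a redeploy regenerates it.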
Hi, we use TX as part of our value chain of systems using data. We need to be able to programmatically check whether an execution is done, so that the following jobs can start. I know TX can send an email when completed, but we really do not want to build the check on emails. Does anyone have a good idea on how to do this? Hope you can help. Best regards, Michael Vaisgaard
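One common pattern for this (not an official TimeXtender API) is to make the last step of the execution package write a completion row to a small status table, and let downstream jobs poll that table instead of parsing emails. A minimal sketch using SQLite as a stand-in for the database server; the `etl_status` table and its columns are hypothetical:

```python
import sqlite3

# SQLite stands in for the SQL Server that both TX and the downstream job can reach.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE etl_status (package TEXT, finished_at TEXT)")

def is_finished(conn, package):
    """Return True once the execution package has logged a completion row."""
    row = conn.execute(
        "SELECT 1 FROM etl_status WHERE package = ?", (package,)
    ).fetchone()
    return row is not None

# The final step of the execution package would run an INSERT like this
# (in TX this could be a custom step executing the SQL):
conn.execute("INSERT INTO etl_status VALUES ('Laden ACC', '2020-04-03 07:25:01')")

# Downstream jobs poll (with a sleep between attempts in real use):
print(is_finished(conn, "Laden ACC"))  # True
```

The downstream scheduler then only needs read access to that one table, and the contract survives TX version upgrades because it does not depend on TX's internal log schema.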
Hello, I am loading historical data into TX tables created from Excel files: one file for property data, one file for casualty data. After some time the file format changed; the two separate files were combined into one file and new fields were added. Can I create a new data source to load the new file but still use the original worksheet tab names, so that the data will still load into the existing tables? Thanks in advance. P.S. I am using Discovery Hub 18.104.22.168
Hello, which settings are needed to get Google Mail to send success and failure emails in TimeXtender? I tried it with:

Server name: smtp.gmail.com
Port: 25 / 465 / 587
Enable SSL/TLS: checked
Allow invalid certificates: checked
From Email: firstname.lastname@example.org
User name: email@example.com
Password: <my gmail password>
To Email: firstname.lastname@example.org

I get different failure responses:
Port 25: 5.7.0 Authentication required
Port 465: Syntax error, command not recognized
Port 587: 5.7.0 Authentication required
Without SSL/TLS: A secure connection is required for the SMTP server, or the client is not authenticated. Server response was: 5.7.0 Must issue a STARTTLS command first.

The responses are translated from German to English, so please don't take them literally. What are the correct settings here? Is it something in my Google account?
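For reference, Gmail's SMTP endpoint normally expects STARTTLS on port 587 (or implicit TLS on 465), and for accounts with 2-step verification it requires an app password generated in the Google account settings rather than the normal account password. A minimal sketch of that handshake with Python's smtplib; the addresses and password are placeholders, and nothing is actually sent here:

```python
import smtplib
from email.message import EmailMessage

# Settings that typically work for Gmail (assumption: account uses an app password).
GMAIL_SETTINGS = {
    "server": "smtp.gmail.com",
    "port": 587,        # STARTTLS; use 465 for implicit SSL instead
    "user": "email@example.com",
    "password": "<app password, not the normal Gmail password>",
}

def send_notification(subject, body, to_addr, settings=GMAIL_SETTINGS):
    """Send a plain-text mail via Gmail using STARTTLS on port 587."""
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = settings["user"]
    msg["To"] = to_addr
    msg.set_content(body)
    with smtplib.SMTP(settings["server"], settings["port"]) as smtp:
        smtp.starttls()  # answers the "Must issue a STARTTLS command first" response
        smtp.login(settings["user"], settings["password"])
        smtp.send_message(msg)
```

In TX terms this would correspond to port 587 with SSL/TLS enabled; the "Authentication required" responses on 25/587 usually indicate that the app-password step is the missing piece.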
We are trying to get a 10 GB CSV file into TimeXtender, but it takes very long for TimeXtender to load it, and sometimes we get an error (system out of memory). Is there a solution or something else we can try to get big CSV files into TimeXtender, or can we change the CSV file to get it loaded? We are currently using the CData adapter and have already tried the multiple/single file adapter. Thanks for the response!
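One workaround outside TimeXtender itself is to pre-split the large CSV into smaller files and load those with the multiple-file adapter, so no single transfer has to hold the whole file in memory. A minimal sketch of such a splitter with Python's csv module; the file names are illustrative:

```python
import csv
import itertools

def split_csv(src_path, rows_per_chunk, dest_pattern="chunk_{:03d}.csv"):
    """Split src_path into files of at most rows_per_chunk data rows,
    repeating the header in every chunk. Returns the file names written."""
    written = []
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)
        for i in itertools.count():
            # islice pulls rows lazily, so memory stays bounded by the chunk size.
            rows = list(itertools.islice(reader, rows_per_chunk))
            if not rows:
                break
            name = dest_pattern.format(i)
            with open(name, "w", newline="") as dst:
                writer = csv.writer(dst)
                writer.writerow(header)
                writer.writerows(rows)
            written.append(name)
    return written
```

Since every chunk carries the header, the chunks remain valid standalone CSV files that a multiple-file data source can pick up with a wildcard pattern.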
Hi, I am trying to connect to Zendesk using the CData connector and I am getting the following error. Does anyone have this working? 'Initiate OAuth' does not have a 'REFRESH' option; it only has NONE and GETANDREFRESH.

When using OAuth the 'Initiate OAuth' setting must be set to 'REFRESH'
Module: timeXtender
System.Exception
at TimeXtender.DataManager.DataSource_CData.CreateConnection()
at TimeXtender.DataManager.DataSource_CData.TestConnection()
at TimeXtender.DataManager.ConnectingThread.ExecuteConnectingThread(Object dummy)
When using OAuth the 'Initiate OAuth' setting must be set to 'REFRESH'
Module: timeXtender
TimeXtender.DataManager.ExceptionWrapperException
at TimeXtender.DataManager.ConnectingThread.HandleError()
at TimeXtender.DataManager.ConnectingThread.Execute(String title, Int32 progressSteps, List`1 actions)
at TimeXtender.DataManager.ConnectingThread.ExecuteFastAction(String title, Action action, IWin32Window parentForm, CancelBehaviors cancelBehavior, ErrorBehaviors errorBehavi
Hi, I am trying to retrieve data through an API using XML and I am getting the following error. The API requires that I use a wildcard to retrieve all values. My XML works if I use Postman / Swagger to test.

Error obtaining value for column 'ProductList.Product.ProductCode': Error parsing long value [ * ].
Details: Error parsing long value [ * ].
Module: System.Data.CData.XML
dvo200m.hq
at dvo200m.Uw.qM(String )
at dvo200m.Uw.u(String , Int32 , Int32 , Boolean )
at dvo200m.fZ.Na(String , Int32 , Int32 )
at dvo200m.ZG.H(Int32 , Int32 , Int32 )
Error obtaining value for column 'ProductList.Product.ProductCode': Error parsing long value [ * ].
Module: System.Data.CData.XML
dvo200m.hq
at dvo200m.ZG.H(Int32 , Int32 , Int32 )
at dvo200m.ZG.GetValue(Int32 , Int32 )
at CData.Sql.ResultSetBase.GetValue(Int32 colIndex, Int32 dataType)
at dvo200m.Zv.L(Int32 , Int32 , Int32 )
Error obtaining value for column 'ProductList.Product.ProductCode': Error parsing long value [ * ].
Module: System.Data.CData.XML
dvo200m.hq
at dvo200m.Zv.L
I have enabled simple mode on a project with 1 data source and 1 business unit. I have also updated the settings for all tables to support simple mode. When I try to deploy the project I get the error:

Could not load file or assembly 'Microsoft.SqlServer.BatchParser.dll' or one of its dependencies. The specified module could not be found.
Location: Business Unit 'ABC'.
Details: Could not load file or assembly 'Microsoft.SqlServer.BatchParser.dll' or one of its dependencies. The specified module could not be found.
Module: mscorlib
System.IO.FileNotFoundException
at System.Reflection.RuntimeAssembly.GetType(RuntimeAssembly assembly, String name, Boolean throwOnError, Boolean ignoreCase, ObjectHandleOnStack type)
at System.Reflection.RuntimeAssembly.GetType(String name, Boolean throwOnError, Boolean ignoreCase)
at Microsoft.SqlServer.Management.Common.ServerConnection.GetStatements(String query, ExecutionTypes executionType, Int32& statementsToReverse)
at Microsoft.SqlServer.Management.Common.S
Hi! Can you clarify: if I run a full load on a source-based incremental table, does it not update the _I table with the current max value? In our scenario, I run a full load very early in the morning and then run incremental loads during the day. But I have seen that the first incremental load of the day takes a very long time to run, which I suspect is because there are a lot of changes since the last incremental load, i.e. the full load's transfer is ignored by the incremental load? So the first incremental load more or less transfers the same data that the full load did? Version 22.214.171.124. Regards, Niclas
Hi, I have a large fact table in my DWH (a copy from a view in the stage). I tried to add partitions. It didn't help that much, so I removed them again. But since I removed them, the table can't be deployed; I get the error below. I can re-add the partition, but I still get the same error:

An error occurred during drop partition function. See exception details for the failing object.
An error occurred during drop user defined function. See exception details for the failing object: Drop failed for PartitionFunction 'FactSalesOrder_History_PartitionFunction'.
An exception occurred while executing a Transact-SQL statement or batch.
Partition function 'FactSalesOrder_History_PartitionFunction' is being used by one or more partition schemes.
Details:
SQL Server: '.'
SQL Procedure: ''
SQL Line Number: 2
SQL Error Number: 7706

I'm not sure how to find and drop whatever TX is complaining about. I tried DROP PARTITION FUNCTION FactSalesOrder_History_PartitionFunction and then get: Partition function 'FactS
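SQL Server error 7706 means a partition scheme still references the partition function, so the scheme has to be dropped before the function can be (and any table or index still placed on the scheme has to be rebuilt off it before the scheme drop succeeds). A small sketch that composes the T-SQL in the required dependency order; the function name comes from the error message, while the scheme name and the helper itself are illustrative:

```python
def drop_partitioning_script(function_name):
    """Compose T-SQL that removes partitioning objects in dependency order:
    schemes referencing the function first, then the function itself."""
    # Query to discover which schemes still reference the function
    # (sys.partition_schemes / sys.partition_functions are real catalog views).
    find_schemes = (
        "SELECT ps.name FROM sys.partition_schemes ps "
        "JOIN sys.partition_functions pf ON pf.function_id = ps.function_id "
        f"WHERE pf.name = '{function_name}';"
    )
    statements = [
        # Run find_schemes first and substitute each returned name here;
        # this scheme name is a hypothetical example.
        "DROP PARTITION SCHEME [FactSalesOrder_History_PartitionScheme];",
        f"DROP PARTITION FUNCTION [{function_name}];",
    ]
    return find_schemes, statements

query, stmts = drop_partitioning_script("FactSalesOrder_History_PartitionFunction")
# The scheme drop must come before the function drop, or error 7706 recurs.
print(stmts[0].startswith("DROP PARTITION SCHEME"))  # True
```

After the orphaned scheme and function are gone, a normal TX deploy of the (now unpartitioned) table should go through.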
I am using the Field Validation feature to test our ETL. Using this as source data: And these rules in the DSA: And this query to check for errors: I get the following result: The _L table is linking to the records in the _R table that were removed when they failed the validation step on FIRST_NAME. Also, the Warning level notification is not present in the table. As it stands, the _L, _R, and _M tables as joined above are showing incorrect, or at least confusing, information. Am I using them as they were intended to be used, and has anyone done anything to get around this? Thanks, Mark
Hi, I am trying to update my CData data source providers (update available), but when I click the update button, this is what I get:

Unable to update the provider. The provider is currently locked by the application.
Details: Unable to update the provider. ...
Module: timeXtender
System.Exception
at TimeXtender.DataManager.CDataComponentInstallHelper.UninstallComponent(DataSourceComponentModel dataSourceComponentModel, Boolean isUpdate)
at TimeXtender.DataManager.ManageCDataDataSourceProvidersCommand.UninstallProvider(Form parentWindow, CDataComponentModelWrapper componentModelWrapper, Boolean isUpdate)
Time: 2020-12-09 09:00:29
UTC: 2020-12-09 08:00:29
Title: CloudDWH - TimeXtender 126.96.36.199
Application: 188.8.131.52
Repository: 184.108.40.206 (in Azure)
SQL Server: Microsoft SQL Azure (RTM) - 12.0.2000.8 Oct 1 2020 18:48:35 Copyright (C) 2019 Microsoft Corporation

Any idea what is causing this and what to do to make it work? Thanks! M
Currently, we are getting an error when we run a transfer task with the ODX server. We are trying to read a PxPlus data source via an ODBC driver; TimeXtender correctly shows all available tables and columns, and synchronizing also seems to work fine. However, when we try to transfer the data, an error is returned. The source does not allow any schemas, yet it seems that TimeXtender nevertheless tries to invoke a query with a schema (I think the error is returned because of the '.' before the table name). We have tried some things in the Query Formatting and Character Replacement settings, but we can't get it to work properly. TimeXtender version: 220.127.116.11, ODX Server version: 20.10.1
Currently we are facing a problem with memory resources when we try to do a data export. When writing a huge file (4 GB), the memory allocated to the TX process takes up more resources than we would like (> 22 GB of RAM). Is there any possibility of breaking up the data export into partitions, so that the process takes up less RAM? I would expect some kind of option like the batch data cleansing setting in the table settings, but I can't seem to find it anywhere. For the data export we use the TimeXtender File Export 18.104.22.168 provider.
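Outside of TX, the usual way to keep a large export from ballooning memory is to stream rows in fixed-size batches instead of materializing the whole result set at once. A minimal sketch of the pattern with Python's csv module and a database cursor (sqlite3 stands in here for the warehouse connection); the function and its parameters are illustrative:

```python
import csv
import sqlite3

def export_in_batches(conn, query, out_path, batch_size=50_000):
    """Stream query results to a CSV file batch by batch, so at most
    batch_size rows are held in memory at any time."""
    cur = conn.cursor()
    cur.execute(query)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([d[0] for d in cur.description])  # header row
        while True:
            rows = cur.fetchmany(batch_size)  # bounded memory per batch
            if not rows:
                break
            writer.writerows(rows)
```

Peak memory then scales with batch_size rather than with the size of the exported file, which is the same idea as the batch data cleansing option the poster mentions, applied to the export side.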