
Good Day, 

 

We are receiving the following error message: 

Finished executing project 'COMPANY' Execution Package 'Client execution'
Execution failed
Start Time : 5/24/2024 7:57:22 AM End Time : 5/24/2024 7:57:22 AM on server: NAME
-Execute Execution Package Client execution 'Failed'
    The conversion of a nvarchar data type to a datetime data type resulted in an out-of-range value.
    
    Details:
    
    SQL Server: 'NAME'
    SQL Procedure: ''
    SQL Line Number: 1
    SQL Error Number: 242
    
    The conversion of a nvarchar data type to a datetime data type resulted in an out-of-range value.
    Module: .Net SqlClient Data Provider
    System.Data.SqlClient.SqlException
       at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
       at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
       at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)
       at System.Data.SqlClient.SqlDataReader.TryHasMoreRows(Boolean& moreRows)
       at System.Data.SqlClient.SqlDataReader.TryReadInternal(Boolean setTimeout, Boolean& more)
       at System.Data.SqlClient.SqlDataReader.Read()
       at System.Data.SqlClient.SqlBulkCopy.ReadFromRowSourceAsync(CancellationToken cts)
       at System.Data.SqlClient.SqlBulkCopy.CopyRowsAsync(Int32 rowsSoFar, Int32 totalRows, CancellationToken cts, TaskCompletionSource`1 source)
       at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinued(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
       at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsync(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
       at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestContinuedAsync(BulkCopySimpleResultSet internalResults, CancellationToken cts, TaskCompletionSource`1 source)
       at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestAsync(CancellationToken cts, TaskCompletionSource`1 source)
       at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalAsync(CancellationToken ctoken)
       at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServerAsync(Int32 columnCount, CancellationToken ctoken)
       at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DbDataReader reader)
       at TimeXtender.DataManager.StepTransferSSISExecute.DoDataWarehouse(StepSetup stepSetup, VariableResolveObject dynamicResolveObject)
       at TimeXtender.DataManager.StepTransferSSISExecute.DoAbstractStep(StepSetup stepSetup)

 

When we disable the load of this table, the execution completes successfully. When it is enabled, the load fails. 

How can we resolve this issue? Thanks in advance. 

Hello @Marc ,

It is probably due to a wrong date entry in that source; the data that has been entered is incorrect. SQL does not care while the value is an nvarchar, but the conversion to datetime does, and that throws the error.
In what part of the load process (odx / dsa / mdw) does this error occur?

If it is in the odx, then just keep the nvarchar and use something like try_cast([TDate] as datetime) in a field transformation to turn it into a datetime without an error. Afterwards you can find the wrong date by looking for empty cells in the converted date column and comparing them with the odx table (or load the same column twice and keep one as nvarchar), check what is wrong, and have it fixed in the source.
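
A minimal sketch of what that transformation amounts to in plain T-SQL (the table and column names here are placeholders, not from the actual project):

SELECT
    TDate                        AS TDate_Raw,        -- original nvarchar value, kept as-is
    TRY_CAST(TDate AS datetime)  AS TDate_Converted   -- NULL when the value is invalid or out of range
FROM dbo.SourceTable;

TRY_CAST needs SQL Server 2012 or later; on older versions a CASE expression with ISDATE() would be the fallback.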

 

If it happens in the odx, then take off the 'override data type' setting so the conversion, and with it the error, does not happen there.

 

hope this helps

= Daniel


@Marc Like Daniel said, you can do that.

But if you aren't able to change the source, you can create a new field containing the try_cast of your field, using the old field as the input. Then right-click the original field and set it to raw field only, so it only exists in your raw table.

 


Hi @Marc 

Besides the suggestions above, I have also found that the culture settings of the server can cause issues.

For example, America uses mm/dd/yyyy as a date format while Europe uses dd/mm/yyyy, and the conversion fails when the day value is too large to be read as a month.
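
To illustrate with a made-up value, the same string can parse or fail depending on which order the server expects (style 101 is US mm/dd/yyyy, style 103 is dd/mm/yyyy):

SELECT
    TRY_CONVERT(datetime, N'24/05/2024', 101) AS ParsedAsUS,        -- NULL: there is no month 24
    TRY_CONVERT(datetime, N'24/05/2024', 103) AS ParsedAsEuropean;  -- 2024-05-24 00:00:00

With a plain CAST or CONVERT the first case raises the same error 242 shown in the log above.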


That can be a real issue, thanks @Thomas Lind for the addition. Can this also be an issue when loading a table from a database, or is it usually an issue when loading from flat files / CSVs and the like?


Hi @daniel 

Yes, generally it is an issue for file-based data sources, but if conversions are being done it can happen for others as well.

If the data source reads a date and bases it on differing culture settings, it will generally see it as a string unless you state a source culture in the setup.

Once it sees it as a datetime it should not give any issues, as datetime follows a standard that is independent of culture.


@daniel , Thank you for your support. 

Unfortunately, the proposed solution doesn't work. The error occurs in the transformation layer, where we only have 2 fields that are converted from nvarchar to datetime.

Are there any other potential solutions? 

The data loaded successfully until the 16th of May, when this situation came up. 

@Thomas Lind , most likely it is not a culture issue, as that would likely have surfaced earlier. 

Your support here is appreciated.

Regards, Marc


@Marc You can try the raw-only solution with a new field, split the string with SUBSTRING (it's tedious), and then use DATEFROMPARTS(year, month, day) filled in with your substring parts, as sketched below. It's not the prettiest solution, but it might just work for you.
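
Roughly what that could look like, assuming the raw value has a fixed dd/mm/yyyy layout (the names and character positions are placeholders and would need adjusting to the real format):

SELECT
    DATEFROMPARTS(
        CAST(SUBSTRING(TDate_Raw, 7, 4) AS int),   -- year  (characters 7-10)
        CAST(SUBSTRING(TDate_Raw, 4, 2) AS int),   -- month (characters 4-5)
        CAST(SUBSTRING(TDate_Raw, 1, 2) AS int)    -- day   (characters 1-2)
    ) AS TDate_Rebuilt
FROM dbo.SourceTable;

DATEFROMPARTS returns a date, so wrap it in a CAST to datetime if the time part is needed.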


Hi,

I would look at the raw data before the string-to-datetime conversion and find out which record(s) are causing this. It may very well be that data has been added to the source in an incorrect format, which should be solved at the source instead of covering the problem up with SQL.

If you can load the table as nvarchar fields, you can look at the different values and you will probably spot the outliers quickly, for example with a query like the one below. Otherwise, filtering down records until you are left with a small failing set works well.
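
Something like this (names are placeholders) lists only the rows whose nvarchar value does not convert, so the bad entries can be traced back and fixed in the source:

SELECT *
FROM dbo.SourceTable
WHERE TDate_Raw IS NOT NULL
  AND TRY_CAST(TDate_Raw AS datetime) IS NULL;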


@Thomas Lind , @rory.smith , @daniel , @Bernarddb , thank you all for your valuable input. The client had made changes to the raw data, which caused this issue. The situation has been resolved. 

Enjoy the remainder of your day.

 

Regards, Marc


Hello @Marc ,

Perfect. The best way is to solve it in the source.

I love it when problems solve themselves :)

Take care

