Solved

Error in custom data varbinary values

  • 9 June 2023
  • 7 replies
  • 132 views

In the Custom Data insert I have one field with the varbinary(8000) data type.

When I try to insert into that column I get an error:

 

The given value of type String from the data source cannot be converted to type varbinary of the specified target column.
Failed to convert parameter value from a String to a Byte[].
Invalid cast from 'System.String' to 'System.Byte[]'.

Details:

Invalid cast from 'System.String' to 'System.Byte[]'.
Module: mscorlib
System.InvalidCastException
at System.Convert.DefaultToType(IConvertible value, Type targetType, IFormatProvider provider)
at System.Data.SqlClient.SqlParameter.CoerceValue(Object value, MetaType destinationType, Boolean& coercedToDataFeed, Boolean& typeChanged, Boolean allowStreaming)

Failed to convert parameter value from a String to a Byte[].
Module: System.Data
System.InvalidCastException
at System.Data.SqlClient.SqlParameter.CoerceValue(Object value, MetaType destinationType, Boolean& coercedToDataFeed, Boolean& typeChanged, Boolean allowStreaming)
at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata, Boolean isNull, Boolean& isSqlType, Boolean& coercedToDataFeed)

When I try to custom insert varbinary data I get an error on deploy.

Error code:
The given value of type String from the data source cannot be converted to type varbinary of the specified target column.
Module: System.Data
System.InvalidOperationException
at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata, Boolean isNull, Boolean& isSqlType, Boolean& coercedToDataFeed)
at System.Data.SqlClient.SqlBulkCopy.ReadWriteColumnValueAsync(Int32 col)
at System.Data.SqlClient.SqlBulkCopy.CopyColumnsAsync(Int32 col, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.CopyRowsAsync(Int32 rowsSoFar, Int32 totalRows, CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinued(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsync(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestContinuedAsync(BulkCopySimpleResultSet internalResults, CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestAsync(CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalAsync(CancellationToken ctoken)
at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServerAsync(Int32 columnCount, CancellationToken ctoken)
at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table, DataRowState rowState)
at TimeXtender.DataManager.StepTableStructureCustomDataDeploy.DoAbstractStep(StepSetup stepSetup)

 

I would need some help here.

 

Br


Best answer by rory.smith 12 June 2023, 08:54


7 replies

Userlevel 4
Badge +5

Hey @kkosir ,

It is hard to judge from just the error message. Can you share some more info or screenshots?

My gut feeling tells me that somewhere SQL identifies the varbinary as a string and the conversion back to varbinary goes wrong. 
Or maybe the data type of the destination column is not the same as the source data type?

= Daniel

Hi sure,

 

My table has multiple varbinary fields; they end with _HK. The table is already populated, but directly from SQL (with INSERT statements).

We don't want users to be forced to write INSERT statements themselves; I want to transfer the data into the table via a custom insert, since this otherwise works fine in TX.

But when trying to insert binary data via the custom insert, this does not work. I assume there is a bug in the code, as the binary data is being treated as a string when in reality it is not.
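
For illustration, the direct insert that works against SQL Server looks roughly like this (table and column names here are made up, not my real ones):

-- minimal sketch with hypothetical names: a varbinary literal is written as 0x..., not as a quoted string
INSERT INTO dbo.MappingTable (Source_HK, Target_HK, MappingName)
VALUES (0x1A2B3C4D, 0x5E6F7A8B, 'Example mapping');

Because the _HK values are binary literals rather than strings, SQL Server stores them without any conversion.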

 

Br

Userlevel 4
Badge +5

Dear @kkosir ,

I have found that the column order should be taken into account.

So when you insert with a query and the column order of the query does not match the column order of the destination table, TX still inserts the data, but it may end up in the wrong columns.

To be honest, I don't really like to use table inserts except when I have to, and even then I still use a view.
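
For example, a minimal sketch of such a view (made-up object names), exposing the columns in exactly the order of the destination table:

-- hypothetical names: the view fixes the column order the custom insert selects from
CREATE VIEW dbo.MappingSource_v AS
SELECT
    Source_HK,      -- varbinary(8000), first column in the destination table
    Target_HK,      -- varbinary(8000)
    MappingName     -- nvarchar(100)
FROM dbo.MappingSource;

The custom insert then only selects from the view, so if the column order ever changes it only has to be fixed in one place.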

Why are you using the table insert? Isn't there another (TX-native) way to do this?

= Daniel

The order is correct … we use this function a lot for mapping tables in the DWH.

Userlevel 6
Badge +5

Hi @kkosir, I have managed to reproduce the issue. As a workaround, could you please try adding a query table to your ODX data source, similar to the below, where you cast the column as varbinary?
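
Something along these lines, a minimal sketch with made-up table and column names, casting the affected columns back to varbinary:

-- hypothetical query table definition for the ODX data source
SELECT
    MappingName,
    CAST(Source_HK AS varbinary(8000)) AS Source_HK,
    CAST(Target_HK AS varbinary(8000)) AS Target_HK
FROM dbo.MappingSource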

 

Userlevel 5
Badge +7

Hi @kkosir ,

 

Perhaps explicitly converting the literal to varbinary in the Custom Table Insert would work: SELECT CONVERT(VARBINARY, '0x00173672') AS [BinaryTest] works for me on a varbinary(32).

@daniel: you can actually use the INSERT column definitions to deal with the ordering in your SELECT, and also specify that the TX system fields should be ignored.
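
Putting both points together, a minimal sketch of what such a Custom Table Insert could look like (hypothetical table and column names, and assuming it is written as a full INSERT ... SELECT):

-- hypothetical names; the explicit column list controls the ordering and leaves out the TX system fields
INSERT INTO dbo.MappingTable (Source_HK, Target_HK, MappingName)
SELECT
    CONVERT(varbinary(8000), '0x00173672', 1),  -- style 1 reads the '0x...' string as hex bytes; without it the characters themselves are converted
    CONVERT(varbinary(8000), '0x00173673', 1),
    'Example mapping';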

Userlevel 6
Badge +5

Hi @kkosir, were you able to implement a workaround for this issue?
