Type String from the data source cannot be converted to type nvarchar

I have been banging my head against the wall on this one for a while now, and I would appreciate some help.

Here is the RSD-file

<api:script xmlns:api="" xmlns:xs="">

<!-- See Column Definitions to specify column behavior and use XPaths to extract column values from JSON. -->
<api:info title="data" desc="Generated schema file." xmlns:other="">
<!-- You can modify the name, type, and column size here. -->
<attr name="bildeLinke" xs:type="string" readonly="false" other:xPath="/json/data/bildeLinke" />
<attr name="bygning" xs:type="string" readonly="false" other:xPath="/json/data/bygning" />
<attr name="bygningsId" xs:type="string" readonly="false" other:xPath="/json/data/bygningsId" />
<attr name="del" xs:type="string" readonly="false" other:xPath="/json/data/del" />
<attr name="delSeq" xs:type="string" readonly="false" other:xPath="/json/data/delSeq" />
<attr name="dokumenterLinke" xs:type="string" readonly="false" other:xPath="/json/data/dokumenterLinke" />
<attr name="fag" xs:type="string" readonly="false" other:xPath="/json/data/fag" />
<attr name="forklaringKode" xs:type="string" readonly="false" other:xPath="/json/data/forklaringKode" />
<attr name="forvaltningsenhet" xs:type="string" readonly="false" other:xPath="/json/data/forvaltningsenhet" />
<attr name="forvaltningsenhetId" xs:type="string" readonly="false" other:xPath="/json/data/forvaltningsenhetId" />
<attr name="forvaltningsenhetTittel" xs:type="string" readonly="false" other:xPath="/json/data/forvaltningsenhetTittel" />
<attr name="kode" xs:type="string" readonly="false" other:xPath="/json/data/kode" />
<attr name="skaringFarge" xs:type="string" readonly="false" other:xPath="/json/data/skaringFarge" />
<attr name="skaringSporsmal" xs:type="string" readonly="false" other:xPath="/json/data/skaringSporsmal" />
<attr name="skaringSporsmalIkkeAktuell" xs:type="string" readonly="false" other:xPath="/json/data/skaringSporsmalIkkeAktuell" />
<attr name="skaringSporsmalKode" xs:type="string" readonly="false" other:xPath="/json/data/skaringSporsmalKode" />
<attr name="skaringSporsmalSeq" xs:type="string" readonly="false" other:xPath="/json/data/skaringSporsmalSeq" />
<attr name="skaringSvarKode" xs:type="string" readonly="false" other:xPath="/json/data/skaringSvarKode" />
<attr name="skaringSvarText" xs:type="string" readonly="false" other:xPath="/json/data/skaringSvarText" />
<attr name="spmFarge" xs:type="string" readonly="false" other:xPath="/json/data/spmFarge" />
<attr name="sporsmal" xs:type="string" readonly="false" other:xPath="/json/data/sporsmal" />
<attr name="sporsmal2" xs:type="string" readonly="false" other:xPath="/json/data/sporsmal2" />
<attr name="sporsmalFritekst" xs:type="string" readonly="false" other:xPath="/json/data/sporsmalFritekst" />
<attr name="sporsmalIkkeAktuell" xs:type="string" readonly="false" other:xPath="/json/data/sporsmalIkkeAktuell" />
<attr name="sporsmalKode" xs:type="string" readonly="false" other:xPath="/json/data/sporsmalKode" />
<attr name="sporsmalSeq" xs:type="string" readonly="false" other:xPath="/json/data/sporsmalSeq" />
<attr name="status" xs:type="string" readonly="false" other:xPath="/json/data/status" />
<attr name="svarKode" xs:type="string" readonly="false" other:xPath="/json/data/svarKode" />
<attr name="typeKartlegging" xs:type="string" readonly="false" other:xPath="/json/data/typeKartlegging" />
<attr name="typeKode" xs:type="string" readonly="false" other:xPath="/json/data/typeKode" />
</api:info>

<api:set attr="in.Header:Name#" value="Accept" />
<api:set attr="in.Header:Value#" value="application/json" />
<api:set attr="in.DataModel" value="DOCUMENT" />
<api:set attr="in.JSONPath" value="$.data" />
<api:set attr="in.EnablePaging" value="TRUE" />
<api:set attr="in.Page" value="1" />
<api:set attr="in.PageCount" value="17" />
<api:set attr="in.BaseURI" value="{page}" />
<api:set attr="in.ElementMapPath#" value="/json/meta/pagination/pages" />
<api:set attr="in.ElementMapName#" value="pages" />

<api:script method="GET">
<api:set attr="in.URI" value="[in.BaseURI | replace('{page}', 1)]"/>
<api:call op="jsonproviderGet" in="in" out="out">
<api:set attr="in.PageCount" value="[out.pages]"/>

<api:enum range="1..[in.PageCount]">
<api:set attr="in.Page" value="[_value]"/>
<api:set attr="in.URI" value="[in.BaseURI | replace('{page}', [in.Page])]"/>
<api:call op="jsonproviderGet" in="in" out="out">
<api:push item="out"/>
</api:call>
</api:enum>
</api:call>
</api:script>
</api:script>
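For readers following the paging logic: the script does a two-pass flow. A condensed, commented restatement of the same snippet (nothing new beyond the closing tags, which the post above omits):

```xml
<api:script method="GET">
  <!-- Pass 1: fetch page 1 so the mapped element (out.pages, taken from
       /json/meta/pagination/pages via in.ElementMapPath) reveals the page total -->
  <api:set attr="in.URI" value="[in.BaseURI | replace('{page}', 1)]"/>
  <api:call op="jsonproviderGet" in="in" out="out">
    <api:set attr="in.PageCount" value="[out.pages]"/>
    <!-- Pass 2: loop 1..PageCount, substituting each page number into the URI -->
    <api:enum range="1..[in.PageCount]">
      <api:set attr="in.Page" value="[_value]"/>
      <api:set attr="in.URI" value="[in.BaseURI | replace('{page}', [in.Page])]"/>
      <api:call op="jsonproviderGet" in="in" out="out">
        <api:push item="out"/>
      </api:call>
    </api:enum>
  </api:call>
</api:script>
```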

Note that every column is typed STRING, even though the JSON feed returns a mix of INT and STRING values.

When I run this in TX I get the following error, even though all the columns are STRING:

The given value of type String from the data source cannot be converted to type nvarchar of the specified target column.
String or binary data would be truncated.


String or binary data would be truncated.
Module: System.Data
at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata, Boolean isNull, Boolean& isSqlType, Boolean& coercedToDataFeed)

The given value of type String from the data source cannot be converted to type nvarchar of the specified target column.
Module: System.Data
at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata, Boolean isNull, Boolean& isSqlType, Boolean& coercedToDataFeed)
at System.Data.SqlClient.SqlBulkCopy.ReadWriteColumnValueAsync(Int32 col)
at System.Data.SqlClient.SqlBulkCopy.CopyColumnsAsync(Int32 col, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.CopyRowsAsync(Int32 rowsSoFar, Int32 totalRows, CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinued(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsync(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestContinuedAsync(BulkCopySimpleResultSet internalResults, CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestAsync(CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalAsync(CancellationToken ctoken)
at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServerAsync(Int32 columnCount, CancellationToken ctoken)
at System.Data.SqlClient.SqlBulkCopy.WriteToServer(IDataReader reader)
at TimeXtender.DataManager.StepTransferSSISExecute.BulkCopySource(Boolean executeIncrementalLoad, DataSource dataSource, VariableResolveObject dynamicResolveObject)
at TimeXtender.DataManager.StepTransferSSISExecute.DoStageDataSource(Boolean isIncrementalAllowed, DataSource dataSource, StepSetup stepSetup, VariableResolveObject dynamicResolveObject, Boolean loadPrimaryKeysOnTransfer)
at TimeXtender.DataManager.StepTransferSSISExecute.DoStage(StepSetup stepSetup, VariableResolveObject dynamicResolveObject)

When I use a real page value in place of {page} (1, 2, 3 and so on), as below, there is no error and the table is populated:

	<api:set attr="in.BaseURI" value="" />

The response metadata in Postman looks like this:

    "code": 200,
    "meta": {
        "pagination": {
            "pages": 17,
            "page": 2,
            "limit": 5000,
            "total": 83482
        }
    }

The RSD code above works fine for other tables.



Best answer by Thomas Lind 30 May 2024, 14:36


10 replies


Hi @aftabp 

My guess is that the issue is the length of one of the fields.

Add this to the Other field in the setup of the REST provider and synchronize afterwards.

[Screenshot of the provider setting is missing from this post.]

This should make it treat all string fields as nvarchar(max), and then it should not complain.

Then check which field contains the large value that makes it fail; by default, string columns can contain at most 2000 characters.
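As far as the thread shows (the screenshot is lost, but the final reply names the property), the setting is a single provider property entered in the Other box; a sketch of what that box would contain:

```
DefaultColumnSize=4000
```

Per the CData behavior described later in this thread, string columns whose size is not declared in the RSD fall back to this default, so raising it widens every string column at once.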



I would advise you to specify the proper length for each column if you can. The SQL engine reserves resources based on the declared column length: for a varchar, SQL Server assumes an average of 50% of the field's length. If you have many fields containing only a few characters, SQL will reserve memory for 50% of 4000 characters for each such field in your queries. Some fields may be unlimited in length in your source; in that case I would adapt the .rsd to take the LEFT x characters, to ensure you do not get bogged down performance-wise. Alternatively, if you really need the large text content, you can put the key fields plus the large field as varchar(max) in a separate table.
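Following this advice, one way to pin a length is directly in the RSD. This is a hedged sketch: `columnsize` is the attribute CData schema files commonly use to declare a column width, but verify against your provider's documentation; the column name is taken from the schema above.

```xml
<!-- Free-text field capped explicitly so it maps to a bounded nvarchar
     instead of the provider's default string size -->
<attr name="sporsmalFritekst" xs:type="string" columnsize="4000" readonly="false" other:xPath="/json/data/sporsmalFritekst" />
```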

@Thomas Lind 

Thank you for your reply.

I have tried your suggestion but got the same result. I am now trying to drop text fields that might be longer than 4000 characters.

There is another test I did: I tried extracting data one page at a time, which went fine. After doing the first three pages individually, I tried a bulk extraction:

<api:set attr="in.BaseURI" value="{allPages}" />

This failed, which is strange, because the extraction worked fine one page at a time.


Hi @aftabp 

Is it not possible to apply pagination, then? I mean, that is essentially one page at a time.

@Thomas Lind 

The pagination is already there.

However, just now it worked. I don't know if it was the vendor who changed something, your suggestion from three days ago which I applied before the weekend, or a combination of both.




The vendor could have changed the source data that caused the issue; I would imagine that was the cause in the first place. My suggestions were all about figuring out what caused the issue, but if someone fixed the failing data on the other end, it would just start working without you knowing what the problem was.

I contacted the vendor first, before posting the issue here. The vendor denied making any changes. I will try to figure out what fixed the issue.

On the other hand, I was unaware of the upper limit for the fields in TX.

Thank you for all the suggestions.


Hi @aftabp 

It is not a limit we enforce; it is a CData limit, where the default size is always 2000, and with RSD files the default size is used unless one is specifically set.


Hi @aftabp, is the issue resolved? Please let us know if you have any follow-up questions.

@Christian Hauggaard 
Yes, it is.
I did as Thomas suggested, but it didn't work at first. Later in the day it worked.
I guess the vendor also changed something between the two tries.
Anyway, DefaultColumnSize is still set to 4000 as Thomas suggested.