Solved

ODX to Data Warehouse - SqlException - Error code 4815

  • 11 September 2022
  • 6 replies
  • 139 views

After trying to import data from a REST source in the ODX, I got an SqlException with error code 4815.

After a bit of Googling I ended up on this page, which told me it had something to do with SqlBulkCopy not being able to match the columns between source and destination, or a column value being larger than the destination allows (e.g. a string of length 200 going into an nvarchar(100)).

Because TimeXtender defines the columns itself, it probably wasn't a mismatch between source and destination columns.

Because the RSD I created didn't have a column size specified, all my string-type fields allowed 2000 characters. Thinking that none of the string fields would be larger than that, I started changing int to double, then all the fields to string, and later even to text. None of this worked, which sent me back to the start. After properly examining the Parquet file, I found one record with a text field that exceeded the 2000-character limit.
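If you can get the Parquet rows into Python (e.g. via pyarrow's `read_table(...).to_pylist()`), a scan like the following pinpoints exactly which record and column overflow the limit. The function itself is a plain sketch on dicts; the column name and row data below are made up for illustration:

```python
def find_oversized_values(rows, max_len=2000):
    """Return (row_index, column, length) for every string value longer
    than max_len -- i.e. the values that overflow an nvarchar(2000)
    destination column."""
    offenders = []
    for i, row in enumerate(rows):
        for column, value in row.items():
            if isinstance(value, str) and len(value) > max_len:
                offenders.append((i, column, len(value)))
    return offenders

# Hypothetical sample: one record with an oversized text field
rows = [
    {"id": 1, "example": "short text"},
    {"id": 2, "example": "x" * 2500},  # exceeds the 2000-char default
]
print(find_oversized_values(rows))  # -> [(1, 'example', 2500)]
```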

The next time I get this error I will do the following steps:

1. Examine the data types of the API (OpenAPI definition, documentation, or inside the application itself)
2. Examine the data returned by the API in the Parquet file (viewer)
    This viewer can size the columns based on their values, so you can easily spot which one is large and may need a MAX length.
3. Change the offending column to xs:type="text" and columnsize="0", which seems to change it to MAX.

After all that, refresh by doing a new Synchronize, recreate the table with the new schema, deploy and execute the changes, and then you are done... hopefully...

<attr name="example" xs:type="text" columnsize="0" readonly="false" other:xPath="/json/data/example" />

 


Best answer by sanderuden 13 September 2022, 16:42



This method also works if you run into the same error code with a different data source (e.g. Dynamics 365). Only step 3 is different, because you need to change the type of the created column in the DSA or DW instead.


Hi Sander
When you add a column size, it will be used instead of the default column size. You can control this as explained here:
legacysupport.timextender.com/hc/en-us/articles/360001029663-Add-a-CData-data-source#default-data-type-value-lengths

Also, since 0 seems to mean an indefinite length, you can use that; in the past I have used 8000, as it also resulted in NVarChar(MAX) data types.
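As a sketch of that mapping (my reading of this thread, not official CData behaviour): 0 means indefinite, and anything beyond what a plain nvarchar(n) can hold also ends up as MAX.

```python
def sql_string_type(columnsize, unicode_type=True):
    """Sketch of how a columnsize could map to a SQL Server string type.
    Assumption based on this thread, not documented provider behaviour:
    0 means indefinite, and a size beyond the nvarchar(4000) /
    varchar(8000) ceiling can only be represented as MAX."""
    prefix = "nvarchar" if unicode_type else "varchar"
    limit = 4000 if unicode_type else 8000
    if columnsize == 0 or columnsize > limit:
        return f"{prefix}(MAX)"
    return f"{prefix}({columnsize})"

print(sql_string_type(2000))  # -> nvarchar(2000)
print(sql_string_type(0))     # -> nvarchar(MAX)
print(sql_string_type(8000))  # -> nvarchar(MAX)
```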

 

Hi Thomas,

That is good to know, but not always a solution. In my case I needed to change the size to MAX, which should never be a default, because MAX columns are stored differently and slow down performance. (more info)

Question: is it also possible to change the default data type for strings from varchar to nvarchar? I know it's stored using twice the bytes, but we would consider this per connection.


Hi Sander
It does not look like there is an option to do this conversion.
It is also hard to do a transfer to nvarchar when all fields have different lengths, without having to convert each field individually.

Here are the options for the input area and the fields in it.

Input, Output, and Column Parameters

The api:info keyword has additional parameters contained within its scope that define columns as well as script inputs and outputs. Note these parameters are not API Script keywords, but additional information for api:info.

attr

The attr parameter describes a table column and includes at least two attributes: name and data type. The name must be an alphanumeric string.

Other optional attributes provide more information about the column and enable the provider to represent these columns correctly in various tools.

<attr
  name="Name"
  xs:type="string"
  other:xPath="name/full"
  readonly="true"   
  columnsize="100"
  desc="The name of the person."
/>
  • name: The alphanumeric string that defines the name of the column.
  • xs:type: The data type of the column. The string, int, double, datetime, and boolean types are supported.
  • other: Attributes prefixed with 'other:' that provide extra information. These other properties can be operation specific. For example, other:xPath specifies the XPath to a node in a local or remote resource. The XPath can be relative to a RepeatElement.
  • desc[ription]: A short description of the column.
  • key: Whether the column is part of the primary key in the table.
  • readonly: Whether the column can be updated. Allowed values are true and false.
  • req[uired]: Whether the column must be specified in an insert. Allowed values are true and false.
  • def[ault]: The default value for the column if none is specified.
  • values: A comma-separated list of the valid values for the column. If specified, the engine will throw an error if the specified column value does not match one of the allowed values.
  • references: The foreign key to the primary key of another table. The foreign key is specified with the following syntax: table.key. For example: "Employees.EmployeeId".
  • columnsize: The maximum character length of a string or the precision of a numeric column. The precision of a numeric column is the number of digits.
  • scale | decimaldigits: The scale of a decimal column. The scale is the number of digits to the right of the decimal point.
  • isnullable: Indicates whether the column accepts null values. Note that this does not prevent sending or receiving a null value.
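The attr attributes listed above can be read with any XML parser. A minimal sketch using Python's standard library; note the namespace URIs below are placeholders for illustration only — a real RSD file declares its own xs: and other: namespaces at the root:

```python
import xml.etree.ElementTree as ET

# Placeholder namespace URIs (a real RSD declares the actual ones)
XS = "http://example.com/xs"
OTHER = "http://example.com/other"

SNIPPET = f"""<attr xmlns:xs="{XS}" xmlns:other="{OTHER}"
  name="Name"
  xs:type="string"
  other:xPath="name/full"
  readonly="true"
  columnsize="100"
  desc="The name of the person." />"""

elem = ET.fromstring(SNIPPET)
name = elem.get("name")                      # plain attribute
xs_type = elem.get(f"{{{XS}}}type")          # namespaced attribute
columnsize = int(elem.get("columnsize", "0"))
readonly = elem.get("readonly") == "true"

print(name, xs_type, columnsize, readonly)  # -> Name string 100 True
```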

The CData documentation for .rsd files makes me feel it should somehow be possible to apply a truncate(n) string formatter on the value in a column, which could enforce a maximum length. The script may be a bit involved though; it might be a nice tutorial to add to the .rsd FAQ.

Hi Rory,

What you are suggesting is possible if you specify the output by creating a new object.

Example:

<api:call op="jsonproviderGet" in="callIn" out="callOut">
    <api:set attr="output.truncatedString" value="[callOut.reallyLongString | substring(0,2000)]"/>
    <api:push item="output" />
</api:call>

I would NOT advise anyone to do this unless necessary, because it results in an incomplete dataset.
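The substring trick above, mirrored in plain Python for clarity; returning a flag for truncated records at least makes the data loss visible:

```python
def truncate_field(value, max_len=2000):
    """Mimic the RSD substring(0, max_len) formatter. Returns the
    (possibly shortened) value and whether anything was cut off."""
    if value is not None and len(value) > max_len:
        return value[:max_len], True
    return value, False

text, truncated = truncate_field("x" * 2500)
print(len(text), truncated)  # -> 2000 True
```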

 
