Question

Failed to read metadata from XmlDocument for endpoint X with transformation 'T' Input string was not in a correct format.

  • 16 September 2024

The sync of my ODX data source returns the following error.

The data source is a REST API endpoint of the form URL/{foreign_ID}, where foreign_ID is a dynamic value that comes from a different endpoint within the same data source. The option “Perform exhaustive meta data scan” is enabled. The error returned is this:

Failed to read metadata from XmlDocument for endpoint X with transformation 'T'
Input string was not in a correct format.

There is no further information. 

 

The sync does work when I replace foreign_ID with a fixed value. 

 

The endpoint contains a field with a lot of values that look like integers, but some values contain letters. Since the above error message is the most verbose info I’ve got, I assumed that’s probably the cause. However, this conflicts with my understanding of what an “exhaustive meta data scan” should do. 
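For illustration, here is a rough Python sketch of the mechanism I suspect; this is not TimeXtender’s actual code, and the values are made up to mirror my case. (Notably, “Input string was not in a correct format.” is the exact message .NET’s Int32.Parse throws as a FormatException, which fits the mixed-values theory.)

    # Illustration only: naive sample-based type inference, not TimeXtender's code.
    def infer_type(sample):
        try:
            for v in sample:
                int(v)
            return "INT"
        except ValueError:
            return "NVARCHAR(500)"

    def load_column(values, sql_type):
        # Casting the full column with the inferred type fails on mixed values.
        if sql_type == "INT":
            return [int(v) for v in values]
        return [str(v) for v in values]

    article_numbers = ["800900", "800901", "A12345"]

    # If only a subset is scanned, the column looks like an INT ...
    inferred = infer_type(article_numbers[:2])  # -> "INT"

    # ... and the full load then fails, analogous to
    # "Input string was not in a correct format."
    load_column(article_numbers, inferred)      # raises ValueError on 'A12345'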

 

Any help is appreciated. 

Dear @Benny,


What I would do is turn on the logging of the connector. You can find the logging options in the settings: enter a folder and name for the log, set the verbosity to 3 or 4 (maybe 5, but that’s really verbose) and run the sync. This might help you understand the issue better, or at least make it easier to check things.

TX shows this error because the API needs an input in the filter section; if it isn’t given one, you get this error message. You’ve done nothing wrong, this is just how APIs work.

I’m not really sure what you mean with the part about the exhaustive meta data scan. Are you afraid TX does not scan the correct datatypes?

Hope this helps

- Daniel


Hi @Benny 

How is the source endpoint with the dynamic value set up, and are you sure the name is foreign_ID and not foreign_Id or foreign_id?


Hi @Thomas Lind, the dynamic value is correctly cased. 

The endpoint is set up like so:

  • Dynamic path like xxx/{foreign_ID}
  • Table flattening with XSLT
  • Only list flattened tables
  • HTTP method: GET
  • Perform exhaustive meta data scan
  • Dynamic values: enabled (pointing to correct endpoint)
  • Override headers: Accept = application/xml

The problem seems to be in the determination of the datatypes. I suspect this because I managed to create a quick fix by changing the XSLT. The column I suspected of causing problems was altered to <xsl:value-of select="concat('X--', $pathTU/@ArticleNumber)" />, i.e. I forced an obvious string prefix onto every value. The sync then runs flawlessly and deduces an NVARCHAR(500) datatype for the column. When I remove the concatenation, I get the error.

As I said, a lot of the ArticleNumbers are castable to INT, but not all. This seems to confirm my suspicion that the datatype is deduced based on a subset, in spite of my “exhaustive meta data scan” setting. 
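To double-check this outside of TimeXtender, something like the rough sketch below could list the offending values by querying the API directly; the base URL, the id list and the XML layout are placeholders for my real setup.

    # Rough sketch: find ArticleNumbers that are not castable to int.
    # BASE_URL, the ids and the attribute layout are placeholders.
    import urllib.request
    import xml.etree.ElementTree as ET

    BASE_URL = "https://example.com/api"       # placeholder
    foreign_ids = ["x", "y"]                   # ids taken from the other endpoint

    for fid in foreign_ids:
        with urllib.request.urlopen(f"{BASE_URL}/{fid}") as resp:
            root = ET.fromstring(resp.read())
        for node in root.iter():
            value = node.get("ArticleNumber")  # attribute, as in my XSLT
            if value is None:
                continue
            try:
                int(value)
            except ValueError:
                print(f"{fid}: non-integer ArticleNumber {value!r}")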


Hi @Benny 

When you did the exhaustive meta data scan did the data types change for the fields in the source endpoint with the dynamic value?


@Thomas Lind, I tried once where it did change and once where it didn’t. Both tries resulted in the same error. 


Hi @Benny 

What did it change from and to, and was it specifically the field foreign_ID? When it changed back, did you do anything to the setup to make this happen?

 

Have you tried applying a datatype override to this specific field to make it an nvarchar(50) or similar, and did that also make it fail?


The initial and intended setting is URL/{foreign_ID} (so foreign_ID is dynamic and comes from a different endpoint). 
I looked up two specific foreign_IDs: one where ArticleNumber was castable to INT and one where it wasn’t. Let’s say that foreign_ID = x has ArticleNumber = 800900 and foreign_ID = y has ArticleNumber = A12345.

 

I then changed the endpoint: once to URL/x and once to URL/y. Both times the sync ran fine. Both times I also did a transfer and checked the datatype for ArticleNumber: the first time it was INT and the second time it was NVARCHAR(500). Changing the endpoint back to use a dynamic foreign_ID gave the error again. This suggests it decides that ArticleNumber is an INT, and ArticleNumbers like A12345 then cause the error. I have no way to verify this, because I don’t have easy access to the result of the sync. 

 

I have not yet tried to override datatypes. 


Hi @Benny 

If you do get access to the data source in TimeXtender Data Integration, you should try to make a rule that forces that field to be an nvarchar, no matter what it thinks it is.

That said, it should also flag an issue if it thinks the field is an int when you use the endpoint in a data area.

