
Azure Data Factory Oracle Data Source - Error executing meta data pipeline



Symptoms:

The following error occurs with a new Azure Data Factory - Oracle data source when executing the Metadata Transfer task in the TimeXtender Data Integration Desktop Ingest instance.

Exception Type: System.Exception
Message: Error executing meta data pipeline:
Input: {
"source": {
"type": "OracleSource",
"oracleReaderQuery": "SELECT \r\n c.OWNER,\r\n c.TABLE_NAME,\r\n c.COLUMN_NAME,\r\n COALESCE\r\n (\r\n (\r\n SELECT \r\n 1\r\n FROM \r\n
Error: {
"errorCode": "2200",
"message": "Failure happened on 'Source' side. ErrorCode=UserErrorFailedToConnectOdbcSource,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ERROR [08S01] [Microsoft][ODBC Oracle Wire Protocol driver]Socket closed.\r\nERROR [08001] [Microsoft][ODBC Oracle Wire Protocol driver][Oracle]ORA-12203: unable to connect to destination,Source=Microsoft.DataTransfer.ClientLibrary.Odbc.OdbcConnector,''Type=System.Data.Odbc.OdbcException,Message=ERROR [08S01] [Microsoft][ODBC Oracle Wire Protocol driver]Socket closed.\r\nERROR [08001] [Microsoft][ODBC Oracle Wire Protocol driver][Oracle]ORA-12203: unable to connect to destination,Source=,'",
"failureType": "UserError",
"target": "Copy Table",
"details": []
}

 

Cause:

There may be more than one cause of the Azure metadata pipeline failing with the “[Oracle]ORA-12203: unable to connect to destination” error. Use the following steps to identify the specific error that is causing the pipeline to fail.

  1. Log on to portal.azure.com.
  2. Click the Azure Data Factory resource used by your TimeXtender Azure Data Factory - Oracle data source.
  3. Click the “Launch Studio” button to open Azure Data Factory Studio.
  4. Click the “Monitor” icon in the left-hand pane.
  5. Click “Pipeline runs” in the contents section on the left-hand side.
  6. Locate the failed pipeline run and click the “Error” icon next to its “Failed” status to review the specific error information.
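Because ORA-12203 means the self-hosted Integration Runtime machine could not reach the Oracle listener at all, a quick TCP reachability test run from that machine can rule out network or firewall issues before you dig further into the portal. A minimal sketch in Python, where the host name and port are placeholders for your own Oracle server and listener port:

```python
import socket

def can_reach(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# "oracledb.example.com" and 1521 are placeholders; substitute your
# Oracle server's hostname and its listener port.
# can_reach("oracledb.example.com", 1521)
```

If this returns False from the Integration Runtime machine, the problem is connectivity (DNS, firewall, or listener) rather than anything in the Azure Data Factory configuration.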

 

 

Resolution:

The following error information is one example of an issue that can cause the metadata pipeline to fail. It indicates that the Java Runtime Environment needs to be installed on the machine where the self-hosted Integration Runtime is running, so that the runtime can create parquet files:

Operation on target TIMEXTENDER TRANSFER DATA LOOP failed: Activity failed because an inner activity failed; Inner activity name: Copy Table, Error: ErrorCode=JreNotFound,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Java Runtime Environment cannot be found on the Self-hosted Integration Runtime machine. It is required for parsing or writing to Parquet/ORC files. Make sure Java Runtime Environment has been installed on the Self-hosted Integration Runtime machine.,Source=Microsoft.DataTransfer.Common,''Type=System.DllNotFoundException,Message=Unable to load DLL 'jvm.dll': The specified module could not be found. (Exception from HRESULT: 0x8007007E),Source=Microsoft.DataTransfer.Richfile.HiveOrcBridge,'

The Java Runtime Environment can be downloaded from the following link:

https://www.java.com/en/download/

 

After installing the Java Runtime Environment, execute the Metadata Transfer task again and verify that it can now complete successfully.
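The “Unable to load DLL 'jvm.dll'” detail in the error above means the Integration Runtime could not locate a JRE, which it typically finds via the registry or the JAVA_HOME environment variable. The sketch below (an illustrative check, not part of any TimeXtender or Microsoft tooling) searches JAVA_HOME for jvm.dll to confirm the installation is discoverable:

```python
import os
from pathlib import Path
from typing import Optional

def find_jvm_dll() -> Optional[Path]:
    """Search the JAVA_HOME directory tree for jvm.dll, the library the
    self-hosted Integration Runtime loads when writing Parquet files."""
    java_home = os.environ.get("JAVA_HOME")
    if not java_home or not Path(java_home).is_dir():
        return None
    matches = sorted(Path(java_home).rglob("jvm.dll"))
    return matches[0] if matches else None
```

If this returns None after installing the JRE, set JAVA_HOME to the JRE installation folder and restart the Integration Runtime service so it picks up the new environment variable.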

If you encounter a different error and need to troubleshoot your setup further, the following steps explain how to get more detailed information about a pipeline run.

Begin by clicking on the pipeline GUID name and then clicking on the “eyeglasses” icon next to each section of the run.

 

Clicking the eyeglasses icon displays the detailed input and output information for that part of the run.

Note that the name of the integration runtime being used is visible here, so you can confirm that this integration runtime is set up properly in both the Azure and TimeXtender portals.

The Integration Runtime will also need to be associated with a linked service that connects to the Oracle database server.

You may need to verify the following regarding the Oracle linked service:

  1. It is associated with the correct integration runtime.
  2. The authentication credentials entered are correct.
  3. A Test Connection completes successfully.
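For reference, an Oracle linked service that routes through a self-hosted Integration Runtime generally has the following shape (a hedged illustration based on the legacy Oracle connector's JSON format; all names and connection-string values are placeholders to replace with your own):

```json
{
  "name": "OracleLinkedService",
  "properties": {
    "type": "Oracle",
    "typeProperties": {
      "connectionString": "Host=<hostname>;Port=1521;Sid=<sid>;User Id=<username>;Password=<password>;"
    },
    "connectVia": {
      "referenceName": "<SelfHostedIntegrationRuntimeName>",
      "type": "IntegrationRuntimeReference"
    }
  }
}
```

The "connectVia" reference is what ties the linked service to the integration runtime, so a mismatch there is a common cause of the connection failures described above.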

If the Test Connection is not successful, review the following two KB Articles for more information on the setup of Azure Data Factory and Oracle Data Sources:

Use Azure Data Factory for Data Movement 

TimeXtender Oracle Data Source

 
