TimeXtender API Endpoints


This article describes the various endpoints within the TimeXtender API and how to send requests using the TimeXtender Postman collection. To use the collection, download and install Postman, download the collection, and then import it into Postman.


Prerequisites

The API key needs to be set up before the TimeXtender API endpoints can be used:

  1. In Postman, select the imported Public - Jobs collection in the left sidebar
  2. Select the Variables tab
  3. Enter the API key in the Current value field for the apiKey variable (for more info on how to create an API key, see API Key Management)

Once entered, the apiKey variable applies to the entire collection, since it is referenced in the request headers of the various calls. Alternatively, the API key can be entered manually in the request headers of individual calls. For example, the {{apiKey}} variable in the job status request can be replaced with the actual API key.
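As a rough sketch of what Postman sends after expanding {{apiKey}}, the following Python builds the equivalent request by hand. The domain, the key, and the x-api-key header name are all placeholder assumptions here, not values taken from the collection:

```python
import urllib.request

# Placeholder values -- substitute your own (these are assumptions,
# not real TimeXtender values).
DOMAIN = "api.example.com"   # the Domain collection variable
API_KEY = "my-secret-key"    # the apiKey collection variable

def build_request(path: str, method: str = "GET") -> urllib.request.Request:
    """Build a request with the API key in the headers, mirroring what
    Postman does when it expands {{apiKey}} in the request headers."""
    return urllib.request.Request(
        url=f"https://{DOMAIN}{path}",
        method=method,
        # The header name is an assumption; use whichever header the
        # collection's requests actually reference {{apiKey}} in.
        headers={"x-api-key": API_KEY},
    )

req = build_request("/public/jobs")
```

Sending the request (with `urllib.request.urlopen(req)`) then works the same as pressing Send in Postman.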


The following variables are included in the Postman collection:

  • ApiKey - the API key that you have created in the Portal for your organization
  • JobId - the Id of the job you want to execute, check the status of, etc. Use the "Get all jobs" endpoint to retrieve the Ids of your jobs
  • TriggerId - the trigger Id of a specific execution of a job. This Id is returned when running a POST call to execute a job
  • Domain - the base domain of the TimeXtender API gateway serving the public API

As mentioned in the Prerequisites section above, the ApiKey variable must be set in the Variables section in Postman. For some endpoints, the JobId and TriggerId variables also need to be set by modifying their Current value in the Variables section.

TimeXtender API Endpoints

Get All Jobs Endpoint

GET /public/jobs

This endpoint retrieves a list of all jobs for your organization. Each job object contains the following properties:

  • Id (guid) - the Id of the job. This will be used for most of the remaining endpoints as an identifier
  • Name (string)  - the name of the job
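Outside of Postman, the same call can be scripted. A minimal sketch of parsing the response, assuming the body is a JSON array of {Id, Name} objects as described above (the sample body is made up):

```python
import json

def parse_jobs(body: str) -> dict:
    """Map job Name -> Id from a Get All Jobs response body, so an Id
    can be looked up by name for use with the other endpoints."""
    return {job["Name"]: job["Id"] for job in json.loads(body)}

# Made-up example response body:
sample = '[{"Id": "11111111-1111-1111-1111-111111111111", "Name": "Nightly load"}]'
```

`parse_jobs(sample)` then gives a simple name-to-Id lookup table for the endpoints below.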

Execute Job Endpoint

POST /public/jobs/{job_id}/execute

This endpoint queues a job for execution by setting its status to "pending". The job is then picked up by the application, which starts the execution and begins writing execution logs.
If successful, an object will be returned with the following properties:

  • JobId (guid) - the Id of the job which has been queued to execute.

  • Message (string) - a message explaining that the job has been started.

  • TriggerId (guid) - a trigger id is unique to an execution and will be attached to the created logs once the job has been started. It can be used to filter logs.
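A sketch of kicking off a run and keeping the TriggerId for later log filtering. The header name and the sample response body are assumptions:

```python
import json
import urllib.request

def build_execute_request(domain: str, api_key: str, job_id: str) -> urllib.request.Request:
    """POST /public/jobs/{job_id}/execute with the API key header
    (the header name is an assumption)."""
    return urllib.request.Request(
        url=f"https://{domain}/public/jobs/{job_id}/execute",
        method="POST",
        headers={"x-api-key": api_key},
    )

def extract_trigger_id(response_body: str) -> str:
    """Pull the TriggerId from the execute response; it identifies this
    particular run when filtering logs later."""
    return json.loads(response_body)["TriggerId"]
```

Keeping the TriggerId around is what enables the trigger-filtered logs endpoint described further down.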

Get Job Status Endpoint

GET /public/jobs/{job_id}/status

This endpoint can be used to check the current status of a job. The following status codes exist:

  • 0 - Idle
  • 1 - Pending
  • 2 - Running

A job will be "idle" until it has been executed. It will then go to status "pending". Once picked up by the application, it will change to status "running". When the execution finishes (or fails), the job will return to status "idle".
The status endpoint will return an object with the following properties:

  • JobId (guid) - the Id of the requested job.
  • Status (int) - the status code.
  • Message (string) - a message explaining what the status code means.
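The idle → pending → running → idle lifecycle means a caller can wait for a run to finish by polling until the status returns to 0. A sketch, where the status-fetching callable (e.g. one wrapping a GET to this endpoint) is left to the caller:

```python
import time

# Status codes from the list above.
STATUS_NAMES = {0: "Idle", 1: "Pending", 2: "Running"}

def wait_until_idle(fetch_status, poll_seconds=5.0, max_polls=60):
    """Poll a callable that returns the job's status code until the job
    is back to Idle (0). Returns True if it went idle within max_polls,
    False if polling timed out."""
    for _ in range(max_polls):
        if fetch_status() == 0:
            return True
        time.sleep(poll_seconds)
    return False
```

Note that "idle" covers both successful and failed runs; the logs endpoints below are needed to tell the two apart.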

Get All Job Statuses Endpoint

GET /public/jobs/status

This endpoint will return a list of statuses for all jobs in an organization. The status objects in the list will look identical to the one from the Get Job Status endpoint.
The all jobs statuses endpoint will return an array of objects with the following properties:

  • JobId (guid) - the Id of the requested job.
  • Status (int) - the status code.
  • Message (string) - a message explaining what the status code means.
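Since each status object carries a JobId, a single call to this endpoint can drive a small dashboard. A sketch that picks out the currently running jobs (the sample body is made up):

```python
import json

def running_job_ids(body: str) -> list:
    """Return the JobIds with status 2 (Running) from a Get All Job
    Statuses response body (a JSON array of {JobId, Status, Message})."""
    return [s["JobId"] for s in json.loads(body) if s["Status"] == 2]

# Made-up example response body:
sample = ('[{"JobId": "a", "Status": 0, "Message": "Job is idle"},'
          ' {"JobId": "b", "Status": 2, "Message": "Job is running"}]')
```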

Get Job Logs Endpoint

GET /public/jobs/{job_id}/logs

This endpoint returns an array of job executions, including their logs, for a specific job.
Each job execution has the following properties:

  • Id (guid) - the Id of the job execution.
  • JobId (guid) - the Id of the job that has been executed.
  • State (int) - the state of the execution.
    • The following states exist:
      • -1 - none
      • 0 - running
      • 1 - completed
      • 2 - failed
  • CreateTime (string) - the create time of the execution.
  • EndTime (string) - the end time of the execution (NULL if the execution is still running)
  • TriggerId (guid) - the trigger id of the execution (empty guid if started from the application)
  • JobExecutionLogs (array) - an array of job execution logs.

Each job execution log has the following properties:

  • Id (guid) - the Id of the log.
  • JobExecutionId (guid) - the Id of the job execution which the log pertains to.
  • TimeStamp (string) - a timestamp of when the log was created.
  • Severity (int) - the severity of the log.
  • Message (string) - a message explaining what has happened in the log.
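For monitoring (for example, loading logs into a reporting table), the nested executions-with-logs shape can be flattened into rows. A sketch, exercised with a made-up sample body:

```python
import json

def flatten_logs(body: str) -> list:
    """Flatten a Get Job Logs response into (TimeStamp, Severity,
    Message) rows across all executions of the job."""
    rows = []
    for execution in json.loads(body):
        for log in execution["JobExecutionLogs"]:
            rows.append((log["TimeStamp"], log["Severity"], log["Message"]))
    return rows
```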

Get Job Logs for a Trigger Id Endpoint

GET /public/jobs/{job_id}/logs/{trigger_id}

This endpoint returns a similar result as the Get Job Logs endpoint, except it will only return logs with a trigger Id matching the one given as a route parameter. In other words, this endpoint allows filtering of logs based on a trigger Id.
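Combined with the TriggerId returned by the execute endpoint, this gives an execute-then-fetch-this-run's-logs flow. A sketch of the route construction:

```python
def trigger_logs_url(domain: str, job_id: str, trigger_id: str) -> str:
    """Build the route for logs filtered to a single run; trigger_id is
    the TriggerId that was returned when the job was executed."""
    return f"https://{domain}/public/jobs/{job_id}/logs/{trigger_id}"
```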



6 replies


This is great and exactly what we have been waiting for! The Get Job Logs Endpoint will be especially useful for us since we have our own monitoring app in Power BI using this info.


Hi @Christian Hauggaard ,

I would like to create a table in TimeXtender that contains the last date and time a data source in the ODX Server was successfully loaded.

I'm looking for an API endpoint that can give me the last successful data transfer per data source in the ODX Server. The Job Logs endpoint gives me some insights, but I cannot figure out which data source/data transfer task a Job Log belongs to.


@bas.hopstaken the Job Logs Endpoint refers to the name of the transfer task in the message (although not the data source). So if you extract this data, you should be able to find out which transfer task completed successfully and when. As a workaround, if you name the transfer task so that it contains the data source name, then you should be able to tell which data source the transfer task belongs to. Please feel free to submit an idea here.
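A minimal sketch of that workaround, assuming tasks are named "&lt;data source&gt; - transfer" and that the log message quotes the task name (both the naming scheme and the message format here are assumptions, not documented behavior):

```python
import re

def data_source_from_message(message: str):
    """If a transfer task is named "<data source> - transfer" and the
    log message quotes that name, recover the data source name.
    Returns None when no quoted task name is found."""
    match = re.search(r"'([^']+) - transfer'", message)
    return match.group(1) if match else None
```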



@Christian Hauggaard : I will rename my transfer tasks to include the data source name. Nevertheless, it would be great if we could also extract the status/execution logs from the ODX tasks and the MDW/SSL instance Execution Packages using the API.

Hi @Christian Hauggaard ,

I would like to set up a new Data Source in TimeXtender that extracts data from the TimeXtender API endpoints. However, my knowledge of the new version of TimeXtender is quite limited, especially when it comes to configuring REST API data sources in the portal.

I have managed to extract the job logs in Postman in accordance with your guide. I would like to extract the same info directly in TimeXtender.

Could you provide me with guidance on how this could be accomplished?


Best regards,



Great to hear that the API is live! This will be a welcome development for our V20 clients that have been hesitant about moving to V21 due to a lack of this type of feature.

Now, when are endpoints coming for the Catalog? Another demand for our clients is to be able to surface the design and lineage information outside of TimeXtender so that it can be shared with their end user community. A set of endpoints to surface that information would be amazing!