Receive Metrics Jobs


Description

The Receive Metrics Jobs resource is one of a set of data management tools, built on the Data Ingestion API framework, that allow users on SaaS deployments to transfer utilization data into Densify.

The Data Ingestion API framework allows you to customize and extend data transfer capabilities (such as transferring metrics, configuration, attributes, or commands) for Densify SaaS deployments within a secure job execution framework. The framework encapsulates a custom data management tool as a Job entity in Densify, which you can create, list, schedule, execute, and delete. When you schedule the Densify job for execution, the custom data management tool is invoked. The framework also allows you to upload and download job artifacts, such as data or log files.

By default, Densify is installed with the Metrics data management tool, which allows SaaS users to invoke a standard Receive Metrics Jobs resource from the Densify REST API. This resource allows you to upload a metrics file into a Densify job and schedule the job for processing, which transfers the metrics data to the respective services for analysis. You can also use the Receive Metrics Jobs resource to download any result files or logs of the job execution.

Resource

/receive/metrics/jobs/

Supported Operations

Table: Receive Metrics Jobs Supported Operations

Operation

HTTP Method

Request Input Parameters

Response Elements

Description

Create a job

POST /receive/metrics/jobs

Request Body Parameters:

Creates a job in Densify with the support structure for uploading utilization metrics.

If the [name] parameter is not supplied, then the job's name will be set to the job GUID.

See Example: Create a Job.

Get all jobs

GET /receive/metrics/jobs

None

Collection of jobs:

Returns a list of all jobs in Densify.

See Example: Get All Jobs.

Get details of an individual job

GET /receive/metrics/jobs/<jobId>?[lines_to_tail_in_logs=n]

Path Parameter:

Query String Parameter:

Returns the details of the job specified by job ID provided in the request.

See Example: Get Specific Job Details.

Get job input file details

GET /receive/metrics/jobs/<jobId>/input

Path Parameter:

Returns a list of input files for the job.

See Example: Get Job Input File Details.

Get job log file details

GET /receive/metrics/jobs/<jobId>/logs?[lines_to_tail_in_logs=n]

Path Parameter:

Query String Parameter:

Returns a list of log files for the job.

See Example: Get Job Log File Details.

Get job audit details

GET /receive/metrics/jobs/<jobId>/audit_info

Path Parameter:

Returns audit information for the job specified.

See Example: Get Job Audit Information.

Upload a file to a job

POST /receive/metrics/jobs/<jobId>?[execute=true]&[time=HH:MM]

Path Parameter:

Query String Parameters:

Request Body Parameter:

Uploads a file into an existing job.

See Example: Upload a File.

Download files in a job

GET /receive/metrics/jobs/<jobId>/download/files?[file]

Path Parameter:

Query String Parameters:

  • octet-stream zipped file containing the downloaded files

See Download Files.

This request returns all files designated for download of the specified job. You also have the option to download a specific file, if the file name is known.

See Example: Download Files.

Download logs in a job

GET /receive/metrics/jobs/<jobId>/logs/files?[file]

Path Parameter:

Query String Parameters:

  • octet-stream zipped file containing log files

See Download Files.

This request returns all log files of the specified job. You also have the option to download a specific log file if the filename is known.

See Example: Download Logs.

Delete job input files

DELETE /receive/metrics/jobs/<jobId>/contents/input

Path Parameter:

Deletes all input files associated with the specified job.

See Example: Delete Input Files.

Delete job log files

DELETE /receive/metrics/jobs/<jobId>/contents/logs

Path Parameter:

Deletes all log files associated with the specified job.

Use this request to clean up the job log files on the Densify server.

See Example: Delete Log Files.

Delete a job

DELETE /receive/metrics/jobs/<jobId>

Path Parameter:

Deletes all content associated with the job (i.e. input, download, and log files) and removes the job from the scheduled job list.

See Example: Delete Job.

Update parameters of an existing job

PUT /receive/metrics/jobs/<jobId>/parameters

Path Parameter:

Request Body Parameters:

Updates the parameter attributes of an existing job.

See Example: Update Job Parameters.

Parameters

The following is a complete list of possible parameters for the /receive/metrics/jobs endpoint. Path, query string, and/or request body parameters are required depending on the method requested.

Path Parameters

Table: Receive Metrics Jobs - Path Parameters

Parameter Name

Type

Description

jobId

string

Specify the job GUID to identify the job.

Request Body Parameters

Table: Receive Metrics Jobs - Request Body Parameters

Parameter Name

Type

Description

name

(optional)

string

The name of the job.

During job creation, if the name parameter is not set, then name will be automatically set to the job globally unique identifier (GUID).

Example of setting the job name:

   {"name": "my-sample-job"}

If you do not want to specify any parameters for creating a job, you must still provide an empty JSON body element when using the POST operation:

   POST /receive/metrics/jobs

   { }

parameters

(optional, depending on the metrics custom endpoint)

Array of name-value pairs

The job parameters element is an array of "name", "value" pairs that you can provide in the request body.

The "parameters" element needs to be provided in the following JSON format:

"parameters":
[
   {"name": <string>, "value": <string>},
   ...
]

The parameters required during job creation depend on the metrics custom endpoint used. Below is an example of setting optional parameters during job creation:

{
  "name": "My new job",
  "parameters":
  [
    {"name": "JobPriority", "value": "urgent"},
    {"name": "Licence", "value": "true"},
    {"name": "Area", "value": "NewArea"}
  ]
}

In an update parameters request, you must provide the complete list of parameters, including both the updated and the unchanged ones. The list supplied in the update request overwrites the entire existing list of parameters.

Below is an example of the request body for updating the "Area" parameter. Notice that the entire parameters list is provided.

  [
    {"name": "JobPriority", "value": "urgent"},
    {"name": "Licence", "value": "true"},
    {"name": "Area", "value": "west-2b"}
  ]

file

multipart/form-data

To upload a file into an existing job, attach the file into the file form-data key in the body of the POST request.

Query String Parameters

Table: Receive Metrics Jobs - Query String Parameters

Parameter Name

Type

Description

lines_to_tail_in_logs

(optional)

integer

Specify the number of lines from the bottom of the log files to display.

For example, to display the last 100 lines of logs, you would specify the following:

  lines_to_tail_in_logs=100

The default value of -1 denotes that the entire log file is displayed.

execute

(optional)

string

The job execute option specifies whether the job should be executed with the uploaded file.

Possible values for the execute option:

  • true—The job is executed at the scheduled time provided by the time parameter. If time is not set and you specify execute=true, then the job is executed immediately.
  • false—The file is uploaded with no job execution. This is the default behavior if no execute option is specified. You can use the execute=false option to upload multiple files to the job before executing the bulk job.

time

(optional)

string

The job time option is used in conjunction with the execute option to specify when the job is to be executed next.

The time value must be specified in 24-hour HH:MM format.

file

(optional)

string

The file option allows you to download a specific file.

You must specify the exact download filename in order for this operation to succeed. For example, to download a file named "output.txt", the following call is made:

GET /receive/metrics/jobs/455fa7bb-10fb-41a7-96a9-f4b13bd7a05c/download/files?file=output.txt

Here is an example to download a log file named "output.log":

GET /receive/metrics/jobs/455fa7bb-10fb-41a7-96a9-f4b13bd7a05c/logs/files?file=output.log

Response

The following is a complete list of possible response elements that are returned for the /receive/metrics/jobs resource. If the response element does not apply to the API request, then the element is not displayed in the results.

Table: Receive Metrics Jobs Response Schema

Element

Type

Description

jobId

string

The globally unique identifier (GUID) assigned to the job.

name

string

The name of the job.

During job creation, if the [name] parameter is not set, then it will be automatically set to the job globally unique identifier (GUID).

parameters

Array of

  • name
  • value

The job parameters element is an array of "name", "value" pairs that is dependent on the data management tool used. The Metrics tool is the default data management tool for the Receive Metrics Jobs resource.

audit_info

  • audit_name
  • audit_date
  • audit_path
  • target_audit
  • targets_failed
  • audit_end_date
  • load_date
  • load_end_date
  • load_status

If the job has uploaded metric files which have been audited, the last audit details are displayed.

The audit_info element displays the following information:

"audit_info": {
    "audit_name": <string>,
    "audit_date": <string>,
    "audit_path": <string>,
    "target_audit": <int>,
    "targets_failed": <int>,
    "audit_end_date": <string>,
    "load_date": <string>,
    "load_end_date": <string>,
    "load_status": <string>
}

input_files

Array of:

  • name
  • size

If the job has uploaded files, details of those files are displayed:

"input_files": [
    {
        "name": <string>,  // name of the metrics file
        "size": <int>      // size of the file, in bytes
    },
    ...
]

lines_to_tail_in_logs

integer

Specifies the number of lines from the bottom of the log file to display.

The default value of -1 denotes that the entire log file is displayed.

logs

  • name
  • contents

Contains the details of each file in the job's log directory. For each log file available, the following information is displayed:

  • name—displays the log filename;
  • contents—displays the tail end contents of the log file, depending on the lines_to_tail_in_logs parameter.

job_status

  • code
  • message
  • files

An element that provides the status of the executed Metrics job operation, which contains the following items:

  • code—Displays the status code from the statuscode.txt file in the job status folder. If the statuscode.txt file does not exist, then the last audit load code is displayed:
    • 0—successful operation;
    • -1—an error with loading the job.
  • message—Displays the contents of the statusmessage.txt file in the job status folder. If the statusmessage.txt file does not exist, then the last audit load status message is displayed (e.g. Loaded, Loaded_Error).
  • files—Displays the details of each non-status file from the job status folder.

The job_status element is displayed in the following form:

{
  "code": <int>,
  "message": <string>,
  "files": [
    {
      "name": <string>,  // filename
      "size": <int>      // size in bytes
    },
    ...
  ]
}

Below is a response example if there are additional non-status files in the job status folder:

"job_status": {
    "code": 0,
    "message": "File Loaded Successfully.",
    "files": [
        {"name": "production_status.zip", "size": 908756},
        {"name": "test_status.zip", "size": 453903}
    ]
}

The default Metrics data management tool produces a statusmessage.txt file, a corresponding statuscode.txt file, and other status files in the job status directory for each job execution. The Data Ingestion API framework allows a custom data management tool to produce its own status message, code, and files by generating files with the exact names (i.e. statusmessage.txt and statuscode.txt) in the custom tool's job status directory.

Note: File names are case sensitive. Both message and code files must exist in the job status directory for the framework to override the job_status element.
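As an illustration of this convention, a custom tool could emit its own status files at the end of each execution. This is a minimal sketch only; the job status directory path is hypothetical:

```python
import pathlib

# Hypothetical job status directory for one execution of a custom tool.
status_dir = pathlib.Path("job_status")
status_dir.mkdir(exist_ok=True)

# File names are case sensitive, and both files must exist for the
# framework to override the job_status element.
(status_dir / "statuscode.txt").write_text("0")  # 0 = successful operation
(status_dir / "statusmessage.txt").write_text("File Loaded Successfully.")
```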

Download Files Response

Download Files

octet-stream zipped file containing files

The response for a file or log download request is an octet-stream zipped file containing all the files produced when executing the job. For a single file download, the zipped file will contain only one file. The suggested filename (in the response Content-Disposition) is in the following format:

  • for a download logs request: {jobId}_logs.zip
  • for a download files request: {jobId}_download.zip

Where {jobId} is the job GUID.

Messages and Error Handling

message

string

A response message for the request from the job_status element.

The job_status: message string can be one of the following:

  • AccessDeniedException: "Access Denied! Unable to create temporary file or directory"—Check file permissions on file system.
  • IngestionJobDirectoryDoesNotExistException—Job file structure does not exist in the system.
  • IngestionJobDirectoryNotReadbleException—Job file structure permission is possibly set to read-only.
  • IngestionJobInvalidArgumentException—Invalid arguments in the file upload request.
  • IOException: "There is not enough space on the disk. IO Exception"—Check disk space.
  • IOException—Any I/O exception that occurred on the system, such as when the specified filename cannot be found.
  • Exception—Any other error that occurred during the processing of the upload request.

status

integer

The status code of the job request.

Possible "status" values are:

  • 200—job action successful (i.e. delete action was successful);
  • 400—the job files are not accessible or permissions are invalid;
  • 404—the job does not exist;
  • 409—the job is currently running, request action canceled;
  • 500—the job is corrupt or any other exception.

HTTP Code 400

HTTP error

Request is malformed. For example, when the time format is invalid, the following error message is displayed:

{
  "message": "Invalid schedule time format <HH:MM>. Schedule task cannot be updated or created.",
  "status": 400
}

HTTP Code 403

HTTP error

"Limit number of active jobs has been reached."

The number of active jobs on the server exceeds the maximum job limit configured in Densify.

HTTP Code 404

HTTP error

The job does not exist in the system.

HTTP Code 500

HTTP error

  • "Working directory does not exist."—The Data Ingestion API root working directory does not exist.
  • "OS error message."—Failed to create the directory structure due to insufficient storage.

Examples

Example: Create a Job

The following example shows you how to create a job:
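A minimal sketch using Python's standard urllib, assuming a hypothetical instance URL (myinstance.densify.com); the request is built but not sent, and authentication headers are omitted:

```python
import json
import urllib.request

BASE = "https://myinstance.densify.com/api/v2"  # hypothetical instance URL

# An empty JSON body ({}) is still required when no parameters are supplied.
body = {"name": "my-sample-job"}
req = urllib.request.Request(
    f"{BASE}/receive/metrics/jobs",
    data=json.dumps(body).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Send with urllib.request.urlopen(req) once authentication is configured.
```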

Example: Get All Jobs

The following example shows you how to retrieve all jobs:
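A sketch of the request, assuming a hypothetical instance URL; authentication is omitted and the request is built but not sent:

```python
import urllib.request

BASE = "https://myinstance.densify.com/api/v2"  # hypothetical instance URL

# A GET with no parameters returns every job in Densify.
req = urllib.request.Request(f"{BASE}/receive/metrics/jobs", method="GET")
```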

Example: Get Specific Job Details

The following example shows you how to retrieve details of a specific job:
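A sketch of the request, assuming a hypothetical instance URL and using the example job GUID from this document; the optional lines_to_tail_in_logs query parameter trims the log output:

```python
import urllib.request

BASE = "https://myinstance.densify.com/api/v2"   # hypothetical instance URL
JOB_ID = "455fa7bb-10fb-41a7-96a9-f4b13bd7a05c"  # example job GUID

# Tail only the last 100 lines of each log file in the response.
req = urllib.request.Request(
    f"{BASE}/receive/metrics/jobs/{JOB_ID}?lines_to_tail_in_logs=100",
    method="GET",
)
```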

Example: Get Job Input File Details

The following example shows you how to retrieve input file details of a specific job:
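A sketch of the request against the job's input subresource, assuming a hypothetical instance URL and an example job GUID:

```python
import urllib.request

BASE = "https://myinstance.densify.com/api/v2"   # hypothetical instance URL
JOB_ID = "455fa7bb-10fb-41a7-96a9-f4b13bd7a05c"  # example job GUID

# Lists the input files (name and size) uploaded to this job.
req = urllib.request.Request(
    f"{BASE}/receive/metrics/jobs/{JOB_ID}/input", method="GET"
)
```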

Example: Get Job Log File Details

The following example shows you how to retrieve log file details for a specific job:
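A sketch of the request against the logs subresource, assuming a hypothetical instance URL and an example job GUID; lines_to_tail_in_logs limits how much of each log is returned:

```python
import urllib.request

BASE = "https://myinstance.densify.com/api/v2"   # hypothetical instance URL
JOB_ID = "455fa7bb-10fb-41a7-96a9-f4b13bd7a05c"  # example job GUID

# Return only the last 50 lines of each log file.
req = urllib.request.Request(
    f"{BASE}/receive/metrics/jobs/{JOB_ID}/logs?lines_to_tail_in_logs=50",
    method="GET",
)
```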

Example: Get Job Audit Information

The following example shows you how to retrieve audit information for a specific job:
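A sketch of the request against the audit_info subresource, assuming a hypothetical instance URL and an example job GUID:

```python
import urllib.request

BASE = "https://myinstance.densify.com/api/v2"   # hypothetical instance URL
JOB_ID = "455fa7bb-10fb-41a7-96a9-f4b13bd7a05c"  # example job GUID

# Returns the last audit details if uploaded metric files have been audited.
req = urllib.request.Request(
    f"{BASE}/receive/metrics/jobs/{JOB_ID}/audit_info", method="GET"
)
```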

Example: Upload a File

The following example shows you how to upload a file to an existing job:
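A sketch of the upload request, assuming a hypothetical instance URL and an example job GUID; the CSV payload and multipart boundary are illustrative, and the request is built but not sent:

```python
import urllib.request
import uuid

BASE = "https://myinstance.densify.com/api/v2"   # hypothetical instance URL
JOB_ID = "455fa7bb-10fb-41a7-96a9-f4b13bd7a05c"  # example job GUID

# Assemble a minimal multipart/form-data body with the file under the
# "file" form-data key, as required by the request body parameters.
boundary = uuid.uuid4().hex
content = b"host,cpu_util\nweb01,42\n"  # illustrative metrics payload
body = (
    f"--{boundary}\r\n"
    'Content-Disposition: form-data; name="file"; filename="metrics.csv"\r\n'
    "Content-Type: text/csv\r\n"
    "\r\n"
).encode() + content + f"\r\n--{boundary}--\r\n".encode()

# execute=true with no time parameter runs the job immediately after upload.
req = urllib.request.Request(
    f"{BASE}/receive/metrics/jobs/{JOB_ID}?execute=true",
    data=body,
    headers={"Content-Type": f"multipart/form-data; boundary={boundary}"},
    method="POST",
)
```

Omitting execute (or passing execute=false) uploads the file without running the job, which is useful for staging multiple files before a bulk execution.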

Example: Download Files

The following example shows you how to download files from an existing job:
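A sketch of a single-file download request, assuming a hypothetical instance URL and an example job GUID; output.txt stands in for a known download filename:

```python
import urllib.request

BASE = "https://myinstance.densify.com/api/v2"   # hypothetical instance URL
JOB_ID = "455fa7bb-10fb-41a7-96a9-f4b13bd7a05c"  # example job GUID

# Without the file option, all files designated for download are returned
# in one zipped octet-stream; file=output.txt narrows it to a single file.
req = urllib.request.Request(
    f"{BASE}/receive/metrics/jobs/{JOB_ID}/download/files?file=output.txt",
    method="GET",
)
# Once sent, the zipped response body can be saved as {jobId}_download.zip.
```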

Example: Download Logs

The following example shows you how to download log files from an existing job:
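A sketch of a log download request, assuming a hypothetical instance URL and an example job GUID; output.log stands in for a known log filename:

```python
import urllib.request

BASE = "https://myinstance.densify.com/api/v2"   # hypothetical instance URL
JOB_ID = "455fa7bb-10fb-41a7-96a9-f4b13bd7a05c"  # example job GUID

# Omit the file option to receive all log files as one zipped octet-stream.
req = urllib.request.Request(
    f"{BASE}/receive/metrics/jobs/{JOB_ID}/logs/files?file=output.log",
    method="GET",
)
# Once sent, the zipped response body can be saved as {jobId}_logs.zip.
```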

Example: Delete Input Files

The following example shows you how to delete the input files from an existing job:
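A sketch of the request, assuming a hypothetical instance URL and an example job GUID:

```python
import urllib.request

BASE = "https://myinstance.densify.com/api/v2"   # hypothetical instance URL
JOB_ID = "455fa7bb-10fb-41a7-96a9-f4b13bd7a05c"  # example job GUID

# Removes all input files for the job without deleting the job itself.
req = urllib.request.Request(
    f"{BASE}/receive/metrics/jobs/{JOB_ID}/contents/input", method="DELETE"
)
```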

Example: Delete Log Files

The following example shows you how to delete log files from an existing job:
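A sketch of the request, assuming a hypothetical instance URL and an example job GUID:

```python
import urllib.request

BASE = "https://myinstance.densify.com/api/v2"   # hypothetical instance URL
JOB_ID = "455fa7bb-10fb-41a7-96a9-f4b13bd7a05c"  # example job GUID

# Cleans up the job's log files on the Densify server.
req = urllib.request.Request(
    f"{BASE}/receive/metrics/jobs/{JOB_ID}/contents/logs", method="DELETE"
)
```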

Example: Delete Job

The following example shows you how to delete a job:
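A sketch of the request, assuming a hypothetical instance URL and an example job GUID:

```python
import urllib.request

BASE = "https://myinstance.densify.com/api/v2"   # hypothetical instance URL
JOB_ID = "455fa7bb-10fb-41a7-96a9-f4b13bd7a05c"  # example job GUID

# Deletes all job content (input, download, and log files) and removes
# the job from the scheduled job list.
req = urllib.request.Request(
    f"{BASE}/receive/metrics/jobs/{JOB_ID}", method="DELETE"
)
```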

Example: Update Job Parameters

The following example shows you how to update parameters for an existing job:
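A sketch of the update request, assuming a hypothetical instance URL and an example job GUID; note that the body carries the complete parameter list, since it overwrites the existing one:

```python
import json
import urllib.request

BASE = "https://myinstance.densify.com/api/v2"   # hypothetical instance URL
JOB_ID = "455fa7bb-10fb-41a7-96a9-f4b13bd7a05c"  # example job GUID

# The full list is required: updated and unchanged parameters alike.
params = [
    {"name": "JobPriority", "value": "urgent"},
    {"name": "Licence", "value": "true"},
    {"name": "Area", "value": "west-2b"},
]
req = urllib.request.Request(
    f"{BASE}/receive/metrics/jobs/{JOB_ID}/parameters",
    data=json.dumps(params).encode(),
    headers={"Content-Type": "application/json"},
    method="PUT",
)
```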