15.11 - consolidate_job_logs - Data Stream Architecture

Teradata Data Stream Architecture (DSA) User Guide

prodname: Data Stream Architecture
vrm_release: 15.11
created_date: December 2016
category: User Guide
featnum: B035-3150-026K

Purpose

The consolidate_job_logs command uploads all logs for a completed job to a centralized location. If the job is running, the command is rejected.

Syntax

consolidate_job_logs -n|-name Name [-I|-job_execution_id Job Execution ID]

Example

dsc consolidate_job_logs -n job1 -I 5
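
To consolidate logs for the most recent execution of a job, the job execution ID can be omitted; the latest job execution ID is then used (see Parameters). For example, assuming a job named job1 with at least one completed execution:

dsc consolidate_job_logs -n job1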

Parameters

n|name Name
The name of the job whose logs you want to consolidate.
I|job_execution_id Job Execution ID
[Optional] The ID of the job execution whose logs you want to consolidate. You cannot specify a running job execution. If you do not specify a job execution ID, the latest job execution ID is used.

Usage Notes

The job logs that consolidate_job_logs gathers are placed in a subdirectory, jobName_jobExecutionId_currentTimestamp, created under the upload directory defined in the properties file. The command copies into this new directory all DSC log files whose last-modified timestamp is later than the job start time.
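
For example, assuming the upload directory configured in the properties file is /var/opt/teradata/dsa/upload (a hypothetical path) and the example command above is run for job1 with job execution ID 5, the consolidated logs would land in a directory named like the following (the timestamp shown is illustrative):

/var/opt/teradata/dsa/upload/job1_5_20161205103045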

Periodically delete old files from the upload directory; otherwise, consolidated log directories accumulate and consume disk space.
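
As a minimal cleanup sketch, assuming the upload directory is /var/opt/teradata/dsa/upload (substitute the directory configured in your properties file), a standard find command can remove consolidated log directories older than 30 days:

find /var/opt/teradata/dsa/upload -mindepth 1 -maxdepth 1 -type d -mtime +30 -exec rm -r {} +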

XML File Example

This command does not support an XML file.