15.10 - Evaluating Jobs with Exit Code=0 - Parallel Transporter

Teradata Parallel Transporter User Guide


Evaluating Jobs with Exit Code=0

Even when a job completes with exit code 0, the job logs may contain important information that warrants further action.

Review the Metadata

Teradata PT provides two types of metadata.

  • TWB_STATUS private log captures job performance metadata
  • TWB_SRCTGT private log captures source and target metadata

    TWB_STATUS private log captures job performance data at different stages of the job. Teradata PT also provides a tbuild command option for specifying the interval (in seconds) for collecting performance data. For information about all tbuild options, see the Teradata Parallel Transporter Reference.

    This information is useful for evaluating the performance of a job in terms of throughput and the cost of exporting and loading data by each operator. It is also useful for capacity planning: collect the performance data over a period of time, summarize the CPU utilization and elapsed time for each job, and then determine the performance trend of the overall loading and exporting processes for a specific system configuration.
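As a minimal sketch of the capacity-planning idea, the following Python fragment aggregates hypothetical per-job CPU and elapsed-time figures; the record layout and field names are assumptions for illustration, not the actual TWB_STATUS log format.

```python
from statistics import mean

# Hypothetical per-job figures collected over several weeks; the field
# names are illustrative only -- the real TWB_STATUS layout differs.
jobs = [
    {"week": 1, "cpu_seconds": 120.0, "elapsed_seconds": 300.0},
    {"week": 2, "cpu_seconds": 150.0, "elapsed_seconds": 420.0},
    {"week": 3, "cpu_seconds": 180.0, "elapsed_seconds": 610.0},
]

# Summarize CPU utilization (CPU time / elapsed time) per week. A ratio
# trending downward suggests the jobs are increasingly waiting on
# something other than CPU (network, I/O, or the database server).
for job in jobs:
    utilization = job["cpu_seconds"] / job["elapsed_seconds"]
    print(f"week {job['week']}: CPU utilization {utilization:.0%}")

avg = mean(j["cpu_seconds"] / j["elapsed_seconds"] for j in jobs)
print(f"average utilization: {avg:.0%}")
```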


    Here are some tips for performance evaluations and tuning:

  • Determine the difference in CPU utilization between the producer and consumer operators. For example, if the CPU utilization of the producer operator is 2 times greater than that of the consumer operator, increasing the number of producer instances by a factor of 2 might improve the throughput of the job.
  • Determine the difference between the CPU utilization and the elapsed time for exporting and loading the data (i.e. the EXECUTE method). If the elapsed time is much higher than the CPU time, this could indicate a bottleneck in the network, the I/O system, or the Teradata Database server.
  • Find out how many rows were sent by the producer operator (or received by the consumer operator) with the above CPU utilization. Dividing the number of rows by the CPU seconds spent processing them gives you the number of rows per CPU second.
  • The difference between the “start time” of two successive methods would indicate how long the job spent on a method.
  • Find out how much time is being spent on each checkpoint. Note that checkpoints take time and resources to process, so it may be necessary to tune the number of checkpoints taken by changing the checkpoint interval.
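The arithmetic behind the first three tips can be sketched as follows; the input values stand in for figures read from a job's TWB_STATUS log, and the variable names are illustrative, not actual log fields.

```python
# Hypothetical figures taken from a job's TWB_STATUS private log.
producer_cpu = 40.0      # CPU seconds used by the producer operator
consumer_cpu = 20.0      # CPU seconds used by the consumer operator
elapsed = 180.0          # elapsed seconds of the EXECUTE method
rows_sent = 4_000_000    # rows sent by the producer operator

# Tip 1: producer/consumer CPU ratio suggests how to scale instances.
# A ratio of 2.0 suggests doubling the producer instances.
ratio = producer_cpu / consumer_cpu
print(f"producer/consumer CPU ratio: {ratio:.1f}")

# Tip 2: elapsed time far above total CPU time hints at a network,
# I/O, or database bottleneck rather than a CPU-bound job.
total_cpu = producer_cpu + consumer_cpu
print(f"elapsed/CPU ratio: {elapsed / total_cpu:.1f}")

# Tip 3: throughput expressed as rows per CPU second.
print(f"rows per CPU second: {rows_sent / total_cpu:,.0f}")
```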

    The source and target data shown in the TWB_SRCTGT log is for reference only, and requires no specific usage strategy.

Review the Warnings

    Check for any minor warnings that may appear in the logs to see if further action is required, as shown in the following examples:

  • The DDL operator may encounter database errors that the ErrorList attribute is set to ignore; in that case it returns a warning instead of an error and allows the job to continue executing.
  • Action: Review the warnings and associated errors. Determine whether or not ignoring the error is achieving the results you expected. Reset the ErrorList attribute if required.

  • The OS Command operator may not have been able to execute one or more of the commands requested of it.
  • Action: Review the error message output and correct the problems as you would any operating system error messages. If the OS Command operator's IgnoreError attribute was set to Yes, command errors would not have terminated the job. In that case, check the logs for OS Command operator error messages and, if any are present, determine whether later job steps were adversely affected by commands that did not execute successfully.
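A check like this can be done mechanically once the job log is captured to a file. The sketch below uses an invented log format and message text; match on whatever your actual job log emits.

```python
# Scan captured job-log lines for OS Command operator failures so the
# affected steps can be reviewed. The message text here is invented
# for illustration -- the real log wording differs.
log_lines = [
    "Step1: DDL operator completed",
    "Step2: OS Command operator: command 'rm /tmp/stage.dat' failed",
    "Step3: Load operator completed",
]

suspect = [
    line for line in log_lines
    if "OS Command operator" in line and "failed" in line
]
for line in suspect:
    print("review:", line)
```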

Allowed Errors

    When data is being written to the Teradata Database, consumer operators can be set to allow the job to proceed even if some data cannot be loaded, using the ErrorLimit attribute. This attribute applies to the following operators:

  • Load
  • Stream
  • Update
    Cause:

    There may be various reasons why the data did not load, but it is often due to violations of the schema or data type requirements when the data was originally entered into the source files.
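As an illustrative fragment only (the table names and limit value are placeholders, and the full operator definition is abbreviated), ErrorLimit is set in the operator's ATTRIBUTES clause; the job terminates once the number of rejected rows exceeds the limit:

```
DEFINE OPERATOR LOAD_OPERATOR
TYPE LOAD
SCHEMA *
ATTRIBUTES
(
  VARCHAR TargetTable = 'Target_Table',     /* placeholder table name */
  VARCHAR ErrorTable1 = 'Target_Table_ET',  /* placeholder error table */
  VARCHAR ErrorTable2 = 'Target_Table_UV',  /* placeholder error table */
  INTEGER ErrorLimit  = 1000                /* allow up to 1000 rejected rows */
);
```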

    For more information, refer to the sections on Load, Stream, and Update operator errors in Chapter 11: “Troubleshooting.”

    Corrective Action:

  • Examine the error tables for the operators to determine whether or not they contain any unprocessed data.
  • Determine the reason the data did not load.
  • Consider whether or not to correct the data errors in the source.
  • In most cases, you will need to clean up the bad data and load it into Teradata Database with a separate job.
  • Consider whether or not to reset the ErrorLimit attribute to a lower value.