Job Objective
Read data directly from one or more external flat files and write it to a database table.
This job represents a special case of high-speed loading in which the destination table is already populated, or has join indexes or other restrictions that prevent the Load operator from accessing it. Because of this, the job adds an intermediate step: it first loads the data into an empty staging table with the Load operator, then uses the DDL operator with INSERT…SELECT to move the data into the final destination table.
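In script terms, the job runs as a sequence of steps. The following is a minimal sketch of that flow; the table, column, and operator names (STAGING_TABLE, TARGET_TABLE, DDL_OPERATOR, LOAD_OPERATOR, FILE_READER) are illustrative placeholders, not the names used in the actual PTS00013 script:

    STEP setup_tables
    (
      APPLY
        ('DROP TABLE STAGING_TABLE;'),
        ('CREATE TABLE STAGING_TABLE (Assoc_Id VARCHAR(10), Item_Descr VARCHAR(30));')
      TO OPERATOR (DDL_OPERATOR);  /* DDL operator prepares an empty staging table */
    );

    STEP load_staging_table
    (
      APPLY
        ('INSERT INTO STAGING_TABLE (:Assoc_Id, :Item_Descr);')
      TO OPERATOR (LOAD_OPERATOR)            /* high-speed load into the empty staging table */
      SELECT * FROM OPERATOR (FILE_READER);  /* DataConnector reads the external flat files */
    );

    STEP move_to_target
    (
      APPLY
        ('INSERT INTO TARGET_TABLE SELECT * FROM STAGING_TABLE;'),
        ('DROP TABLE STAGING_TABLE;')
      TO OPERATOR (DDL_OPERATOR);  /* INSERT...SELECT into the restricted target, then clean up */
    );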
Data Flow Diagram
The following figure shows a flow diagram of the elements of this job example.
Job Example PTS00013 – Mini-Batch Loading
Sample Script
For the sample script that corresponds to this job, see the following script in the sample/userguide directory:
PTS00013: Mini-Batch Loading into Teradata Database Tables
Rationale
This job uses:
- DDL operator because it can DROP/CREATE the staging and target tables before loading, DROP unneeded tables at the conclusion of the job, and load the production table from the staging table using INSERT…SELECT (the operator definitions are sketched after this list).
- DataConnector operator because it is the only producer operator that can read data from external flat files, either directly or through the Named Pipes access module.
- Load operator because it is the consumer operator that offers the best performance for high-speed writing of a large number of rows into an empty database table.
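To make the rationale concrete, the operator definitions behind the steps shown earlier might look as follows. This is a sketch under assumed values: the @jobvar_* job variables, the FLAT_FILE_SCHEMA name, and the file and table names are placeholders; consult the PTS00013 sample script for the actual definitions:

    DEFINE OPERATOR DDL_OPERATOR
    TYPE DDL
    ATTRIBUTES
    (
      VARCHAR TdpId        = @jobvar_tdpid,
      VARCHAR UserName     = @jobvar_username,
      VARCHAR UserPassword = @jobvar_password,
      VARCHAR ErrorList    = '3807'   /* tolerate "table does not exist" on the initial DROP */
    );

    DEFINE OPERATOR FILE_READER
    TYPE DATACONNECTOR PRODUCER
    SCHEMA FLAT_FILE_SCHEMA
    ATTRIBUTES
    (
      VARCHAR DirectoryPath = @jobvar_datadir,
      VARCHAR FileName      = 'flatfile*.dat',  /* a wildcard picks up multiple flat files */
      VARCHAR Format        = 'Delimited',
      VARCHAR TextDelimiter = '|',
      VARCHAR OpenMode      = 'Read'
    );

    DEFINE OPERATOR LOAD_OPERATOR
    TYPE LOAD
    SCHEMA *
    ATTRIBUTES
    (
      VARCHAR TdpId        = @jobvar_tdpid,
      VARCHAR UserName     = @jobvar_username,
      VARCHAR UserPassword = @jobvar_password,
      VARCHAR TargetTable  = 'STAGING_TABLE',   /* the Load operator targets only the empty staging table */
      VARCHAR LogTable     = 'STAGING_TABLE_log'
    );

A job assembled from these definitions and steps is typically launched with the tbuild command, for example: tbuild -f <script file> -v <job variables file>.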