IMPORT

Basic Teradata Query Reference
Teradata Tools and Utilities
May 2017
English (United States)


Opens a mainframe- or workstation-attached system file, of the specified format, to provide data for the USING modifiers of subsequent SQL statements.
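A minimal workstation-attached example may help illustrate the relationship between IMPORT and a USING clause. The file, table, and column names below are hypothetical:

```
.IMPORT DATA FILE = employee.dat
USING (EmpNo INTEGER, Name VARCHAR(30))
INSERT INTO Employee (EmpNo, Name)
VALUES (:EmpNo, :Name);
```

Each field named in the USING clause is filled from the next record read from the import file and is referenced in the request through the corresponding :name placeholder.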


The following figures show the syntax for both mainframe-attached and workstation-attached systems.

where the following is true:

The default field delimiter character is a broken vertical bar: its ASCII equivalent is 0x7C, its TD_EBCDIC equivalent is 0x6A, its UTF-8 equivalent is 0x7C, and its UTF-16BE equivalent is 0x007C.
This element specifies that data is to be imported using an access module. This option is valid only for DATA and INDICDATA types on workstation-attached systems.

When an import file is opened, BTEQ sets one or more of the following attributes for the Access Module, depending on the session character set used.

This element is an optional field delimiter character, which must be enclosed in single quotation marks.
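For example, to read a comma-delimited VARTEXT file instead of accepting the default broken vertical bar (the file name is hypothetical):

```
.IMPORT VARTEXT ',' FILE = contacts.csv
```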
The format of the data to be read by BTEQ.

If DATA or INDICDATA is specified, BTEQ requires the incoming data to be in FastLoad format, such as that produced by a BTEQ EXPORT DATA or BTEQ EXPORT INDICDATA command, or by FastExport.
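As a sketch of a matching round trip, data exported with EXPORT DATA can later be re-imported with IMPORT DATA. The table, column, and file names are hypothetical; the USING field descriptions must match the types of the exported columns:

```
.EXPORT DATA FILE = emp.dat
SELECT EmpNo, Name FROM Employee;
.EXPORT RESET

.IMPORT DATA FILE = emp.dat
USING (EmpNo INTEGER, Name CHAR(30))
INSERT INTO EmployeeCopy (EmpNo, Name)
VALUES (:EmpNo, :Name);
```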

The INDICDATA option of the IMPORT command is not the same as the SET INDICATORMODE or SET INDICDATA command.

If the REPORT format is specified, BTEQ requires the incoming data to be in the format produced by a BTEQ EXPORT REPORT command.

The REPORTWIDE keyword is also valid. It specifies that the REPORT format is used for the data, and changes the WIDTH setting to 32765. Because this keyword is deprecated, use a SET WIDTH command instead.
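For example, instead of the deprecated REPORTWIDE keyword, the same effect can be obtained explicitly (the file name is hypothetical):

```
.SET WIDTH 32765
.IMPORT REPORT FILE = report.out
```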

The name of the z/OS JCL DD statement that defines the mainframe-attached system file from which BTEQ reads data.
Use caution when using the keyword FILE instead of the keyword DDNAME for mainframe-attached systems; the implementation follows the DDNAME concept, not the FILE concept.
The number of Large Data Object (LDO) columns which are defined with AS DEFERRED in the USING clause. This number must not include LDO columns defined with AS DEFERRED BY NAME in the USING clause.

The columns defined with AS DEFERRED must be listed first in the USING clause.

The maximum value that can be supplied for the DEFERCOLS option is based on the maximum number of columns that the Teradata Database supports for the USING clause.

The keyword LOBCOLS is an alias for DEFERCOLS; the two are identical in functionality.

To enable potential performance improvements when using Teradata Database version 15.10 or later, BTEQ can transfer an LDO non-deferred even when it is defined with an AS DEFERRED qualifier in the USING clause. The DEFERLIMITS option adjusts the limit settings that control how and when BTEQ attempts to override deferred transfer for a column. It is up to the user to specify the limits that yield the best performance.

This option indicates the limits for sending LDOs to the database via a deferred transfer versus a non-deferred transfer. This option takes two arguments. The first argument defines the limit (in bytes) for an individual LDO. The second argument defines the cumulative limit of all non-deferred LDO bytes for the current record. For an LDO to be sent as a non-deferred transfer it must be within both limits. If either limit is exceeded, a deferred transfer will occur.

Valid values for both DEFERLIMITS arguments are 0..7340032. The second value cannot be less than the first. The maximum value is based on the database's request message limit.

Non-deferred LDO transfers can be turned off completely by using values of "0,0".

If this option is not supplied, but the DEFERCOLS option is defined, then the following default values will be used for DEFERLIMITS:
  • 0,0 – when a PACK factor exists which is greater than one. This disables non-deferred LDO transfers so that the average number of packed records is not affected.
  • 1000,10000 – when a PACK factor is zero or one. For a non-deferred LDO transfer to occur, its size must be 1000 bytes or less, and the total number of bytes for all non-deferred LDO transfers (including itself) must be 10000 bytes or less in a given record.
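Under these rules, a hypothetical IMPORT that allows small LDOs to travel non-deferred might look like the following. The limits, column counts, and names are illustrative only; note that the AS DEFERRED column is listed first in the USING clause, as required:

```
.IMPORT INDICDATA DEFERCOLS=1 DEFERLIMITS=1000,10000 TOTCOLS=3 FILE=photos.dat
USING (Photo BLOB AS DEFERRED, Id INTEGER, Caption VARCHAR(50))
INSERT INTO Photos (Id, Caption, Photo)
VALUES (:Id, :Caption, :Photo);
```

With these limits, a Photo value of 1000 bytes or less is sent non-deferred, provided the cumulative non-deferred total for the record stays within 10000 bytes; larger values fall back to a deferred transfer.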

When DEFERLIMITS is used, the DEFERCOLS option is also required and must be specified first in the IMPORT command.

The name of the workstation-attached system file from which BTEQ reads data.

If the name of the file includes a comma, semicolon, or space character, enclose the entire file name in either single or double quotation marks.

File names are case-sensitive on UNIX systems, and are not case-sensitive under Windows.

In the Windows environment, whenever other command options follow the file name, BTEQ requires the file name to be enclosed in either single or double quotation marks.
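For example, on Windows a file name followed by additional options must be quoted (the path is hypothetical):

```
.IMPORT VARTEXT ',' FILE = "C:\My Data\input.csv", SKIP = 1
```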

The keyword DDNAME is a valid substitute for the keyword FILE for workstation-attached systems. Note when doing so that the implementation is for the FILE concept, not the DDNAME concept.

The initialization string used to start the import process.
When used, BTEQ emits messages that indicate which import file records were accepted and which were rejected during the import. The record number included in each message is a 1-based offset representing the record's sequential position within the import file. These messages can therefore be used to locate rejected records that need follow-up actions.

When generation of record messages has been requested for an import and an SQL statement associated with a USING clause is successful, BTEQ will generate an "accepted" format record number message. For example...

*** Insert completed. One row added.
*** Accepted Import Record# 4

But if the statement is instead unsuccessful, BTEQ will generate a "rejected" format record number message. For example...

*** Statement Error 2802 Duplicate row error in MyUser.MyTable.
               Statement# 1, Info =0 
*** Rejected Import Record# 1 

If the statement was unsuccessful due to a request-level failure or error, and more than one import record had been packed into the request's USING data, then all of the packed records must be rejected, and each subsequently has its own record number message generated. For example...

*** Failure 2673 The source parcel length does not match data that was defined.
               Statement# 3, Info =4
*** Rejected Import Record# 6
*** Warning: All packed records for the request were rejected.
             The other record number message(s) follow.
*** Rejected Import Record# 4
*** Rejected Import Record# 5
The access module library name. If the library cannot be found, or an incorrect version is being loaded, use a fully qualified path name, for example, /opt/teradata/client/xx.xx/lib/.
Defines the record size limit for the current import. This overrides the current import-related RECORDLENGTH command setting, for the life of the import.

Valid arguments for the RECORDLENGTH option are:

  • MAX64 – Limits record sizes to 64,768 bytes. For DATA/INDICDATA imports, it is assumed each record within the import file begins with a 2-byte record length.
  • MAX1MB – Limits record sizes to 1,024,000 bytes. For DATA/INDICDATA imports, it is assumed each record within the import file begins with a 4-byte record length.

A DATA/INDICDATA import file must use record length fields of the same size throughout. Mixing records with 2-byte lengths and records with 4-byte lengths will result in unexpected behavior.

This option does not affect REPORT/VARTEXT imports, since they do not involve record lengths.

When exported data is subsequently imported, the RECORDLENGTH option settings must match, otherwise the import will fail.

If this option is not specified, BTEQ will take the value from the current import-related RECORDLENGTH command setting. If the RECORDLENGTH command was not previously submitted, then a default value of MAX64 will be used.

Submit the SHOW CONTROLS IMPORT command to see whether a RECORDLENGTH override is in effect for the current import. Otherwise, submit the SHOW CONTROLS RECORDLENGTH command to see the parent settings.
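As a sketch, the override and the subsequent check could look like this (the file name is hypothetical):

```
.IMPORT DATA RECORDLENGTH=MAX1MB FILE = wide_rows.dat
.SHOW CONTROLS IMPORT
```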

Terminates the import. The import file is closed.
The number of lines skipped from the beginning of the file before reading data. The value is zero (0) by default. The range of valid numbers is 0 to 2147483647.
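For example, to skip a single header line at the top of a hypothetical delimited file:

```
.IMPORT VARTEXT ',' FILE = sales.csv, SKIP = 1
```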
The total number of columns referred to in the request's USING clause, including all LDO and non-LDO columns. The TOTCOLS keyword is valid only for INDICDATA formatted import data records containing LDOs that are defined with AS DEFERRED in the USING clause.

The maximum value that can be supplied for the TOTCOLS option is based on the maximum number of columns that the Teradata Database supports for the USING clause.

The record format of variable length character fields, with each field separated by a delimiter character.

Usage Notes

For mainframe-attached systems, BTEQ supports all sequential files with fixed, variable, or undefined length record formats.

When the record length of data used during an import operation is greater than 252 bytes, BTEQ displays a message similar to the following, indicating that the buffer size has been increased:

Growing buffer to 32000

The DATA and INDICDATA keywords specify the format of the data to be sent to a Teradata Database.

DATA format is used in Record Mode, which does not send explicit null indicator bits. INDICDATA format is used in Indicator Mode, which does send explicit null indicator bits.

For the character sets KANJISJIS_0S, KANJIEUC_0U, TCHBIG5_1R0, and SDTCHBIG5_3R0, the default is Record (DATA) Mode, not Field Mode.

Do not use the IMPORT command in a Teradata SQL macro.

Specifying a file that is still open from a previous EXPORT command is not recommended; unexpected results might occur due to records being written and read at the same time.

It is the user’s responsibility to correctly describe the data records to be imported by way of the SQL request’s USING clauses.
If a USING clause is used in a multi-statement request, it must start the first statement. Otherwise, BTEQ does not recognize the USING clause and does not send the data parcel with the request, causing a database syntax error.
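As a sketch, in a multi-statement request the USING clause precedes the first statement, and each statement after the first begins with a semicolon (all names are hypothetical):

```
.IMPORT DATA FILE = emp.dat
USING (EmpNo INTEGER, DeptNo INTEGER)
INSERT INTO Employee (EmpNo, DeptNo) VALUES (:EmpNo, :DeptNo)
;UPDATE Department SET HeadCount = HeadCount + 1
 WHERE DeptNo = :DeptNo;
```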
By default, a request that includes a USING clause can process up to n records, where n depends on the current values of the REPEAT and PACK factors, provided that the import file is open and contains unread records. Refer to the PACK and REPEAT commands in this chapter for more information.
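A hypothetical loop that consumes every record in an import file, packing up to 10 records per request, might look like this (the file, table, and column names are illustrative only):

```
.SET PACK 10
.IMPORT DATA FILE = emp.dat
.REPEAT *
USING (EmpNo INTEGER, Name VARCHAR(30))
INSERT INTO Employee (EmpNo, Name) VALUES (:EmpNo, :Name);
```

The .REPEAT * command resubmits the following request until the import file is exhausted.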
Use the SHOW CONTROLS command to identify the current record format of the IMPORT command.
For Unicode sessions, the encoding of import files must match the encoding of the session character set.

A single-statement SQL request is iterated once for each USING data import record packed with the request. When record number messages are generated for an iterated request, each resulting statement's message therefore reflects consumption of a different record number. For a multi-statement request, by contrast, the record number messages generated for the set of statements all reflect consumption of the same import record. If the active REPEAT factor is greater than 1 and there are still import records to consume, the request is resubmitted with the next USING data import record; the subsequent statements' messages again all reflect the same record number, but a different record from the one the prior request's messages reflected.

The format and wording of the record number messages are not subject to change, as it is understood that the text may be used as part of post-processing actions.

The maximum value for a record number is 18446744073709551615. After that number has been used to track a record, BTEQ will start over using the number "1". A warning will be produced whenever a rollover occurs.