IMPORT - Basic Teradata Query

Basic Teradata Query Reference

Product
Basic Teradata Query
Release Number
15.10
Language
English (United States)
Last Update
2018-10-07
dita:id
B035-2414
lifecycle
previous
Product Category
Teradata Tools and Utilities

Purpose  

Opens a mainframe- or workstation-attached system file of the specified format to provide data for the USING modifiers of subsequent SQL statements.

Syntax  

For Mainframe-Attached Systems:

For Workstation-Attached Systems:

where:

 

Syntax Element

 

Specification

|

The default field delimiter character is a broken vertical bar whose ASCII equivalent is 0x7C, TD_EBCDIC equivalent is 0x6A, UTF-8 equivalent is 0x7C and UTF-16BE equivalent is 0x007C.

AXSMOD

This element specifies that data is to be imported using an access module. This option is valid only for DATA and INDICDATA types on workstation-attached systems.

When an import file is opened, BTEQ sets one or more of the following attributes for the Access Module, depending on the session character set used.

  • CHARSET_NAME
  • CHARSET_NUMBER
  • BYTE_ORDER
    character

    This element is an optional field delimiter character, which must be enclosed in single quotation marks.

    DATA, INDICDATA, or REPORT

    The format of the data to be read by BTEQ.

    If DATA or INDICDATA is specified, BTEQ requires the incoming data to be in FastLoad format, such as that produced by a BTEQ EXPORT DATA or BTEQ EXPORT INDICDATA command, or by FastExport.

    Note: The INDICDATA keyword of the IMPORT command does not function as a substitute for the SET INDICATORMODE or SET INDICDATA command.

    If the REPORT format is specified, BTEQ requires the incoming data to be in the format produced by a BTEQ EXPORT REPORT command.

    For z/OS, the lines in the import file are truncated to 254 characters by default. The width limit can be increased to 65531 characters using the WIDTH command.

    The REPORTWIDE keyword is also valid. It specifies the REPORT format for the data and changes the WIDTH setting to 32765. Because this keyword is deprecated, use a SET WIDTH command instead.

    See “WIDTH” on page 353.

    ddname

    The name of the z/OS JCL DD statement that defines the mainframe-attached system file from which BTEQ reads data.

    Note: Use caution when using the keyword FILE instead of the keyword DDNAME for mainframe-attached systems, as the implementation is for the concept DDNAME, not the concept FILE.

    DEFERCOLS

    The number of Large Data Object (LDO) columns which are defined with AS DEFERRED in the USING clause. This number must not include LDO columns defined with AS DEFERRED BY NAME in the USING clause.

    The columns defined with AS DEFERRED must be listed first in the USING clause.

    The maximum value that can be supplied for the DEFERCOLS option is based on the maximum number of columns that the Teradata Database supports for the USING clause.

    The keyword LOBCOLS is an alias for DEFERCOLS. Both are identical in functionality.

    DEFERLIMITS

    To enable potential performance improvements when using Teradata Database version 15.10 or later, BTEQ can transfer an LDO non-deferred even when it is defined with an AS DEFERRED qualifier in the USING clause. The DEFERLIMITS option adjusts the limit settings that control how and when BTEQ attempts to override deferred transfer for a column. It is up to the user to specify the limits that yield the best performance.

    This option indicates the limits for sending LDOs to the database via a deferred transfer versus a non-deferred transfer. This option takes two arguments. The first argument defines the limit (in bytes) for an individual LDO. The second argument defines the cumulative limit of all non-deferred LDO bytes for the current row. For an LDO to be sent as a non-deferred transfer it must be within both limits. If either limit is exceeded, a deferred transfer will occur.

    Valid values for both DEFERLIMITS arguments are 0..999900. The second value cannot be less than the first.

    Non-deferred LDO transfers can be turned off completely by using values of "0,0".

    If this option is not supplied, but the DEFERCOLS option is defined, then the following default values will be used for DEFERLIMITS:

    0,0

    when a PACK factor exists which is greater than one. This disables non-deferred LDO transfers so that the average number of packed records is not affected.

    1000,10000

    when a PACK factor is zero or one. For a non-deferred LDO transfer to occur, its size must be 1000 bytes or less, and the total number of bytes for all non-deferred LDO transfers (including itself) must be 10000 bytes or less.

    The DEFERCOLS option is required and must be specified first in the IMPORT command.
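    The decision rule described above can be sketched as follows. This is an illustrative model of the documented behavior, not BTEQ internals; the function name and structure are assumptions.

    ```python
    def plan_transfers(ldo_sizes, single_limit, row_limit):
        """For one import row, label each AS DEFERRED LDO as a
        non-deferred or deferred transfer per the DEFERLIMITS rule."""
        plan = []
        row_total = 0  # cumulative non-deferred LDO bytes for this row
        for size in ldo_sizes:
            # An LDO is sent non-deferred only if it fits BOTH limits;
            # exceeding either limit forces a deferred transfer.
            if size <= single_limit and row_total + size <= row_limit:
                plan.append("non-deferred")
                row_total += size
            else:
                plan.append("deferred")
        return plan

    # DEFERLIMITS=1000,10000 (the default when PACK is 0 or 1)
    print(plan_transfers([800, 900, 9000], 1000, 10000))
    # → ['non-deferred', 'non-deferred', 'deferred']
    ```

    With "0,0" the first test always fails, which is why those values disable non-deferred transfers entirely.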

    filename

    The name of the workstation-attached system file from which BTEQ reads data.

    If the name of the file includes a comma, semicolon, or space character, enclose the entire file name in either single or double quotation marks.

    Note: File names are case-sensitive on UNIX systems and are not case-sensitive under Windows.

    Whenever other command options are specified following the file name in the Windows environment, BTEQ requires the file name to be enclosed by either single or double quotation marks.

    The keyword DDNAME is a valid substitute for the keyword FILE for workstation-attached systems. Note when doing so that the implementation is for the FILE concept, not the DDNAME concept.

    'init-string'

    The optional initialization string passed to the access module when the import process starts.

    modname

    The name of the access module library used to import the data.

    RESET

    Terminates the import. The import file is closed.

    SKIP=n

    The number of lines to skip from the beginning of the file before reading data. The default value is zero (0). The range of valid values is 0 to 2147483647.

    TOTCOLS

    The total number of columns referred to in the request's USING clause. This includes all LDO and non-LDO columns. The TOTCOLS keyword is valid only for INDICDATA-formatted import data records containing LDOs which are defined with AS DEFERRED in the USING clause.

    The maximum value that can be supplied for the TOTCOLS option is based on the maximum number of columns that the Teradata Database supports for the USING clause.

    VARTEXT

    The record format of variable-length character fields, with each field separated by a delimiter character.

    Usage Notes  

    For mainframe-attached systems, BTEQ supports all sequential files with fixed, variable, or undefined length record formats.

    When the record length of data used during an import operation is greater than 252 bytes, BTEQ displays the following message, indicating that the buffer size has been increased:

    Growing buffer to 32000

    The actual buffer size is 32006, where the last six bytes are used for the parcel type flavor (two bytes) and parcel length (four bytes).

    The DATA and INDICDATA keywords specify the format of the data to be sent to a Teradata Database.

    DATA format is used in Record Mode, which does not send explicit null indicator bits. INDICDATA format is used in Indicator Mode, which does send explicit null indicator bits.

    For the character sets KANJISJIS_0S, KANJIEUC_0U, TCHBIG5_1R0, and SDTCHBIG5_3R0, the default is Record (DATA) Mode, not Field Mode.

    Do not use the IMPORT command in a Teradata SQL macro.

    It is not recommended to specify a file that is already opened from a previous EXPORT command. Unexpected results might occur due to records being written and read at the same time.

    Note: It is the user’s responsibility to correctly describe the data records to be imported by way of the SQL request’s USING clauses.

    Note: If a USING clause is used in a multi-statement request, it must start the first statement. Otherwise, BTEQ does not recognize the USING clause and does not send the data parcel with the request, causing a database syntax error.

    Note: By default, a request that includes a USING clause can process up to n records, where n depends on the current values of the REPEAT and PACK factors, provided the import file is open and contains unread records. Refer to the PACK and REPEAT commands in this chapter for more information.

    Note: Use the SHOW CONTROLS command to identify the current record format of the IMPORT command.

    Note: For Unicode sessions, the encoding of import files must match the encoding of the session character set.

    DATA and INDICDATA Keywords

    When using the keywords DATA or INDICDATA, each record must be in the following format:

  • The record begins with a two-byte length field that contains an unsigned binary integer whose value is the number of bytes in the record (not including the bytes in the length field or the byte(s) in the end-of-record field).
  • The record follows with binary-formatted fields that contain the column values of the selected row in record or indicator format.
  • The record ends with an end-of-record (line) indicator, appropriate for the workstation and operating system. For example, for a UNIX system, the indicator is a new line. For Windows, the indicator is a carriage return/line feed pair. In all cases, the end-of-record character, or characters, are in the bit-format appropriate for the workstation. For example, in ASCII for an ASCII workstation.
    Notice: Numeric data fields are corrupted if the data is reimported using a REPORT format export file.

    When using a file originally exported using BTEQ’s EXPORT command as the source for IMPORT command across a different platform type, ensure that the endianness type of both platforms is the same. This can be verified using the SHOW CONTROLS BYTE ORDER command.

    When using the keywords DATA or INDICDATA, each record must consist of binary-formatted fields that contain the column values of the selected row in record or indicator format.

    Note: For both workstation-attached and mainframe-attached systems a BOM is not valid at the beginning of a DATA (or INDICDATA) import file when a Unicode session character set is being used.
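    The record framing described above can be illustrated with a short sketch. This is a hypothetical helper, assuming a little-endian platform; verify the actual byte order with the SHOW CONTROLS BYTE ORDER command.

    ```python
    import struct

    def make_data_record(fields: bytes, eor: bytes = b"\n") -> bytes:
        """Wrap binary field data in the 2-byte length / end-of-record
        frame of a DATA-format record."""
        # The length counts only the field bytes, not the length field
        # itself and not the end-of-record indicator.
        return struct.pack("<H", len(fields)) + fields + eor

    # Two 4-byte integers as one record body,
    # e.g. for USING (c1 INTEGER, c2 INTEGER)
    body = struct.pack("<ii", 10, 20)
    rec = make_data_record(body)
    print(rec[:2])  # length prefix for 8 bytes of field data: b'\x08\x00'
    ```

    On Windows the end-of-record indicator would be b"\r\n" instead of the UNIX newline used here.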

    Importing Large Data Objects

    LDOs (Large Data Objects) consist of CLOB, BLOB, XML, and JSON data types. LDOs can be sent from BTEQ to the database either inline, as deferred transfers, or as non-deferred transfers. How they are sent is dependent upon the use of the DEFERCOLS, TOTCOLS, and DEFERLIMITS options.

    The following bulleted sections compare how LDOs can be transferred:

    LDOs for Inline Transfer

  • Are not described with AS DEFERRED or AS DEFERRED BY NAME qualifiers in the SQL request's USING clause.
  • Are sent up front to the database along with the SQL request. Their actual value is stored in the main import data file along with non-LDO column values.
  • Are limited to 64KB due to the FastLoad format's 2-byte overall length indicator, less if more than one column is being transferred for each import record. They must still have an 8-byte (rather than 2-byte) length indicator present in the import record before their value. Otherwise, an alternate data type that requires a 2-byte length indicator must be used in the USING clause; for example, VARCHAR rather than CLOB.
  • Do not require use of any transfer-related IMPORT command options but can only be provided using DATA or INDICDATA for the FastLoad format option.
    LDOs for Deferred Transfer

  • Are described with AS DEFERRED or AS DEFERRED BY NAME qualifiers in the SQL request's USING clause.
  • Have their data stored in their own separate LDO files. The names for these files must be placed in the main import data file using a VARCHAR type value. The maximum size for an LDO file name is 1024 bytes. Each LDO gets separately elicited by the database after the SQL request has been received.
  • Are limited to 2GB size.
  • For AS DEFERRED columns, the DEFERCOLS (and possibly TOTCOLS) option needs to be used. No transfer-related IMPORT command option needs to be used for AS DEFERRED BY NAME columns.
    LDOs for Possible Non-Deferred Transfer

  • Can only be described with an AS DEFERRED qualifier and must be listed first in the SQL request's USING clause.
  • Have their data stored in their own separate LDO files. The names for these files must be placed in the main import data file using a VARCHAR type value. The maximum size for an LDO file name is 1024 bytes. LDOs will be sent up front to the database along with the SQL request, rather than being separately elicited by the database after the SQL request has been received, assuming sufficient space is available in the import request and the DEFERLIMITS values have not been exceeded. This reduces the number of parcels being sent between BTEQ and the database, which may improve performance.
  • The combined size of non-deferred LDOs and non-LDO data is limited to approximately 1MB per import request.
  • The combination of the DEFERCOLS option and the DEFERLIMITS import setting, or the combination of the DEFERCOLS option and the PACK factor, dictates which LDOs will be transferred non-deferred. The TOTCOLS option must be specified if using the INDICDATA format.
  • A single import can include LDOs which are both defined with AS DEFERRED and AS DEFERRED BY NAME qualifiers. When this is the case, do not include the AS DEFERRED BY NAME columns in the DEFERCOLS value. For example, if an import includes three AS DEFERRED columns and two AS DEFERRED BY NAME columns, use DEFERCOLS=3.

    LDOs defined with an AS DEFERRED qualifier must be listed first within the USING clause. AS DEFERRED BY NAME columns may occur anywhere in the USING clause after the AS DEFERRED columns.

    BTEQ supports deferred and non-deferred LDO imports with the DATA, INDICDATA, and VARTEXT formats, but not the REPORT format. If exporting LDOs, deferred and non-deferred exports are only allowed with the INDICDATA format.

    When using LDOs defined with the AS DEFERRED qualifier, BTEQ replaces VARCHAR file name values with 4-byte integer tokens before submitting the request to the database. When a single-byte file name is supplied, three bytes (two for the length of the VARCHAR value and one for the file name value) will get replaced by four bytes. This replacement means that the overall size of the import data record might become too large to successfully contain all generated tokens.
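    The size arithmetic behind that replacement can be sketched as follows. The function name is illustrative, not part of BTEQ; it models replacing each file-name field (2-byte VARCHAR length plus the name bytes) with a 4-byte token.

    ```python
    def record_size_after_tokens(record_len: int, name_lens) -> int:
        """record_len: bytes in the import record as read from the file;
        name_lens: lengths of each AS DEFERRED file-name value in it."""
        # Each (2 + n)-byte file-name field becomes a 4-byte token.
        delta = sum(4 - (2 + n) for n in name_lens)
        return record_len + delta

    # A 1-byte file name (3 bytes on disk) becomes a 4-byte token,
    # so the record grows by one byte.
    print(record_size_after_tokens(100, [1]))  # → 101
    ```

    Short file names therefore enlarge the record, which is how an import record can become too large to contain all generated tokens.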

    For deferred and non-deferred LDO imports using z/OS BTEQ, import data records must contain fully qualified dataset names, which can be sequential data sets or PDS members, such as USERID.CLOB.DATA1 or USERID.CLOB(DATA).

    Non-deferred LDO transfers (via the DEFERLIMITS option) may change the effectiveness of packed records, when the PACK setting is greater than one. Non-deferred LDOs are passed to the database along with packed rows, whereas deferred LDOs are sent at a later time. However, importing small LDOs via non-deferred transfers may offer better performance compared with deferred transfers, as determined by elapsed times. Some testing/tuning may be needed to determine the optimal way to import records containing LDOs. If performance is negatively affected, non-deferred transfers can be completely disabled by entering "DEFERLIMITS=0,0".

    The following table shows which DEFERLIMITS values will be used in relation to the PACK setting:

     

    PACK Setting   DEFERLIMITS Option        Actual           Maximum            Maximum
                   (default or specified)    Records Packed   Single LDO Bytes   Row LDO Bytes
    0 or 1         Default (on)              1                1,000              10,000
    0 or 1         User specified            1                User defined       User defined
    2+             Default (off)             1..PACK          0                  0
    2+             User specified            1..PACK          User defined       User defined

    Using the DEFERLIMITS option does not guarantee that an LDO will be sent as a non-deferred transfer. It depends upon the arguments supplied, the size of the LDO files, and whether the LDO columns are defined with AS DEFERRED or AS DEFERRED BY NAME within the USING clause.

    Following is an example showing how this works. Assume 6 CLOBs are listed in a USING clause and 3 rows will be imported. The DEFERLIMITS option is set to "2000,5000". Notice how BTEQ treats each LDO based on the USING clause qualifier used and the LDO size.

     

    When the DEFERLIMITS option is specified, BTEQ will display the number of deferred and non-deferred LDO transfers for each import. The message will look similar to:

    *** Total LDOs: 50, Non-deferred: 18, Deferred: 32 

    Multiple sessions can be used to take advantage of the database's ability to handle requests in parallel.

    BTEQ can assign up to 200 sessions for use. However, each session requires a dedicated file handle, and the combined demand may exceed a system's resources. If BTEQ is not able to open a file, the associated request is aborted. Therefore, the appropriate number of sessions to use should be determined before running LDO import jobs in a production environment.

    Notes:

  • The use of Unicode sessions to transfer AS DEFERRED columns is not supported.
  • The Teradata Database currently does not support the JSON keyword within a USING clause. When importing JSON data, define the column as a CLOB type instead, within the USING clause.
  • For z/OS BTEQ, text LDOs which are imported in deferred or non-deferred mode will not contain end-of-record indicators (newline characters).
    REPORT Keyword

    If the keyword REPORT is used, each record must be in the format generated by the REPORT keyword of the BTEQ EXPORT command, except that certain records must be removed. For instance, title text and titledashes can be deleted manually using an editor, or they can be programmatically ignored by using the SKIP option.

    For workstation-attached systems, each record must end with an end-of-record (line) indicator that is appropriate for the session character set being used and the platform on which BTEQ is running.

    Notice:

    For REPORT format, BTEQ uses Field Mode to return all data values, including numeric data, in character format. Undesirable results might occur with numeric data fields when reimporting data that was exported in REPORT format.

    Note: The REPORT parameter is not supported under the Kanji character sets KANJISJIS_0S or KANJIEUC_0U, or the Chinese character sets TCHBIG5_1R0 or SDTCHBIG5_3R0.

    Note: On workstation-attached systems, a BOM is optional at the beginning of a UTF-8 or UTF-16 REPORT import file when a Unicode session character set is being used.

    VARTEXT Keyword

    BTEQ supports VARTEXT records by converting each delimited data item in the input record into a VARCHAR data item. An empty data item causes the data item's corresponding field to be nulled.

    BTEQ interprets the following as empty data items:

  • Two adjacent delimiters
  • A delimiter as the very first character of the record
  • A delimiter as the very last character of the record
    A character other than the default “|” can be specified as the data item delimiter. The following rules for specifying a valid optional character apply:

  • No control character other than TAB can be a delimiter.
  • Any character that appears in the data cannot be a delimiter.
  • Delimiters can only be a single-character sequence.
  • For Workstation BTEQ, the delimiter can be a multi-byte character for Unicode sessions only.
  • For Mainframe BTEQ, the delimiter must always be a single-byte character.
  • The only acceptable data types for VARTEXT records are VARCHAR, VARBYTE, and LONG VARCHAR. Undesirable results occur if other data types are used.
  • Because VARTEXT files are opened in text mode, unexpected results might occur if VARTEXT data contains embedded control characters such as null (0x00), linefeed (0x0A) or sub (0x1A). Control characters are accepted for DATA and INDICDATA imports.
  • The number of data items in the input record must be equal to the number of fields defined in the USING clause.
  • Each trailing delimiter at the end of an input record represents a null value to be generated by BTEQ. The null must have a corresponding field defined in the USING clause. If there are missing USING fields, the import might fail because the number of indicator bit bytes sent is greater than the number the DBS calculates to be needed according to the USING clause. BTEQ relies on the DBS to detect this problem. The failure is typically indicated by a 2673 error with a message stating that the source parcel length does not match the data defined. There are two alternative ways to solve this problem so that the expected number of indicator bytes matches what BTEQ sends: either do not use a trailing delimiter, or add a dummy last column to the USING clause.
  • Two consecutive delimiter characters specify that the corresponding field should be nulled. If the record starts with a delimiter, the first field is nulled.
  • There should always be one less delimiter than the total number of columns specified in the USING clause.
    Note: A BOM is optional at the beginning of a UTF-8 or UTF-16 VARTEXT import file when a Unicode session character set is being used.

    Each record must end with an end-of-record (line) indicator, appropriate for the workstation and operating system. For example, for a UNIX system, the indicator must be a line feed character. For Windows, the indicator must be a carriage return/line feed pair. In all cases, the end-of-record character, or characters, must be in the bit-format appropriate for the workstation.

  • If a space character is used as delimiter, no empty trailing fields are valid in the record. If one or more fields are null at the end of a record, and the delimiter character is a space, the format of the record is not preserved. The trailing nulls and blanks are truncated, resulting in a DBS error when the record is inserted into a table.
  • To make trailing spaces part of the data, end the record with a delimiter.
  • When using a Unicode session charset, the delimiter character employed in VARTEXT import Unicode data files must be a Unicode code point equivalent to the delimiter character employed for the associated EBCDIC IMPORT command within the z/OS BTEQ script. There are certain EBCDIC characters that need special consideration to ensure the code points match.
    For example, EBCDIC has two vertical bar characters: a solid bar (0x4F) and a broken bar (0x6A).
  • The EBCDIC solid bar character maps to 0x008D in UTF-16BE and is called a vertical line.
  • The EBCDIC broken bar character maps to 0x007C in UTF-16BE and is called the broken bar.
  • Use SET REPEATSTOP ON to stop importing data if an error is encountered while processing a VARTEXT record. By default, BTEQ rejects the record and continues.
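    The null-handling rules above (adjacent delimiters, and a delimiter at the start or end of the record, produce nulls) can be sketched with a minimal parser. This is an illustrative model, not BTEQ code; it assumes the record has already had its end-of-record indicator stripped.

    ```python
    def parse_vartext(record: str, delim: str = "|"):
        """Split a VARTEXT record on a single-character delimiter and
        treat each empty data item as a null field (None)."""
        # Two adjacent delimiters, or a delimiter as the very first or
        # very last character of the record, yield empty items.
        return [item if item != "" else None
                for item in record.split(delim)]

    print(parse_vartext("a||c"))  # → ['a', None, 'c']
    print(parse_vartext("|b|"))   # → [None, 'b', None]
    ```

    Note that "a||c" yields three fields, consistent with the rule that a record should contain one less delimiter than the number of USING columns.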

    Opening and Closing Import Files

    BTEQ opens the import file in response to an IMPORT command. BTEQ closes the import file whenever one of the following conditions occurs:

  • BTEQ encounters an end-of-file condition.
  • Another IMPORT command is issued.
  • The AXSMOD option was used and a LOGOFF command is issued.
  • An EXIT or QUIT command is issued.
  • BTEQ encounters an error while reading data from the import file.
    Note: For information on I/O errors and abends, refer to “I/O Errors and Abends” on page 96.

    Example  

    Use the following command to read from a data set defined by NEWDAT, with an expected data format of INDICDATA.

       .IMPORT INDICDATA DDNAME=NEWDAT 

    Example

    Data records from the same import file can be remapped as needed to accomplish updates to tables that each have differing row layouts. For example:

       .import data file = test.data;    
       using (c1 integer, c2 integer)
       insert into table1 (c1, c2)
       values (:c1, :c2);
       
       using (c1 integer, c2 integer)
       insert into table2 (c1, c3)
       values (:c1, :c2);
       
       using (c1 integer)
       update table3 set c1 = :c1 where c2 = :c1;

    Note that the import file test.data should contain at least three records so that one record is processed by each of the three requests having a USING clause.

    where:

     

    Table

    Definition

    Table1

    'ct table1 (c1 integer, c2 integer)'

    Table2

    'ct table2 (c1 integer, c2 char(1), c3 integer)'

    Table3

    'ct table3 (c1 integer, c2 integer, c3 integer)'

    Example  

       .PACK 3
       .IMPORT data file = sample.data;
       
       using (x integer, y integer)
       insert into table1 (a,b)
       values (:x, :y);
       
       using (x integer, y integer)
       insert into table2 (a,b)
       values (:x, :y);
       

    Assume that the import file sample.data contains 10 records. With the PACK factor set to 3 and the value of the REPEAT command being 1, the number of records processed by each of the two requests is three. The result is a total of six records are processed, and the import file has four records remaining.

    How using the REPEAT command with the PACK clause affects record processing:

       .IMPORT data file = sample.data;
       .REPEAT 2 PACK 3
       
       using (x integer, y integer)
       insert into table1 (a,b)
       values (:x, :y);
       
       using (x integer, y integer)
       insert into table2 (a,b)
       values (:x, :y);

    Because the value of the REPEAT command is 2 and the value of the PACK clause is 3, six records are inserted into table1 and one record is inserted into table2. Note that the REPEAT command applies only to the request that immediately follows it. The import file has three records remaining.

    How using the PACK command and the REPEAT command affects record processing:

       .SET PACK 3
       .IMPORT data file = sample.data;
       .REPEAT 2
       
       using (x integer, y integer)
       insert into table1 (a,b)
       values (:x, :y);
       
       using (x integer, y integer)
       insert into table2 (a,b)
       values (:x, :y);
       

    The PACK command affects both requests containing a USING clause and the REPEAT command affects only the first request. The results are six records are inserted into table1, three records are inserted into table2, and the import file has one remaining record.

       Table Definition:
       table1 ‘ct table1 (c1 integer, c2 integer)’
       table2 ‘ct table2 (c1 integer, c2 integer)’
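    The record accounting in the three examples above can be checked with a rough model: each request with a USING clause consumes repeat × pack records while unread records remain. The function and argument names are illustrative, not BTEQ internals; note that REPEAT n PACK p applies the pack factor only to the request that follows, so the second request in that example runs with a pack of 1.

    ```python
    def run(total, requests):
        """requests: list of (repeat, pack) pairs, one per USING request.
        Returns (records consumed per request, records remaining)."""
        consumed = []
        for repeat, pack in requests:
            take = min(repeat * pack, total)  # can't read past end-of-file
            consumed.append(take)
            total -= take
        return consumed, total

    # .SET PACK 3, then .REPEAT 2 on the first request only (last example):
    print(run(10, [(2, 3), (1, 3)]))  # → ([6, 3], 1)
    ```

    The same model reproduces the first example, run(10, [(1, 3), (1, 3)]), which consumes three records per request and leaves four.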

    Example

    This example reads data from a file named EXPORT.DAT with an expected data format of DATA, through an Access Module (library name is MYMOD). Note that an init-string is not used.

    .IMPORT DATA FILE = 'EXPORT.DAT' AXSMOD MYMOD

    Example  

    The following shows both deferred and non-deferred transfers. If the test5.dat import file contains 3 records, then 9 individual LDO files must also exist (3 files per LDO column).

    Because c1 and c2 are always 100 bytes or less, they will always be sent as non-deferred transfers; their sizes fall within the specified DEFERLIMITS values. The last column will always be transferred in deferred mode because it is defined with AS DEFERRED BY NAME.

    Notice that the DEFERCOLS value only includes the LDO columns defined with AS DEFERRED, and does not include the one LDO column defined with AS DEFERRED BY NAME.

    .IMPORT DATA DEFERCOLS=2 DEFERLIMITS=500,800 FILE=test5.dat
    .REPEAT *
    USING (c1 CLOB(100) AS DEFERRED,
    c2 BLOB(100) AS DEFERRED,
    c3 INTEGER,
    c4 CHAR(10),
    c5 XML AS DEFERRED BY NAME)
    INSERT INTO table1 VALUES (:c1, :c2, :c3, :c4, :c5);
    .IMPORT RESET

    Example  

    The following shows how the PACK setting can affect the importing of LDOs. Since the import format below is INDICDATA, the TOTCOLS option must be used. Since the DEFERLIMITS option is not specified and the PACK value is greater than one, DEFERLIMITS will be turned off by default, and all LDOs will be sent to the database as deferred transfers.

    .PACK 10 
    .IMPORT INDICDATA DEFERCOLS=3 TOTCOLS=6 FILE=test6.dat
    .REPEAT *
    USING (c1 CLOB(100) AS DEFERRED,
    c2 CLOB(500) AS DEFERRED,
    c3 CLOB(5000) AS DEFERRED,
    c4 SMALLINT,
    c5 CHAR,
    c6 VARCHAR(10))
    INSERT INTO table2 VALUES (:c1, :c2, :c3, :c4, :c5, :c6);
    .IMPORT RESET

    Example  

    With z/OS BTEQ, a DDNAME must be defined for the main import data set, but not for individual LDOs. To import LDOs in deferred or non-deferred mode on z/OS, the main import data set must contain actual data set names for each LDO (defined as VARCHAR types). Since the PACK command is being used, the DEFERLIMITS option is included to ensure non-deferred transfers are considered. LDOs that are 500 bytes or less will be imported non-deferred, while all other LDOs will be imported in deferred mode.

    //BTEQ EXEC PGM=BTQMAIN
    //DAT1 DD DSN=USERID.BTEQ.EXPFILE,DISP=SHR
    //SYSPRINT DD DSN=USERID.BTEQ.OUTPUT,DISP=MOD
    //SYSABEND DD SYSOUT=*
    //SYSOUT DD SYSOUT=*
    //SYSTERM DD SYSOUT=*
    //SYSIN DD DATA,DLM=##
    .LOGON xxx/yyy,zzz
    .PACK 10
    .QUIET ON
    .IMPORT INDICDATA DEFERCOLS=5 TOTCOLS=7 DEFERLIMITS=500,2000 DDNAME=DAT1
    .REPEAT *
    USING (
    myclob1 CLOB(300) AS DEFERRED,
    myclob2 CLOB(1000) AS DEFERRED,
    myclob3 CLOB(1000) AS DEFERRED,
    myclob4 CLOB(1000) AS DEFERRED,
    myclob5 CLOB(1000) AS DEFERRED,
    mychar VARCHAR(10),
    myint INTEGER)
    INSERT INTO my_table VALUES (:myclob1, :myclob2, :myclob3,
    :myclob4, :myclob5, :mychar, :myint);
    .IMPORT RESET
    .LOGOFF
    .QUIT
    ##