create (Teradata Database) - Teradata Data Mover

Teradata Data Mover User Guide

Product
Teradata Data Mover
Release Number
16.10
Published
June 2017
Language
English (United States)
Last Update
2018-03-29
dita:id
B035-4101
lifecycle
previous
Product Category
Analytical Ecosystem

Purpose

The create command creates a job on the daemon from the syntax parameters and the object list of databases and tables from the query command. A job definition consists of the parameters and object list.

Parameters

See Parameter Order.

additional_arc_parameters
[Optional] Specifies additional ARC parameters to append when executing each ARC task. There is a 2048-character limit.
Example: VBMASK=15
broker.port
[Optional] Enter a broker port to override the default value specified in the commandline.properties file, in order to connect to a different ActiveMQ server (and therefore a different daemon) at runtime.
Example: 61616
broker.url
[Optional] Enter a broker URL to override the default value specified in the commandline.properties file, in order to connect to a different ActiveMQ server (and therefore a different daemon) at runtime.
Example: dm-server1
data_streams
[Optional] Number of data streams to use between the source and target databases. Applies to jobs that use Teradata ARC, jobs that use TPT API, and Aster (to and from Teradata) jobs. All other protocols use a single data stream.
Example: 4
The default value is dynamically calculated by Data Mover.
execute_permission
[Optional] Defines the usernames and roles that have execute permission for the created job.
force_utility
[Optional] Forces the Data Mover daemon to use a specific utility for all copy operations.

Valid Values

  • arc
  • jdbc
  • tptapi
  • tptapi_load
  • tptapi_stream
  • tptapi_update
  • T2T
If this value is not specified, the Data Mover daemon determines which Teradata utility is best to use for the job.
Copying data to an older version of the Teradata Database using Teradata ARC is not valid. You cannot use Teradata ARC if the source and target TDPIDs are the same.
Example: arc
freeze_job_steps
[Optional] Freezes the job steps so that they are not recreated each time the job starts. Set to true only if the source and target environments do not change after the job is created.
Valid Values
  • true - the job steps are not recreated each time the job is started
  • false - the job steps are recreated each time the job is started
  • unspecified (default) - the value is set to false
Example: true
job_name
[Optional] Name for this job. The name must be unique and can be up to 32 characters.
If you do not specify a name, one is generated automatically using the format: <source tdpid>_<target tdpid>_<date time year>.
job_priority
[Optional] Specifies the execution priority for the job. Supported values are HIGH, MEDIUM, LOW, and UNSPECIFIED. If no value is specified, the default of MEDIUM is used at runtime.
Example: MEDIUM
job_security
[Optional] Defines the access parameters for the created job.
log_level
[Optional] Log level for log file output.

Valid Values

  • 0
  • 1
  • 2
  • 99
Example: 2
The default value is 0.
log_to_event_table
[Optional] Specifies the event table to be used for this job. For more information, see About Using Event Tables.
max_agents_per_task
[Optional] Maximum number of Data Mover agents to use in parallel when moving tables, databases, or journals.
Example: 4
The default value is dynamically calculated by Data Mover.
netrace
[Optional] CLI netrace parameter. Any value greater than or equal to 0 generates a CLI trace log. A valid CLI value must be provided.
netrace_buf_len
[Optional] CLI netrace_buf_len parameter. Any value greater than or equal to 0 generates a CLI trace log. A valid CLI value must be provided.
online_archive
[Optional] Allows read and write access to the source table(s) while the tables are being copied with Teradata ARC. Updates occur to the source table during the copy, but are not transferred to the target table. After a successful copy, the data contained in the target table matches the data that was in the source table at the beginning of the copy.
Valid Values
  • true - enables online archive
  • false - disables online archive
  • unspecified (default) - the value is set to the value in the Data Mover daemon configuration file
Example: true
overwrite_existing_objects
[Optional] Job overwrites objects that already exist on the target.
Valid Values
  • true - enables overwriting
  • false - disables overwriting
  • unspecified (default)
If the parameter is not specified, the value is set to the overwrite_existing_objects parameter value in the Data Mover daemon configuration file. If the parameter is specified as true or false, that value takes precedence over the parameter value in the Data Mover daemon configuration file.
Example: true
owner_name
[Optional] User who created the job.
Example: owner
This value can be set only if daemon security is off or the user is the super user (dmcl_admin); otherwise, the value is overwritten with the actual logged-in user.
read_permission
[Optional] Defines the usernames and roles that have read permission for the created job.
response_timeout
[Optional] Amount of time, in seconds, to wait for response from the Data Mover daemon.
Example: 60
source_account_id
[Optional] Logon account ID for source database.
Spaces in the account name for the source or target account ID cause the job to fail.
source_logon_mechanism
[Optional] Logon mechanism for source system. To log on to a source Teradata Database system, the user must provide at least one of the following:
  • source_user and source_password
  • source_logon_mechanism

Logon mechanisms are not supported for Teradata ARC jobs. Use logon mechanisms only for Teradata PT API and Teradata JDBC jobs. If -source_logon_mechanism is specified and -force_utility is not used, Teradata PT API is used by default. Specifying -source_logon_mechanism with Teradata ARC specified for -force_utility results in an error.

Example: KRB5
source_logon_mechanism_data
[Optional] Additional parameters that are needed for the source system's logon mechanism.
Example: joe@domain1@@mypassword
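If a logon mechanism is used instead of a username and password, the corresponding elements appear in the job's parameters file. A minimal sketch of the relevant fragment (the mechanism name and mechanism data shown are placeholders, not values from a real environment):

```xml
<!-- Hypothetical fragment of a create parameters file: logging on to
     the source with a mechanism instead of source_user/source_password.
     Mechanism name and data are placeholders. -->
<source_logon_mechanism>KRB5</source_logon_mechanism>
<source_logon_mechanism_data>joe@domain1@@mypassword</source_logon_mechanism_data>
```

Because -force_utility is not set in this sketch, the daemon would default to Teradata PT API, as noted above.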
source_password
[Optional] Source Teradata logon password.
Example: 123456789
Not a valid parameter if -source_password_encrypted is also specified. If you do not specify a password for this parameter, the command prompts you to enter it interactively. Input is masked with a set number of asterisks, regardless of the length of the password.
source_password_encrypted
[Optional] Source Teradata encrypted logon password.
Example: 17894cc84b5637a88e36fa37a010e3662d18f64b8ce204bef8d63868ad417810
Not a valid parameter if -source_password is also specified.
source_sessions
[Optional] Number of sessions per data stream on the source database.
Example: 4
The default value is dynamically calculated by Data Mover.
source_tdpid
Source Teradata Database.
Example: Checks
source_user
[Optional] Source Teradata logon id.
Example: TD_API_user
If you do not specify a logon id for this parameter, the command prompts you to enter it interactively.
Spaces in the user name for the source or target ID cause the job to fail.
source_userid_pool
[Optional] Job pulls the user from the specified credential pool. Available for any job type. Must use the same credential pool as target_userid_pool if specifying both parameters in the same job definition.
Example: POOL-1
table
[Optional] Table to be copied.
Example: DB1.TABLE
target_account_id
[Optional] Logon account ID for target database.
Spaces in the account name for the source or target account ID cause the job to fail.
target_logon_mechanism
[Optional] Logon mechanism for target system. To log on to a target Teradata Database system, the user must provide at least one of the following:
  • target_user and target_password
  • target_logon_mechanism

Logon mechanisms are not supported for Teradata ARC jobs. Use logon mechanisms only for Teradata PT API and Teradata JDBC jobs. If -target_logon_mechanism is specified and -force_utility is not used, Teradata PT API is used by default. Specifying -target_logon_mechanism with Teradata ARC specified for -force_utility results in an error.

Example: KRB5
target_logon_mechanism_data
[Optional] Additional parameters that are needed for the target system's logon mechanism.
Example: my@domain2@@mypassword
target_password
[Optional] Target Teradata logon password.
Example: 212133344
Not a valid parameter if -target_password_encrypted is also specified. If you do not specify a password for this parameter, the command prompts you to enter it interactively. Input is masked with a set number of asterisks, regardless of the length of the password.
target_password_encrypted
[Optional] Target Teradata encrypted logon password.
Example: 30e458fce484cefef07724653f5046095208f69fcfbf76bf7290b8576192c2fe
Not a valid parameter if -target_password is also specified.
target_sessions
[Optional] Number of sessions per data stream on the target database.
Example: 4
The default value is dynamically calculated by Data Mover.
target_tdpid
[Optional] Target Teradata Database.
Example: Leo
target_user
[Optional] Target Teradata logon id.
Example: TD_tar_User
If you do not specify a logon id for this parameter, the command prompts you to enter it interactively.
Spaces in the user name for the source or target ID cause the job to fail.
target_userid_pool
[Optional] Job pulls the user from the specified credential pool. Available for any job type. Must use the same credential pool as source_userid_pool if specifying both parameters in the same job definition.
Example: POOL-1
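When both credential-pool parameters appear in the same job definition, they must name the same pool. A minimal sketch of the relevant elements in the parameters file (POOL-1 is a placeholder pool name):

```xml
<!-- Hypothetical fragment: source and target logons both drawn from
     the same credential pool, as required when both parameters are set. -->
<source_userid_pool>POOL-1</source_userid_pool>
<target_userid_pool>POOL-1</target_userid_pool>
```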
tpt_debug
[Optional] TPT API trace debug log parameter. Any value greater than or equal to 0 generates a TPT API trace log. Valid TPT API value must be provided.
write_permission
[Optional] Defines the usernames and roles that have write permission for the created job.

Usage Notes

To create a job using the object list that was created with the query command, type datamove create -f parameters.xml. The job name is displayed on the screen when the create command completes. Remember the job name to use in other commands, such as the stop and start commands.

The create command does not start the job. Use the start command to start the job, or the edit command to review the job scripts.
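Putting those notes together, a typical command-line sequence looks like the following sketch. The job name passed to the start command is a placeholder; use the name the create command reports, and see the start command reference for its exact parameters.

```shell
# Create the job definition; the job name is displayed on completion.
datamove create -f parameters.xml

# Start the job using the reported name (placeholder shown here).
datamove start -job_name floyd_dmdev_create
```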

XML File Example

For the create command, type datamove create -f parameters.xml.

The following example shows a parameters file for the create command.
<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>

<dmCreate 
    xmlns="http://schemas.teradata.com/dataMover/v2009"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
    xsi:schemaLocation="http://schemas.teradata.com/unity/datamover.xsd">
    <job_name>floyd_dmdev_create</job_name>
    <source_tdpid>floyd</source_tdpid>
    <source_user>dmguest</source_user>
    <source_password>please</source_password>
    <target_tdpid>dmdev</target_tdpid>
    <target_user>dmguest</target_user>
    <target_password>please</target_password>
    <data_streams>5</data_streams>
    <source_sessions>1</source_sessions>
    <target_sessions>1</target_sessions>
    <force_utility>arc</force_utility>
    <log_level>0</log_level>
    <database selection="unselected">
        <name>dmguest</name>
        <table selection="included">
            <name>test1</name>
        </table>
        <table selection="included">
            <name>test2</name>
        </table>
        <table selection="included">
            <name>test3</name>
        </table>
    </database>
    <query_band>Job=payroll;Userid=aa1000000;Jobsession=1122;</query_band> 
    <job_security>
         <owner_name>owner</owner_name>
    <read_permission>
        <username>read_user1</username>
        <username>read_user2</username>
        <role>read_role1</role>
        <role>read_role2</role>
    </read_permission> 
    <write_permission>
        <username>write_user1</username>
        <username>write_user2</username>
        <role>write_role1</role>
        <role>write_role2</role>
    </write_permission>
    <execute_permission>
        <username>execute_user1</username>
        <username>execute_user2</username>
        <role>execute_role1</role>
        <role>execute_role2</role>
    </execute_permission> 
    </job_security>
</dmCreate>