Purpose
The create command creates a job on the Data Mover daemon using the specified parameters and object list; together, the parameters and object list make up the job definition.
Syntax
datamove create -f parameters.xml
Parameters
See Parameter Order.
- data_streams
- [Optional] Number of data streams to use between the source and target databases. Applies to jobs that use Teradata DSA and TPT API (to and from Teradata). All other protocols use a single data stream.
- db_client_encryption
- [Optional] Set to true if the job's data transfer must be encrypted.
- dm.rest.endpoint
- [Optional] Enter a Data Mover REST server URL to override the default value specified in the commandline.properties file, in order to connect to a different REST server (and therefore a different daemon) at runtime.
- execute_permission
- [Optional] Defines the username[s] and role[s] that have execute permission for the created job.
- force_utility
- [Optional] Forces the Data Mover daemon to use a specific utility for all copy operations.
Valid Values
- dsa
- jdbc
- tptapi
- tptapi_load
- tptapi_stream
- tptapi_update
- T2T
If this value is not specified, the Data Mover daemon determines the best Teradata utility to use for the job.
Copying data to an older version of Analytics Database using Teradata DSA is not valid. You cannot use Teradata DSA if the source and target TDPIDs are the same.
- freeze_job_steps
- [Optional] Freezes the job steps so that they are not recreated each time the job is started. Only set to true if the source and target environments do not change after the job is created.
Valid Values
- true - the job steps are not recreated each time the job is started
- false - the job steps are recreated each time the job is started
- unspecified (default) - the value is set to false
- job_name
- [Optional] Name for this job. The name must be unique and up to 32 characters.
- job_priority
- [Optional] Specifies the execution priority for the job. Supported values are HIGH, MEDIUM, LOW, and UNSPECIFIED. If no value is specified, the default of MEDIUM is used at runtime.
- job_security
- [Optional] Defines the access parameters for the created job.
- log_level
- [Optional] Log level for log file output.
Valid Values
- 0
- 1
- 2
- 99
- log_to_event_table
- [Optional] Specifies the event table to be used for this job. For more information, see Using Event Tables.
- max_agents_per_task
- [Optional] Maximum number of Data Mover agents to use in parallel when moving tables or databases.
- netrace
- [Optional] CLI netrace parameter. Any value greater than or equal to 0 generates a CLI trace log. A valid CLI value must be provided.
- netrace_buf_len
- [Optional] CLI netrace_buf_len parameter. Any value greater than or equal to 0 generates a CLI trace log. A valid CLI value must be provided.
- online_archive
- [Optional] Allows read and write access to the source tables while they are being copied with Teradata DSA. Updates made to the source table during the copy are not transferred to the target table. After a successful copy, the data in the target table matches the data that was in the source table at the beginning of the copy.
Valid Values
- true - enables online archive
- false - disables online archive
- unspecified (default) - the value is set to the value in the Data Mover daemon configuration file
- overwrite_existing_objects
- [Optional] Job overwrites objects that already exist on the target. If the parameter is specified as true or false, that value takes precedence over the overwrite_existing_objects value in the Data Mover daemon configuration file. These behavior settings are illustrated in the XML sketch following this parameter list.
Valid Values
- true - enables overwriting
- false - disables overwriting
- unspecified (default) - the value is set to the overwrite_existing_objects value in the Data Mover daemon configuration file
- owner_name
- [Optional] User who created the job.
- read_permission
- [Optional] Defines the username[s] and role[s] that have read permission for the created job.
- response_timeout
- [Optional] Amount of time, in seconds, to wait for response from the Data Mover daemon.
- source_account_id
- [Optional] Logon account ID for source database.
- source_logon_mechanism
- [Optional] Logon mechanism for source system. To log on to a source system, the user must provide at least one of the following:
- source_user and source_password
- source_logon_mechanism
Logon mechanisms are not supported for Teradata DSA jobs. Use logon mechanisms only for Teradata PT API and Teradata JDBC jobs. If -source_logon_mechanism is specified and -force_utility is not used, Teradata PT API is used by default. Specifying -source_logon_mechanism with Teradata DSA specified for -force_utility results in an error.
- source_logon_mechanism_data
- [Optional] Additional parameters needed for the logon mechanism of the source system.
- source_password
- [Optional] Source Teradata logon password.
- source_password_encrypted
- [Optional] Source Teradata encrypted logon password.
- source_sessions
- [Optional] Number of sessions per data stream on the source database.
- source_tdpid
- Source database.
- source_user
- [Optional] Source Teradata logon id.
- source_userid_pool
- [Optional] Job pulls the user from the specified credential pool. Available for any job type. Must use the same credential pool as target_userid_pool if specifying both parameters in the same job definition.
- table
- [Optional] Table to be copied.
- target_account_id
- [Optional] Logon account ID for target database.
- target_logon_mechanism
- [Optional] Logon mechanism for target system. To log on to a target system, the user must provide at least one of the following:
- target_user and target_password
- target_logon_mechanism
Teradata DSA does not support logon mechanisms. Use logon mechanisms only with Teradata PT API and Teradata JDBC jobs. If -target_logon_mechanism is specified and -force_utility is not used, Teradata PT API is used by default. Specifying -target_logon_mechanism with Teradata DSA specified for -force_utility results in an error.
- target_logon_mechanism_data
- [Optional] Additional parameters that are needed for the target system's logon mechanism.
- target_password
- [Optional] Target Teradata logon password.
- target_password_encrypted
- [Optional] Target Teradata encrypted logon password.
- target_sessions
- [Optional] Number of sessions per data stream on the target database.
- target_tdpid
- [Optional] Target database.
- target_user
- [Optional] Target Teradata logon id.
- target_userid_pool
- [Optional] Job pulls the user from the specified credential pool. Available for any job type. Must use the same credential pool as source_userid_pool if specifying both parameters in the same job definition.
- tpt_debug
- [Optional] TPT API trace debug log parameter. Any value greater than or equal to 0 generates a TPT API trace log. A valid TPT API value must be provided.
- write_permission
- [Optional] Defines the username[s] and role[s] that have write permission for the created job.
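Most of the optional parameters above can also be supplied as same-named elements in the job XML file; the full example under XML File Example follows this pattern. The fragment below is only a sketch of that mapping, assuming a one-to-one correspondence with datamover.xsd for your release; ldap and pool1 are placeholder values, and the logon-mechanism and credential-pool elements are shown as alternatives rather than settings to combine. The elements would sit inside the <dmCreate> element:
<!-- Optional behavior settings; element names mirror the parameters above -->
<job_priority>MEDIUM</job_priority>
<freeze_job_steps>false</freeze_job_steps>
<online_archive>true</online_archive>
<overwrite_existing_objects>true</overwrite_existing_objects>
<!-- Alternative logon: a logon mechanism (not valid when force_utility is dsa); "ldap" is a placeholder -->
<source_logon_mechanism>ldap</source_logon_mechanism>
<!-- Alternative logon: pull credentials from a daemon credential pool; "pool1" is a placeholder -->
<source_userid_pool>pool1</source_userid_pool>
<target_userid_pool>pool1</target_userid_pool>
If an element is omitted, the defaults described above apply.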
Usage Notes
Type datamove create -f parameters.xml to create a job. The job name is displayed on the screen when the create command completes. Remember the job name to use in other commands, such as the stop and start commands.
The create command does not start the job. Use the start command to start the job, or the edit command to review the job scripts.
XML File Example
For the create command, type datamove create -f parameters.xml.
<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<dmCreate
xmlns="http://schemas.teradata.com/dataMover/v2009"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://schemas.teradata.com/unity/datamover.xsd">
<job_name>floyd_dmdev_create</job_name>
<source_tdpid>floyd</source_tdpid>
<source_user>dmguest</source_user>
<source_password>please</source_password>
<target_tdpid>dmdev</target_tdpid>
<target_user>dmguest</target_user>
<target_password>please</target_password>
<data_streams>5</data_streams>
<source_sessions>1</source_sessions>
<target_sessions>1</target_sessions>
<force_utility>dsa</force_utility>
<log_level>0</log_level>
<db_client_encryption>false</db_client_encryption>
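<!-- Object list: the database itself is "unselected", so only tables marked selection="included" are copied;
     the table-level db_client_encryption requests encryption for test1 and overrides the job-level false -->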
<database selection="unselected">
<name>dmguest</name>
<table selection="included">
<name>test1</name>
<db_client_encryption>true</db_client_encryption>
</table>
<table selection="included">
<name>test2</name>
</table>
<table selection="included">
<name>test3</name>
</table>
</database>
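<!-- Optional query band: semicolon-separated name=value pairs applied to the job's database sessions -->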
<query_band>Job=payroll;Userid=aa1000000;Jobsession=1122;</query_band>
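<!-- Optional job security block: owner plus read/write/execute grants for users and roles,
     mirroring the owner_name and *_permission parameters above -->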
<job_security>
<owner_name>owner</owner_name>
<read_permission>
<username>read_user1</username>
<username>read_user2</username>
<role>read_role1</role>
<role>read_role2</role>
</read_permission>
<write_permission>
<username>write_user1</username>
<username>write_user2</username>
<role>write_role1</role>
<role>write_role2</role>
</write_permission>
<execute_permission>
<username>execute_user1</username>
<username>execute_user2</username>
<role>execute_role1</role>
<role>execute_role2</role>
</execute_permission>
</job_security>
</dmCreate>
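In this example, the job forces Teradata DSA with five data streams and one session per stream on each system, copies the three included tables from database dmguest on floyd to dmdev, enables client encryption only for table test1, and limits read, write, and execute access to the listed users and roles.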