You can display a list of all the commands available in the Data Mover command-line interface, along with a summary of each command and its basic syntax structure.
You can also specify a command name to display detailed information about that command, including an example of its syntax, a list of the parameters associated with the command, and a description of each parameter.
- At the command line, type datamove --help, then do one of the following:
  - To view all commands: press Enter. A list of all commands and a summary of each is displayed (see the sketch after this list).
  - To view a specific command: append the name of that command, then press Enter. Information about that command, including a list of its parameters, is displayed.
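For instance, the two options above correspond to the following invocations. This is a minimal sketch of the commands themselves; the listing returned by the first form depends on the installed Data Mover version, so its output is not reproduced here.

# Show a list and a one-line summary of every Data Mover command
datamove --help

# Show detailed help for a single command, for example the create command (see below)
datamove --help create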
For example, to view information about the create command, type datamove --help create and press Enter. The following is displayed:
Data Mover Command Line 16.20.12.00
Connected to Daemon version 16.20.12.00

NAME:
create - Create Command

DESCRIPTION:
Creates a job on the DM Daemon based on the information from the arguments and the parameter file

EXAMPLE:
datamove create -job_name job1 -f parameters.xml

Parameters:
Parameter  Example  Description
job_name  job1  (optional) Name for the job, must be unique. Generated if unspecified
job_priority  MEDIUM  (optional) Execution priority for job. HIGH/MEDIUM/LOW. Default is MEDIUM.
source_tdpid  Checks  Source Teradata database
source_user  TD_API_user  (optional) Source Teradata logon id
source_password  123456789  (optional) Source Teradata logon password
source_logon_mechanism  NTLM  (optional) Mechanism to use when logging onto source system
source_logon_mechanism_data  joe@domain1  (optional) Additional parameters for the logon mechanism being used
source_account_id  account_id  (optional) Source Teradata logon account ID
target_tdpid  Leo  Target Teradata database
target_user  TD_API_user  (optional) Target Teradata logon id
target_password  123456789  (optional) Target Teradata logon password
target_logon_mechanism  NTLM  (optional) Mechanism to use when logging onto target system
target_logon_mechanism_data  joe@domain1  (optional) Additional parameters for the logon mechanism being used
target_account_id  account_id  (optional) Target Teradata logon account ID
data_streams  4  (optional) Number of data streams to use
source_sessions  4  (optional) Number of session on the source
target_sessions  4  (optional) Number of session on the target
max_agents_per_task  4  (optional) Max number of agents to use in parallel for each table/database/journal being moved
overwrite_existing_objects  true  (optional) Overwrites objects that already exist on the target
force_utility  arc  (optional) Force DM to use a specific utility for all DM operations
log_level  2  (optional) Log level to output a log file
online_archive  true  (optional) Allows read/write access to the source tables while the tables are being copied
table  DB1.TABLE  (optional) Table to be copied
response_timeout  60  (optional) Amount of time to wait for Daemon response in seconds
uowid  uowid  (optional) Unit of work id to identify the job execution.
security_username  dmcl_admin  (optional) User ID of the super user. Only used if security management is enabled.
security_password  53cUr17y  (optional) Password for the super user. Only used if security management is enabled.
security_password_encrypted  052c7aabd1..  (optional) Encrypted password for the super user. Only used if security management is enabled.
query_band  AppName=B;  (optional) QueryBand passed to the Database as a list of name=value pairs in a string.
source_hadoop_webhcat_url  http://sdll9119.labs.teradata.com:50111  (required) source hadoop system webcat url.
source_hadoop_oozie_url  http://tdh021m1.labs.teradata.com:11000  (required) source hadoop system oozie url.
source_hadoop_file_system_url  http://tdh021m2.labs.teradata.com:50070  (optional) source hadoop system file system url.
source_hive_user  hive  (required) hadoop hive user.
source_hive_password  hive  (required) hive user password.
source_hive_password_encrypted    (optional) encrypted source hadoop system password.
source_hadoop_logon_mechanism  kerberos  (optional) source hadoop system security logon mechanism.
target_hadoop_webhcat_url  http://sdll9119.labs.teradata.com:50111  (required) target hadoop system webcat url.
target_hadoop_oozie_url  http://tdh021m1.labs.teradata.com:11000  (required) target hadoop system oozie url.
target_hadoop_file_system_url  http://tdh021m2.labs.teradata.com:50070  (optional) target hadoop system file system url.
target_hive_user  hive  (required) target hadoop hive user.
target_hive_password  hive  (optional) target hadoop system user password.
target_hive_password_encrypted    (optional) encrypted target hadoop system password.
target_hadoop_logon_mechanism  kerberos  (optional) target hadoop system security logon mechanism.
hadoop_file_option  rc  (optional) hadoop file transfer type.
hadoop_file_delimiter    (optional) hadoop record delimiter.
hadoop_transfer_method  batch_insert  (optional) hadoop transfer method.
hadoop_transfer_batch_size  10  (optional) hadoop batch transfer size.
hadoop_number_mappers  1  (optional) number of hadoop mappers.
source_aster_system_name  10.25.32.100  (required) source aster system name.
source_aster_port  2406  (optional) port number for source aster system.
source_aster_user_name  beehive  (required) source aster system user name.
source_aster_user_password  beehive  (optional) source aster system user password.
source_aster_user_password_encrypted    (optional) source system encrypted user password.
target_aster_system_name  10.25.32.100  (required) target aster system name.
target_aster_port  2406  (optional) port number for target aster system.
target_aster_user_name  beehive  (required) target aster system user name.
target_aster_user_password  beehive  (optional) target aster system user password.
target_aster_user_password_encrypted    (optional) target system encrypted user password.
aster_query_timeout  60  (optional) aster query timeout duration.
aster_preserve_column_case  no  (optional) aster need to preserve column case.
aster_skip_error_records  no  (optional) aster needs to skip error records.
netrace  21  (optional) cli netrace parameter value.
netrace_buf_len  0  (optional) cli netrace_buf_len parameter value.
tpt_debug  1  (optional) tptapi trace debug parameter value.
additional_arc_parameters  VBMASK=15  (optional) Specifies the additional ARC parameters that will be appended when executing each ARC task.
source_staging_database  db1  (optional) source staging database name
target_staging_database  db1  (optional) target staging database name
target_database  db1  (optional) target database name
db_client_encryption  true  (optional) set if job need to encrypted during data transfer.
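The EXAMPLE line in this output supplies most settings through an XML parameter file (-f parameters.xml). Purely as an illustration, and assuming the -name value syntax shown in that EXAMPLE also applies to the other parameters listed above, a direct invocation could look like the sketch below. The system names (Checks, Leo), users, and passwords are the placeholder values from the Example column, not working credentials.

# Hypothetical sketch only: parameter names and values are copied from the Example
# column of the help output above; in practice these settings are usually collected
# in parameters.xml and passed with -f.
datamove create -job_name job1 \
  -source_tdpid Checks -source_user TD_API_user -source_password 123456789 \
  -target_tdpid Leo -target_user TD_API_user -target_password 123456789 \
  -table DB1.TABLE -overwrite_existing_objects true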