Task nodes let you add actions to the workflow. Create new tasks or add predefined tasks to your workflow.
For each task, complete the corresponding task properties view.
Email Properties | Description |
---|---|
Node Name | Leave the default name or enter a customized name. |
Description | [Optional] Enter a description of the task. |
Email Action | Click to select an email action. |
Subject | [Optional] Enter the subject of the email. Subject and Body are filled in with Email Action details, if available. You can edit both. |
Body | [Optional] Enter the body text of the email. |
Table Validation Properties | Description |
---|---|
Node Name | Leave the default name or enter a customized name. |
Description | [Optional] Enter a description of the task. |
Database | Click to select a table validation, then choose an existing table validation or add a new one. See Table Validation Properties. |
Table | Enter the table name, or keep the table selected with the database. |
Name | Enter the name of the table validation. Use the Ecosystem Configuration portlet to set it up. A sketch of a typical validation check follows this table. |
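A table validation is generally a check run against the selected table. The sketch below is only a rough, hypothetical illustration of such a check, not how the portlet executes validations: it uses bteq with placeholder values (sampleTDPID, sampleUser, sampleDB.sampleTable) to confirm the table can be queried and returns rows.

```sh
#!/bin/sh
# Hypothetical row-count check; the TDPID, credentials, and table are placeholders.
bteq <<EOF
.LOGON sampleTDPID/sampleUser,samplePassword;

-- Fail the check if the table cannot be queried or is empty.
SELECT COUNT(*) FROM sampleDB.sampleTable HAVING COUNT(*) > 0;
.IF ERRORCODE <> 0 THEN .QUIT 1;
.IF ACTIVITYCOUNT = 0 THEN .QUIT 1;

.LOGOFF;
.QUIT 0;
EOF
```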
Script Execution Properties | Description |
---|---|
Node Name | Leave the default name or enter a customized name. |
Description | [Optional] Enter a description of the task. |
Select New or Existing | Select New to add a new script or Existing to see available script actions that were added with the Ecosystem Configuration portlet. |
Script Action | Enter or select the name of the script. |
Host Name | Enter the host name where the script is located. |
User | Enter the username. Make sure the credentials are associated with the script. |
Password | Enter the password. |
Script | Enter the script name (full path). For example, /opt/sampleScript/sampleExerun_job_xyz.sh |
Script Params | [Optional] Enter any script parameters that are required. These are input parameters that can be passed to the script if needed. |
Log File | [Optional] Enter the full path and filename of a log file to create it. After the script execution, the log file appears with a timestamp. You can download this file. A sketch of a script that fits these fields follows this table. |
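A script execution task runs the script you provide on the host. The example below is a minimal sketch, assuming the Script Params values are passed to the script as positional arguments; the paths, parameter names, and copy logic are placeholders, not part of the product.

```sh
#!/bin/sh
# Hypothetical script referenced in the Script property (placeholder logic).
# Assumes values from Script Params arrive as positional arguments.
SOURCE_DIR="$1"
TARGET_DIR="$2"

echo "$(date '+%Y-%m-%d %H:%M:%S') copying $SOURCE_DIR to $TARGET_DIR"
cp -r "$SOURCE_DIR"/. "$TARGET_DIR"/ || exit 1   # assumed: a non-zero exit reports failure
echo "$(date '+%Y-%m-%d %H:%M:%S') copy complete"
exit 0
```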
Data Mover Properties | Description |
---|---|
Node Name | Leave the default name or enter a customized name. |
Description | [Optional] Enter a description of the task. |
DM Daemon | Select a Data Mover Daemon. This shows a list of DMC servers with DM-enabled job control that were added with the Ecosystem Configuration portlet. This server must have the Data Mover REST component correctly configured and running. |
DM Job ID | Select a Data Mover Job ID. This shows a list of DM jobs for the Daemon selected. |
Log File | [Optional] Click the box to create a log file with the results. |
Unity Properties | Description |
---|---|
Node Name | Leave the default name or enter a customized name. |
Description | [Optional] Enter a description of the task. |
Operation | Select an operation. Control a Unity server with the following actions: Activate, Deactivate, Demote, Freeze, Halt, Promote, and Recover. These operations let you resolve high availability issues. |
Database | Click to select a database. Only databases set up with Unity-managed tables appear. |
The following information is filled in after you select a database: Table, TDPIDList, Active Sequencer, User, Standby Sequencer, and User.
Hadoop Properties | Description |
---|---|
Node Name | Leave the default name or enter a customized name. |
Description | [Optional] Enter a description of the task. |
HDP Server | Select the Hadoop server. |
Job Type | Automatically filled in with job type Pig. |
User name | Enter the user who has permission to run the task. |
HDFS File | Enter the full path in the Hadoop Distributed File System, including the file name. This can be a Pig Latin script; the sketch after this table shows one way to place the script in HDFS. |
HDFS Status Directory | [Optional] Enter the Hadoop Status Directory. |
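The file must already exist in HDFS at the path you enter. The commands below are a hedged sketch, with a placeholder user and paths, of one way to copy a Pig Latin script into HDFS with the standard hdfs command-line client and verify it before pointing the task at it.

```sh
# Placeholder paths and user; adjust to your environment.
hdfs dfs -mkdir -p /user/sampleUser/scripts
hdfs dfs -put -f sample_job.pig /user/sampleUser/scripts/sample_job.pig
hdfs dfs -ls /user/sampleUser/scripts/sample_job.pig
```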