Evaluating a Trained Model - Teradata Vantage

ClearScape Analytics ModelOps User Guide

Model evaluation is the next step in the model lifecycle, in which you evaluate a trained model against the test dataset. The test dataset shows how well the model performs on data it has not seen during training.

Model evaluation generates an evaluation report that displays a number of metrics for monitoring model performance. The details of the evaluation report are discussed in the following section.
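
The metrics shown in the report depend on the model type. As a rough illustration (not ModelOps code), the following Python sketch computes the kind of metrics an evaluation report typically summarizes, using scikit-learn and a bundled sample dataset; the model and metric choices here are assumptions for illustration only.

    # Illustrative only: compute common classification metrics on a held-out test set.
    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score, confusion_matrix, roc_auc_score
    from sklearn.model_selection import train_test_split

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=42
    )

    model = LogisticRegression(max_iter=5000).fit(X_train, y_train)

    # Score only data the model did not see during training.
    y_pred = model.predict(X_test)
    y_prob = model.predict_proba(X_test)[:, 1]

    metrics = {
        "accuracy": accuracy_score(y_test, y_pred),
        "roc_auc": roc_auc_score(y_test, y_prob),
    }
    print(metrics)
    print(confusion_matrix(y_test, y_pred))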

If the "Enable Model Evaluation" option is not selected for BYOM model, the Model Evaluation step will not be available on the Model Lifecycle page. In that case, you can directly go to the Approval step after Training the model version.
  1. On the Model Version Lifecycle page, select Evaluate.
  2. In the Basic tab, set the following properties:
    Property Description
    Model: The name of the model, in read-only format.
    Dataset Template: Specifies the required dataset template. For details, see Dataset Templates.
    Dataset: Specifies the dataset to be used for the evaluation job. For details, see Datasets.

  3. In the Advanced tab, set the following properties:
    Property Description
    Engine: Specifies the engine used to evaluate the model.
    Docker Image: Specifies the Docker image used to run the evaluation script. (For a hypothetical sketch of such a script, see the example at the end of this section.)
    Resource Template: Allows you to use a predefined set of resources, including CPU and memory, which are properties of the container created to run the evaluation script.

    Select S Standard, M Medium, L Large, or Custom from the dropdown list.

    With the Custom selection, there are three additional properties (for an illustrative example of the values, see the end of this section):
    • Memory: Free text to specify the container memory resource.
    • CPU: Free text to specify the container CPU resource.
    • GPU: Specifies the GPU resource for the container.
  4. Select Evaluate Model.
    The model version evaluation progress displays.
  5. Click the close icon to close the sheet when the evaluation progress completes.
    The Model Version Lifecycle page displays. The Evaluate step in the header is marked as completed and the model version status is changed to Evaluated.


  6. Select the expand icon to expand the Evaluation Details section and see the details of the evaluation job.

    The following details display:

    Property Description
    Job ID: Specifies the evaluation job ID. You can select View Job Details to see the event details of the job. For details, see Jobs.
    Date: Specifies the evaluation date.
    User: Specifies the user who ran the evaluation job.
    Status: Shows the status of the evaluation job, for example, Completed.
    Dataset ID: Displays the ID of the test dataset used in the evaluation job. You can select View Dataset Statistics and View Dataset to see the dataset details. For details, see Datasets.
    Dataset Name: Displays the name of the test dataset used in the evaluation job.
    Resources: Specifies the resources used by the evaluation job, including CPU and memory.
    Job Progress: Lists all phases of the evaluation job. The job progress information includes:
    • Status: Status of each phase (Created, Scheduled, Running, Evaluated, Completed)
    • Start Date: Start date and time of each phase
    • End Date: End date and time of each phase
    • Duration: Duration of each phase
    Evaluation Artifacts: Lets you view and download evaluation artifacts. For details, see Viewing and Downloading Model Artifacts.
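
The Docker image selected in the Advanced tab runs an evaluation script against the test dataset. The following is a minimal, hypothetical sketch of what such a script might do; the function name, arguments, file locations, and metric names are assumptions for illustration and are not the ModelOps scripting contract.

    # Hypothetical evaluation script shape: load a trained model, score the
    # test dataset, and write the metrics as an evaluation artifact.
    import json

    import joblib
    import pandas as pd
    from sklearn.metrics import accuracy_score, roc_auc_score


    def evaluate(model_path: str, test_csv: str, target: str, output_path: str) -> dict:
        model = joblib.load(model_path)          # trained artifact from the training step
        test = pd.read_csv(test_csv)             # test dataset selected in the Basic tab
        X_test = test.drop(columns=[target])
        y_test = test[target]

        metrics = {
            "accuracy": accuracy_score(y_test, model.predict(X_test)),
            "roc_auc": roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]),
        }
        with open(output_path, "w") as f:        # saved with the evaluation artifacts
            json.dump(metrics, f, indent=2)
        return metrics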
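
The Custom resource template accepts free-text values for memory, CPU, and GPU. The field names and quantity strings below are assumptions (Kubernetes-style values) shown only to illustrate the kind of input expected; the actual values are entered in the Advanced tab, not in code.

    # Hypothetical Custom resource template values (illustrative only).
    custom_resources = {
        "memory": "4Gi",   # container memory
        "cpu": "2000m",    # container CPU (2 cores)
        "gpu": "1",        # container GPU count
    }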