Model Evaluation Report - Teradata Vantage

ClearScape Analytics™ ModelOps User Guide

Deployment: VantageCloud, VantageCore
Edition: Enterprise, IntelliFlex, VMware
Product: Teradata Vantage
Release Number: 7.1
Published: December 2024
Product Category: ClearScape
ModelOps lets you evaluate a model and mark a champion model based on its performance. You can view the evaluation report, which summarizes model performance as a set of metrics, and compare models based on those metric values.

Expand the Evaluation details section and select View report.

The model evaluation report displays the following areas:
  • Model Version Details
  • Key Metrics
  • Metrics
  • Performance Charts
  • Actions

Model Version Details

Lists all details of the model version and its training and evaluation jobs.

The following details display for the model version and its training and evaluation jobs:
  • Model version ID: Specifies the model version ID. You can select the Model version ID link to go to the Model Version lifecycle page.
  • Evaluation job ID: Specifies the evaluation job ID. You can select the Job ID link to go to the job's details.
  • Evaluation date: Specifies the evaluation date.
  • Dataset ID: Displays the ID of the training dataset used by the training job. You can select the Dataset ID link to see the dataset details.
  • Dataset name: Displays the name of the training dataset used by the training job.
  • Hyperparameters: Specifies the hyperparameters defined to run the job.

Key Metrics

Displays the metrics that you mark as key metrics in the Metrics area. Because the Metrics area can contain a long list of performance metrics, you can mark the most important ones as Key Metrics for quick access; all marked metrics display in this area.

Metrics

Lists the performance metrics and their values for the current model version. Use the Mark as Key Metric option to mark key metrics; they then display in the Key Metrics area.

Common performance metrics include the following (a minimal computation sketch follows the list):

  • Accuracy: The ratio of the number of correct predictions to the total number of input samples.
  • Recall: The number of correct positive results divided by the number of all relevant samples (all samples that should have been identified as positive).
  • Precision: The number of correct positive results divided by the number of positive results predicted by the classifier.
  • F1-score: The harmonic mean of precision and recall. The F1 score ranges from 0 to 1, where 1 is the best value. It tells you how precise your classifier is (how many instances it classifies correctly) as well as how robust it is (it does not miss a significant number of instances).
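For reference, the following is a minimal sketch of how these metrics can be computed outside ModelOps, using scikit-learn on a set of true and predicted labels. The variable names (y_true, y_pred) and the placeholder label values are illustrative assumptions, not part of the ModelOps evaluation report.

    # Minimal sketch: computing the common classification metrics listed above
    # with scikit-learn. The labels below are illustrative placeholders.
    from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

    y_true = [1, 0, 1, 1, 0, 1, 0, 0]   # ground-truth labels
    y_pred = [1, 0, 1, 0, 0, 1, 1, 0]   # labels predicted by the classifier

    metrics = {
        "Accuracy": accuracy_score(y_true, y_pred),    # correct predictions / all samples
        "Recall": recall_score(y_true, y_pred),        # correct positives / all actual positives
        "Precision": precision_score(y_true, y_pred),  # correct positives / all predicted positives
        "F1-score": f1_score(y_true, y_pred),          # harmonic mean of precision and recall
    }

    for name, value in metrics.items():
        print(f"{name}: {value:.3f}")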

Performance Charts

Displays a number of performance charts based on different metrics to help you monitor model performance visually and decide if you want to mark the model as Champion.
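As an illustration of the kind of chart the report can display, the following sketch plots an ROC curve locally with scikit-learn and matplotlib. The sample scores and the choice of libraries are assumptions for illustration only; the evaluation report renders its charts for you.

    # Illustrative sketch: plotting an ROC curve from predicted probabilities,
    # similar in spirit to the performance charts shown in the evaluation report.
    import matplotlib.pyplot as plt
    from sklearn.metrics import roc_curve, roc_auc_score

    y_true = [1, 0, 1, 1, 0, 1, 0, 0]                    # ground-truth labels (placeholder data)
    y_score = [0.9, 0.2, 0.8, 0.4, 0.3, 0.7, 0.6, 0.1]   # predicted positive-class probabilities

    fpr, tpr, _ = roc_curve(y_true, y_score)
    auc = roc_auc_score(y_true, y_score)

    plt.plot(fpr, tpr, label=f"ROC curve (AUC = {auc:.2f})")
    plt.plot([0, 1], [0, 1], linestyle="--", label="Random classifier")
    plt.xlabel("False positive rate")
    plt.ylabel("True positive rate")
    plt.legend()
    plt.show()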

Actions

Use the model evaluation report to perform any of the following actions on the current model version.

  • Approve: See Approving a Model Version.
  • Reject: See Rejecting a Model Version.
  • Mark/Unmark as Champion: Lets you mark or unmark the model version as Champion based on its performance. For details, see Marking a Model Version as Champion.
  • View model drift: Displays the Model drift page where you can monitor model performance. For details, see Drift Monitoring.