Hyperparameter Tuning in teradataml

Teradata® Package for Python User Guide


A hyperparameter is a parameter whose value controls the learning process of a machine learning algorithm. The procedure of identifying the right set of hyperparameters for a learning algorithm is known as hyperparameter tuning. The set of candidate hyperparameter combinations is called the hyperparameter space, and a search algorithm determines how that space is explored to identify optimal hyperparameters.

Hyperparameter tuning consists of the following key features:
  • Hyperparameter space
  • Search algorithm
  • Data sampling
  • Model training
  • Model evaluation
  • Optimal hyperparameter identification

Hyperparameter tuning in teradataml provides the following generic search algorithms (a minimal sketch follows these descriptions):

GridSearch
An approach to hyperparameter tuning that trains a learning model for every hyperparameter combination in the hyperparameter space.
RandomSearch
An approach to hyperparameter tuning that trains a learning model for a randomly selected subset of hyperparameter combinations from the hyperparameter space.
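
The following minimal sketch shows how the two search algorithms are typically constructed from Python. The SVM model trainer function, the column names, and the n_itr keyword are illustrative assumptions; consult the GridSearch and RandomSearch reference pages for the exact signatures in your teradataml release.

# Illustrative sketch only; the SVM trainer, column names, and the n_itr keyword
# are assumptions, not a verbatim teradataml example.
from teradataml import GridSearch, RandomSearch, SVM

# Tuple-valued entries span the hyperparameter space; scalar entries stay fixed.
params = {
    "input_columns": ["age", "fare", "pclass"],   # assumed feature columns
    "response_column": "survived",                # assumed target column
    "model_type": "Classification",
    "batch_size": (32, 64, 128),                  # three candidate values
    "learning_rate": ("CONSTANT", "INVTIME"),     # two candidate values
}

# GridSearch trains one model per combination in the space (3 x 2 = 6 models here).
gs = GridSearch(func=SVM, params=params)

# RandomSearch trains models only for a randomly drawn subset of combinations;
# the keyword shown here as n_itr (number of sampled combinations) is an assumption.
rs = RandomSearch(func=SVM, params=params, n_itr=4)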

In addition to identifying optimal hyperparameters for learning models (model trainer functions), teradataml extends the hyperparameter search functionality to non-model trainer functions, such as data cleansing and feature engineering functions.

teradataml also offers additional hyperparameter tuning features, such as parallel execution, early stop, live log, and hyperparameterization of input data; a hedged sketch of these options follows.
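
The sketch below continues the example above and shows how such options might be passed when training. Every keyword name here, including those for the evaluation metric, early stop, parallel execution, and live log, is an assumption; the fit() reference page documents the actual names.

# Continuation of the sketch above; every keyword name here is an assumption.
gs.fit(
    data=train_df,                  # assumed teradataml DataFrame of training data
    evaluation_metric="Accuracy",   # metric used to rank the trained models (assumed)
    early_stop=0.95,                # stop the search once a model reaches this score (assumed)
    run_parallel=True,              # train candidate models concurrently (assumed)
    verbose=1,                      # emit a live progress log (assumed)
)

# After fitting, the tuner exposes the best hyperparameters and model;
# attribute names such as best_params_ may vary by release.
print(gs.best_params_)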

Model trainer functions are evaluatable functions that can use the complete set of hyperparameter tuning features. In contrast, non-model trainer functions are non-evaluatable functions that use a subset of these features, such as hyperparameterization of function arguments, search algorithms, and model training functionality; see the sketch after this paragraph.
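
As a sketch of the non-model trainer case, the same search objects can wrap a data cleansing or feature engineering function. ScaleFit and its argument names below are assumptions used for illustration; because such functions are non-evaluatable, the search executes each parameter combination but does not rank the results by an evaluation metric.

# Illustrative sketch; ScaleFit and its argument names are assumptions.
from teradataml import GridSearch, ScaleFit

scale_params = {
    "target_columns": ["age", "fare"],          # assumed numeric columns
    "scale_method": ("STD", "RANGE", "MEAN"),   # candidate scaling methods to try
}

# Non-model trainer functions are non-evaluatable: each combination is executed,
# but no evaluation metric is used to rank the resulting outputs.
scale_search = GridSearch(func=ScaleFit, params=scale_params)
scale_search.fit(data=train_df)                 # train_df: assumed teradataml DataFrame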