Hyperparameter Tuning in teradataml

A hyperparameter is a parameter whose value controls the learning process of a machine learning algorithm. Identifying the right set of hyperparameters for a learning algorithm is known as hyperparameter tuning. The set of all candidate hyperparameter combinations is called the hyperparameter space, and search algorithms are the strategies used to explore that space efficiently and identify optimal hyperparameters.

Hyperparameter tuning consists of the following key features, illustrated by the sketch after this list:
  • Hyperparameter space
  • Search algorithm
  • Data sampling
  • Model training
  • Model evaluation
  • Optimal hyperparameter identification
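
As an illustration, a hyperparameter space in teradataml is declared as a dictionary of function arguments in which a tuple of values marks an argument as a hyperparameter with multiple candidate values. The minimal sketch below shows only this first step; the argument and column names are illustrative (patterned on the teradataml XGBoost model trainer function and the iris dataset), not a prescribed signature.

    # Minimal sketch of a hyperparameter space (argument names are
    # illustrative, patterned on the teradataml XGBoost function).
    params = {
        "input_columns": ["sepal_length", "sepal_width",
                          "petal_length", "petal_width"],
        "response_column": "species",
        "model_type": "Classification",
        "max_depth": (5, 10, 15),        # tuple => 3 candidate values
        "shrinkage_factor": (0.1, 0.3),  # tuple => 2 candidate values
        "iter_num": (10, 50),            # tuple => 2 candidate values
    }
    # A grid search over this space trains 3 x 2 x 2 = 12 model combinations.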

Hyperparameter tuning in teradataml provides the following generic search algorithms:

GridSearch
An approach to hyperparameter tuning that trains a learning model for every hyperparameter combination in the hyperparameter space.
RandomSearch
An approach to hyperparameter tuning that trains a learning model for a randomly selected subset of hyperparameter combinations from the hyperparameter space.
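
Continuing the sketch above, both search algorithms are constructed from the function to tune and the hyperparameter space, then executed with fit(). The outline below is hedged: the training table name is assumed, XGBoost is used only as an example model trainer function, and the argument names (func, params, n_iter, data) should be confirmed against the GridSearch and RandomSearch reference pages for your teradataml release.

    from teradataml import DataFrame, GridSearch, RandomSearch, XGBoost

    # Assumes an existing training table in Vantage (name is illustrative).
    train_df = DataFrame("iris_train")

    # GridSearch trains one model per combination in the hyperparameter space.
    gs = GridSearch(func=XGBoost, params=params)

    # RandomSearch trains models for a fixed number of randomly drawn
    # combinations; n_iter is assumed to be the sampling-budget argument.
    rs = RandomSearch(func=XGBoost, params=params, n_iter=5)

    # Run the search; depending on the function being tuned, additional
    # evaluation arguments (for example, an id column) may be required.
    gs.fit(data=train_df)

Once fit() completes, the identified optimal hyperparameters and the trained models can be retrieved from the tuner object; the exact attribute and method names vary by release, so consult the class reference.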

In addition to identifying optimal hyperparameters for learning models (model trainer functions), teradataml extends the hyperparameter search functionality to nonmodel trainer functions such as data cleansing functions and feature engineering functions.

teradataml also offers additional features for hyperparameter tuning, such as parallel execution, early stop, live log, and hyperparameterization of input data.
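
These additional features are typically enabled at fit() time. The sketch below is an assumption-laden outline of how that might look: the argument names (evaluation_metric, early_stop, run_parallel, frac, verbose) are illustrative stand-ins for the parallel execution, early stop, data sampling, and live log options and must be verified against the fit() documentation for your teradataml release.

    # Hedged sketch only: argument names below are assumptions, not a
    # confirmed fit() signature.
    gs.fit(
        data=train_df,
        evaluation_metric="Accuracy",  # metric used to rank trained models
        early_stop=0.95,               # stop once a model reaches this score
        run_parallel=True,             # train candidate models concurrently
        frac=0.8,                      # fraction of data sampled for training
        verbose=1,                     # emit a live progress log while tuning
    )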

Model trainer functions are evaluatable functions and can use the complete set of hyperparameter tuning features. Nonmodel trainer functions, in contrast, are nonevaluatable and use a subset of these features, such as hyperparameterization of function arguments, search algorithms, and model training functionality.