Example: Deploy and load the sklearn model trained outside Vantage - Teradata Package for Python

Teradata® Package for Python User Guide

Deployment: VantageCloud, VantageCore (VMware, Enterprise, IntelliFlex)
Product: Teradata Package for Python
Release Number: 20.00
Published: March 2025
Last Edited: 2025-11-06
Product Category: Teradata Vantage
  1. Import the lightgbm module and create an LGBMClassifier model.
    >>> from lightgbm import LGBMClassifier
    >>> local_obj = LGBMClassifier(num_leaves=5, objective="binary", n_estimators=10, learning_rate=0.01)
    
    >>> local_obj.fit(pdf_x, pdf_y)
    LGBMClassifier(learning_rate=0.01, n_estimators=10, num_leaves=5, objective='binary')
    
    >>> type(local_obj)
    lightgbm.sklearn.LGBMClassifier
  2. Deploy the trained sklearn LGBMClassifier model in Vantage.
    >>> skl_deploy = td_lightgbm.deploy(model_name="skl_model_trained_outside_vantage", model=local_obj)
    Model is saved.
    >>> type(skl_deploy)
    teradataml.opensource._lightgbm._LighgbmSklearnWrapper
  3. Predict on data residing in Vantage using the deployed model.
    >>> skl_deploy.predict(data=df_x_classif, label=df_y_classif)
                 col1            col2                col3            col4  lgbmclassifier_predict_1
     1.08233576395768  0.846357336044  -0.012062715650015  0.812633063515                         1
     -0.7745167650864  1.038449425697  -0.258906316647375  0.092392283225                         1
     -0.9709790567988  0.290236916664  -0.159962241726072  -0.29883219671                         1
     -1.1673562332519  0.104859696888  -0.152596373567532  -0.45931604928                         1
     -1.4168228235533  -1.10436212447  0.0152120983410702  -1.06231349375                         0
     1.02461098542497  -1.42517183237  0.3508727700607491  -0.14329613097                         0
     0.75954632300467  0.045371444593  0.0808020806248121  0.345420111262                         1
     0.64698519314421  -0.58122848711  0.1696977265096732  0.040145692994                         0
     -1.4719082500819  -0.02919489258  -0.166141412269092  -0.64530912851                         1
     -1.1750967374649  -0.95074511349  0.0182795711560301  -0.89533500387                         1
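    The predict call returns a teradataml DataFrame. A minimal client-side sketch of scoring such predictions against the labels after pulling them to pandas (the to_pandas() step and the values below are assumptions; the frame only mimics the shape of the output shown above):

    ```python
    import pandas as pd

    # Stand-in for result.to_pandas(): the true label column plus the
    # prediction column named as in the output above.
    scored = pd.DataFrame({
        "label":                    [1, 1, 1, 1, 0, 0, 1, 0, 1, 1],
        "lgbmclassifier_predict_1": [1, 1, 1, 1, 0, 0, 1, 0, 1, 1],
    })
    accuracy = (scored["label"] == scored["lgbmclassifier_predict_1"]).mean()
    print(f"accuracy = {accuracy:.2f}")
    ```

    The same pattern works for any classification output column produced by the wrapper, since the prediction column is an ordinary numeric column once the result is brought to the client.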