Example: Single model deployment and loading of td_lightgbm Booster trained in Vantage - Teradata Package for Python

Teradata® Package for Python User Guide

Deployment: VantageCloud, VantageCore
Edition: VMware, Enterprise, IntelliFlex
Product: Teradata Package for Python
Release Number: 20.00
Published: March 2025
Last Edition: 2025-11-06
Product Category: Teradata Vantage
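The steps below assume an established Vantage connection and an existing td_lightgbm Dataset named obj_s. A minimal, hypothetical setup sketch follows; the host credentials, table names, and column layout are placeholders and not part of this example, so adapt them to your environment:

```python
from teradataml import create_context, DataFrame, td_lightgbm

# Hypothetical connection details; replace with your Vantage system's.
create_context(host="<host>", username="<username>", password="<password>")

# Hypothetical tables holding the features (col1..col4) and the label.
df_x_classif = DataFrame("classif_features")
df_y_classif = DataFrame("classif_labels")

# Dataset wrapper used as train_set/valid_sets in the steps below.
obj_s = td_lightgbm.Dataset(df_x_classif, df_y_classif, free_raw_data=False)
```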
  1. Train the model without the callbacks argument.
    >>> opt_s = td_lightgbm.train(params={}, train_set=obj_s, num_boost_round=30, valid_sets=[obj_s])
    >>> type(opt_s)
    <class 'teradataml.opensource._lightgbm._LightgbmBoosterWrapper'>
  2. Deploy the model.
    >>> opt_s.deploy(model_name="lightgbm_deploy_train_single_model")
    Model is saved.
    <lightgbm.basic.Booster object at 0x000...00>
    >>> opt_s.record_evaluation_result # Empty, as no record_evaluation callback was used.
  3. Load the deployed model.
    >>> opt_load = td_lightgbm.load(model_name="lightgbm_deploy_train_single_model")
    >>> opt_load
    <lightgbm.basic.Booster object at 0x000...E0>
  4. Predict using the loaded model.
    >>> opt_load.predict(data=df_x_classif, label=df_y_classif)
                  col1                col2                col3                col4  label  booster_predict_1
      1.0823357639576    0.84635733604494  -0.012062715650015   0.812633063515458      1  0.9346760906166353
     -0.7745167656447    1.03844942569731  -0.258906316647375   0.092392283225207      1  0.9704807283586632
     -0.9709790567905    0.29023691666418  -0.159962241726072  -0.298832196718898      1  0.9747562930550571
     -1.1673562332519    0.10485969688830  -0.152596373567538  -0.459316049285644      1  0.9527103200682584
     -1.4168228235536   -1.10436212447853   0.015212098341070  -1.062313493751677      0  0.1671869702680136
      1.0246109854249   -1.42517183237691   0.350872770060749  -0.143296130974637      0  0.0307264091286578
      0.7595463230046    0.04537144459357   0.080802080624812   0.345420111262436      0  0.0507321960512050
      0.6469851931442   -0.58122848711195   0.169697726509673   0.040145692994821      0  0.0341349264570978
     -1.4719082500811   -0.02919489258849  -0.166141412269092  -0.645309128519277      1  0.8729191084916251
     -1.1750967374648   -0.95074511349926   0.018279571156030  -0.895335003873629      1  0.5546785905584524
  5. Create an empty dictionary for the record_evaluation callback to fill.
    >>> rec = {}
    
  6. Train the model with the valid_sets and callbacks arguments.
    >>> opt1 = td_lightgbm.train(params={}, train_set=obj_s, num_boost_round=30,
                                 callbacks=[td_lightgbm.record_evaluation(rec), td_lightgbm.early_stopping(3)],
                                 valid_sets=[obj_s])
    >>> opt1.record_evaluation_result
    {'valid_0': OrderedDict([('l2',
                   [0.21581071275252509,
                    0.18813848372931546,
                    ...
                    ...
                    0.04169529314351532])])}
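For reference, the dict passed to record_evaluation is filled in place, mapping each validation set name to a metric-to-values mapping with one entry per completed boosting round. A stdlib-only sketch of that shape (the numbers are placeholders, not the l2 values above):

```python
from collections import OrderedDict

# Shape of a record_evaluation result:
# {valid-set name: {metric name: [one value per boosting round]}}.
rec = {"valid_0": OrderedDict(l2=[])}

# Stand-in for num_boost_round=30 rounds; real values come from LightGBM,
# and early stopping may cut the list short of 30.
for round_no in range(30):
    rec["valid_0"]["l2"].append(0.2 / (round_no + 1))

print(len(rec["valid_0"]["l2"]))  # 30: one l2 value per simulated round
```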
  7. Deploy the model.
    >>> opt1.deploy(model_name="lightgbm_deploy_train_single_model_with_callback")
    Model is saved.
    <lightgbm.basic.Booster object at 0x00..8B0>
  8. Load the deployed model.
    >>> opt1_load = td_lightgbm.load(model_name="lightgbm_deploy_train_single_model_with_callback")
    >>> opt1_load.record_evaluation_result # Empty; deploy() saves only the underlying model object, not wrapper attributes.
  9. Predict using the loaded model.
    >>> opt1_load.predict(df_x_classif)
                  col1             col2               col3               col4  booster_predict_1
    -0.697767009551012  2.3918078347398  -0.47022244995057  0.680153033261556  0.9798849376620596
    -0.733263307298235  1.9773705102473  -0.40690381676384  0.495003202510811  0.9699154086912621
     0.9577016978304661 -1.410567531062   0.340727932762679 -0.16610004792191  0.0295051822937243
    -1.083977873075923  1.8792459682678  -0.43165520428563  0.303874588671142  0.9832371207930688
    -0.655697611869257 -0.708487381682   0.039161396505109 -0.57254406467090   0.3199544221250283
    -0.652532094184103  1.3294307289508  -0.29209389759953  0.264152749130696  0.9732308238382781
    -1.537164191873611 -0.750650223194  -0.05631805152509  -0.96910962312450   0.6395031001506465
     0.9082253663109264 -1.169232609988   0.295712076085351 -0.08846679694776  0.0159457535568849
    -0.741761812488457 -1.319910570598   0.128663746621739 -0.86019632384497   0.1768105684264068
     1.4588391821477944  0.6278710266435  0.067203725213036  0.885080711754702  0.6711517783816373
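Because deploy() persists only the Booster, record_evaluation_result is lost when the model is loaded back. If the evaluation history is needed later, one option is to save it separately, for example as JSON under a name tied to the deployed model. A stdlib-only sketch with placeholder values standing in for opt1.record_evaluation_result:

```python
import json
import os
import tempfile

# Placeholder standing in for opt1.record_evaluation_result.
rec = {"valid_0": {"l2": [0.21581071275252509, 0.04169529314351532]}}

# Save the history under a name tied to the deployed model.
path = os.path.join(tempfile.gettempdir(),
                    "lightgbm_deploy_train_single_model_with_callback.eval.json")
with open(path, "w") as f:
    json.dump(rec, f)

# Later, after td_lightgbm.load(), read the history back.
with open(path) as f:
    restored = json.load(f)
print(restored == rec)  # True
```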