This use case shows how to use teradataml functions to prepare data for analysis, and then run analytics with In-DB Python interpreters and add-on packages through the SCRIPT Table Operator (STO).
The datasets used in this use case are CSV files that we will upload to Vantage using Teradata Studio.
In the attached zip file, the Jupyter notebook includes the following sections:
- Section 0: Loading teradataml modules and connecting to Vantage from the client
- Section 1: Data manipulation and transformation: uploading the data files using Teradata Studio, then preparing the data for analysis with teradataml functions
- Section 2: Scoring using the SCRIPT Table Operator
Before running the example Jupyter notebook AnalyzingSTO.ipynb for this use case, extract the attached zip file DataScience-Python_UseCases.zip and make the following changes:
- In the Jupyter notebook AnalyzingSTO.ipynb:
- Replace <host>, <username>, <password>, and <database> with the actual host, username, password, and database for your system.
- Replace <files_local_path> with the actual local path where you saved the Python script file.
- In the Python script file ex1pSco.py, replace <database> with the actual database name for your system.