Variable Substitutions for Examples - Advanced SQL Engine - Teradata Database

Teradata Vantage™ - Native Object Store Getting Started Guide

Product: Advanced SQL Engine, Teradata Database
Release Number: 17.10
Published: July 2021
Language: English (United States)
Last Update: 2022-06-22

The variables in the examples must be replaced with values that allow access to your external object store.

authorization_object

Replace authorization_object with the name of your authorization object. You may use any name that conforms to the object naming rules; see Teradata Vantage™ - SQL Fundamentals, B035-1141.
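
For orientation, the following is a minimal sketch of how authorization_object is substituted in the CREATE AUTHORIZATION statements used in this guide; the name DefAuth_S3 is only an illustration, and the credential placeholders are explained later in this section.

  CREATE AUTHORIZATION DefAuth_S3        -- substitute your authorization object name
  AS DEFINER TRUSTED
  USER 'YOUR-ACCESS-KEY-ID'              -- access ID for your external object store
  PASSWORD 'YOUR-SECRET-ACCESS-KEY';     -- secret key for your external object store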

table_name

Replace table_name with a name for your foreign table that conforms to the object naming rules; see Teradata Vantage™ - SQL Fundamentals, B035-1141.
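
As a sketch only (riverflow is an illustrative name), table_name appears in CREATE FOREIGN TABLE statements such as the following; the remaining placeholders are described below.

  CREATE FOREIGN TABLE riverflow                  -- substitute your table name
  , EXTERNAL SECURITY authorization_object        -- omit for public buckets and containers
  USING ( LOCATION('YOUR-OBJECT-STORE-URI') );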

YOUR-OBJECT-STORE-URI

When you run the examples, replace YOUR-OBJECT-STORE-URI with the location of your object store.

If you run the examples that use the Teradata-supplied public buckets or containers, the following table lists the paths to the sample USGS river flow data. Replace YOUR-OBJECT-STORE-URI in the examples with one of these paths to read the data from the public buckets and containers (an example follows the table):

Platform Public Storage Location
Amazon S3
  • CSV: /s3/td-usgs-public.s3.amazonaws.com/CSVDATA/
  • JSON: /s3/td-usgs-public.s3.amazonaws.com/JSONDATA/
  • Parquet: /s3/td-usgs-public.s3.amazonaws.com/PARQUETDATA/
Azure Blob storage
  • CSV: /az/akiaxox5jikeotfww4ul.blob.core.windows.net/td-usgs/CSVDATA/
  • JSON: /az/akiaxox5jikeotfww4ul.blob.core.windows.net/td-usgs/JSONDATA/
  • Parquet: /az/akiaxox5jikeotfww4ul.blob.core.windows.net/td-usgs/PARQUETDATA/
Google Cloud Storage
  • CSV: /gs/storage.googleapis.com/td-usgs/CSVDATA/
  • JSON: /gs/storage.googleapis.com/td-usgs/JSONDATA/
  • Parquet: /gs/storage.googleapis.com/td-usgs/PARQUETDATA/
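
For example, a foreign table over the sample JSON data in the Teradata-supplied public S3 bucket can be created and queried as follows; riverflow_public is an illustrative name, and no authorization object is needed because the bucket is public.

  CREATE FOREIGN TABLE riverflow_public
  USING ( LOCATION('/s3/td-usgs-public.s3.amazonaws.com/JSONDATA/') );

  SELECT TOP 2 * FROM riverflow_public;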

If you downloaded the USGS river flow data to your own external object store (see Setting Up an Object Store for River Flow Data), replace YOUR-OBJECT-STORE-URI with the path to your data, which is unique to you but similar to the following:

Object Store External Storage Location Values
Amazon S3
The examples in this document use "td-usgs" for the bucket name. Your bucket name must be unique.
  • CSV: /s3/YOUR-BUCKET.s3.amazonaws.com/td-usgs/CSVDATA/
    For example: /s3/td-usgs.s3.amazonaws.com/td-usgs/CSVDATA/
  • JSON: /s3/YOUR-BUCKET.s3.amazonaws.com/td-usgs/JSONDATA/
    For example: /s3/td-usgs.s3.amazonaws.com/td-usgs/JSONDATA/
  • Parquet: /s3/YOUR-BUCKET.s3.amazonaws.com/td-usgs/PARQUETDATA/
    For example: /s3/td-usgs.s3.amazonaws.com/td-usgs/PARQUETDATA/
Azure Blob storage
Replace YOUR-STORAGE-ACCOUNT with the value for your account; td-usgs is the container name used by the examples.
  • CSV: /az/YOUR-STORAGE-ACCOUNT.blob.core.windows.net/td-usgs/CSVDATA/
    For example: /az/akiaxox5jikeotfww4ul.blob.core.windows.net/td-usgs/CSVDATA/
  • JSON: /az/YOUR-STORAGE-ACCOUNT.blob.core.windows.net/td-usgs/JSONDATA/
    For example: /az/akiaxox5jikeotfww4ul.blob.core.windows.net/td-usgs/JSONDATA/
  • Parquet: /az/YOUR-STORAGE-ACCOUNT.blob.core.windows.net/td-usgs/PARQUETDATA/
    For example: /az/akiaxox5jikeotfww4ul.blob.core.windows.net/td-usgs/PARQUETDATA/
Google Cloud Storage
  • CSV: /gs/storage.googleapis.com/YOUR-BUCKET/CSVDATA/
    For example: /gs/storage.googleapis.com/td-usgs/CSVDATA/
  • JSON: /gs/storage.googleapis.com/YOUR-BUCKET/JSONDATA/
    For example: /gs/storage.googleapis.com/td-usgs/JSONDATA/
  • Parquet: /gs/storage.googleapis.com/YOUR-BUCKET/PARQUETDATA/
    For example: /gs/storage.googleapis.com/td-usgs/PARQUETDATA/

Teradata requires the storage location to start with the following:
  • Amazon S3 bucket location must begin with /S3 or /s3
  • Google Cloud Storage bucket location must begin with /GS or /gs
  • Azure Blob storage location (including Azure Data Lake Storage Gen2 in Blob Interop Mode) must begin with /AZ or /az
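
To confirm that a path is reachable before creating a foreign table over it, a quick check such as the following can help. This is a sketch only, assuming the READ_NOS table operator used elsewhere in this guide; YOUR-BUCKET and the credential placeholders must be replaced as described in this section.

  SELECT TOP 5 * FROM READ_NOS (
    USING
      LOCATION('/s3/YOUR-BUCKET.s3.amazonaws.com/td-usgs/')
      ACCESS_ID('YOUR-ACCESS-KEY-ID')
      ACCESS_KEY('YOUR-SECRET-ACCESS-KEY')
      RETURNTYPE('NOSREAD_KEYS')    -- list the objects at the location instead of reading rows
  ) AS d;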

YOUR-ACCESS-KEY-ID and YOUR-SECRET-ACCESS-KEY

Your external object store must be configured to allow Advanced SQL Engine to access it.

The USER and PASSWORD keywords (used in the CREATE AUTHORIZATION command) and ACCESS_ID and ACCESS_KEY (used by READ_NOS and WRITE_NOS) correspond to the values shown in the following table:

Keyword Access Variable Value
  • USER or ACCESS_ID: Replace YOUR-ACCESS-KEY-ID with the access ID for your external object store.
  • PASSWORD or ACCESS_KEY: Replace YOUR-SECRET-ACCESS-KEY with the secret key for your external object store.

Public buckets and containers in external object stores do not require credentials. To access a public bucket or container, specify an empty string (nothing between the straight quotes) for both USER and PASSWORD.
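
For instance, an authorization object for the Teradata-supplied public buckets might look like the following; public_access is an illustrative name only.

  CREATE AUTHORIZATION public_access
  AS DEFINER TRUSTED
  USER ''          -- empty string: no access ID is needed for a public bucket or container
  PASSWORD '';     -- empty string: no secret key is needed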