Setting Up an Object Store for River Flow Data - Advanced SQL Engine - Teradata Database

Teradata Vantage™ - Native Object Store Getting Started Guide

Product: Advanced SQL Engine (Teradata Database)
Release Number: 17.05
Published: January 2021
Language: English (United States)
Last Update: 2021-01-22

Many of the examples use a sample river flow data set. USGS Surface-Water Data Sets are provided courtesy of the U.S. Geological Survey.

To run the examples, you can use the river flow data from the Teradata-supplied public buckets, or you can set up a small object store of your own for the data set.

The following instructions explain how to set up the river flow data on your own external object store.

Your external object store must be configured to allow Advanced SQL Engine access.

When you configure external storage, you set the credentials for your external object store. Those credentials are used in SQL commands. The supported credentials for USER and PASSWORD (used in the CREATE AUTHORIZATION command) and for ACCESS_ID and ACCESS_KEY (used by READ_NOS and WRITE_NOS) correspond to the values shown in the following table:

Platform                                     USER/ACCESS_ID                PASSWORD/ACCESS_KEY
Amazon S3                                    Access Key                    Secret Access Key
Amazon S3 IAM                                AWS Identity and Access Management (IAM) credentials (roles/policies)
Azure Access Key                             Storage Account Name          Key
Azure Account Shared Access Signature (SAS)  Storage Account Name          SAS Token
Google Cloud Storage                         Access Key or Secure Email    Secret Access Key
Public access object stores                  <empty string>                <empty string>
On-premises object stores                    Access Key                    Secret Access Key

For public access object stores, enclose each empty string in single straight quotes: USER '' and PASSWORD ''.
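
For example, the USER and PASSWORD values from the table above are typically supplied through an authorization object. The following is a minimal sketch; the authorization names DefAuth and PubAuth are illustrative, and YOUR-ACCESS-KEY-ID and YOUR-SECRET-ACCESS-KEY stand in for your platform's values:

  -- Store object store credentials in an authorization object
  CREATE AUTHORIZATION DefAuth
  AS DEFINER TRUSTED
  USER 'YOUR-ACCESS-KEY-ID'
  PASSWORD 'YOUR-SECRET-ACCESS-KEY';

  -- For a public access object store, supply empty strings
  CREATE AUTHORIZATION PubAuth
  AS DEFINER TRUSTED
  USER ''
  PASSWORD '';

The same values are passed as ACCESS_ID and ACCESS_KEY when calling READ_NOS or WRITE_NOS directly, as in the verification query at the end of this page.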
The following provides further details for setting up credentials on your external object store:

Amazon S3 IAM
IAM is an alternative to using an access key and password to secure S3 buckets. To allow Advanced SQL Engine access to S3 buckets that use IAM, your S3 bucket policy must be configured with the following Actions for the role that allows access to the bucket (see the example policy after this list):
  • S3:GetObject
  • S3:ListBucket
  • S3:GetBucketLocation

Azure Blob storage and Azure Data Lake Storage Gen2
A user with access key information has full control over the entire account. Alternatively, an SAS can be defined on containers, or on objects within containers, which provides a more fine-grained authentication approach. NOS accepts either type of authentication and does not need to know which type of secret is being supplied. Only Account SAS tokens are supported; Service SAS tokens generate errors and are rejected.

Google Cloud Storage
To allow Advanced SQL Engine access, the following permissions are needed:
  • storage.objects.get
  • storage.objects.list
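
As an illustration of the Amazon S3 IAM notes above, a bucket policy granting those Actions might look like the following sketch. The account ID, role name, and bucket name are placeholders rather than values tied to the sample data set; adapt them to your environment.

  {
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Principal": { "AWS": "arn:aws:iam::123456789012:role/YOUR-ROLE" },
        "Action": [
          "S3:GetObject",
          "S3:ListBucket",
          "S3:GetBucketLocation"
        ],
        "Resource": [
          "arn:aws:s3:::YOUR-BUCKET",
          "arn:aws:s3:::YOUR-BUCKET/*"
        ]
      }
    ]
  }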

See your cloud vendor documentation for instructions on creating an external object store account.

    Note: The following steps may require the assistance of your public cloud administrator.

  1. Create an external object store on a Teradata-supported external object storage platform. Give your external object store a unique name. In the Teradata-supplied examples, the bucket/container is called td-usgs. Because the bucket/container name must be unique, choose a name other than td-usgs.
  2. On Amazon, generate an access ID and matching secret key for your bucket or generate an Identity and Access Management (IAM) user credential. On Azure, generate Account SAS tokens (not Service SAS tokens) for your td-usgs container. On Google Cloud Storage, generate an access ID and matching secret key for your bucket.
  3. Download the sample data from https://downloads.teradata.com/ (look for NOS Download Data) to your client/laptop. The ZIP file contains sample river flow data in CSV, JSON, and Parquet data formats.
  4. Copy the sample data to your bucket or container, being careful to preserve the data directory structure. For example, use a location similar to the following:
    • Amazon S3: /S3/YOUR-BUCKET.s3.amazonaws.com/JSONDATA
    • Azure Blob storage and Azure Data Lake Storage Gen2: /az/YOUR-STORAGE-ACCOUNT.blob.core.windows.net/td-usgs/CSVDATA/
    • Google Cloud Storage: /gs/storage.googleapis.com/YOUR-BUCKET/CSVDATA/
    Note: You can use the Amazon S3 or Azure management consoles or a utility such as the AWS CLI to copy the data to your external object store. For Google Cloud Storage, you can use the gsutil tool. (See the example commands following these steps.)
  5. In the example code, replace the bucket or container name (shown as td-usgs, YOUR-BUCKET, or YOUR-STORAGE-ACCOUNT) with the location of your object store.
  6. Replace YOUR-ACCESS-KEY-ID and YOUR-SECRET-ACCESS-KEY with the access values for your external object store.
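
As an example of step 4, the following commands are one way to copy the extracted sample data. This is a sketch that assumes the AWS CLI, Azure CLI, and gsutil are installed and authenticated; the bucket, storage account, and local paths are placeholders:

  # Amazon S3 (AWS CLI)
  aws s3 cp ./JSONDATA s3://YOUR-BUCKET/JSONDATA --recursive

  # Azure Blob storage (Azure CLI)
  az storage blob upload-batch --account-name YOUR-STORAGE-ACCOUNT --destination td-usgs --destination-path CSVDATA --source ./CSVDATA

  # Google Cloud Storage (gsutil)
  gsutil cp -r ./CSVDATA gs://YOUR-BUCKET/CSVDATA

After steps 5 and 6, a query such as the following sketch can confirm that Advanced SQL Engine can read your object store; adjust the LOCATION and credential values for your platform:

  SELECT TOP 2 * FROM READ_NOS (
    USING
      LOCATION('/s3/YOUR-BUCKET.s3.amazonaws.com/JSONDATA/')
      ACCESS_ID('YOUR-ACCESS-KEY-ID')
      ACCESS_KEY('YOUR-SECRET-ACCESS-KEY')
      RETURNTYPE('NOSREAD_RECORD')
  ) AS d;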