Multiple Instances with S3MaxObjectSize

Teradata® Tools and Utilities Access Module Reference - 20.00

Deployment: VantageCloud, VantageCore
Edition: VMware, Enterprise, IntelliFlex, Lake
Product: Access Module
Release Number: 20.00
Published: October 2023
Product Category: Teradata Tools and Utilities

If the goal is to take advantage of TPT's parallelism while also limiting the size of each object written to S3, the same naming convention described in Multiple Instances is used. The objects are processed as follows.

Assume you have 50 MB of data, you want each object written to S3 to be no larger than 10 MB, and you have specified two instances of the DataConnector operator (using S3Object=my_load_job). Five objects are required, and they are assigned to the instances in round-robin order:

Instance 1 will create:

  • my_load_job-001
  • my_load_job-003
  • my_load_job-005

Instance 2 will create:

  • my_load_job-002
  • my_load_job-004