Rules and Restrictions - Advanced SQL Engine - Teradata Database

DATASET Data Type

Product: Advanced SQL Engine, Teradata Database
Release Number: 17.05, 17.00
Published: June 2020
Language: English (United States)
Last Update: 2021-01-23
Product Category: Teradata Vantage™
When CSV_TO_JSON converts CSV data into JSON data type instances, the input can consist of multiple sets of CSV data, with these expected behaviors:
  • If the data is aggregated, every set of CSV data must have an identical structure.
  • If the data is not aggregated, the structures do not have to be identical. However, if the SCHEMA custom clause specifies field names for the output JSON documents, the number of specified names must equal the number of fields in every record of every CSV data set. (The basic shape of an invocation is sketched after this list.)
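
The following is a minimal sketch of the basic invocation shape. The table name csvSource and column name csvData are assumptions made for illustration; only the operator name and the single output column, data (described just below), come from this section.

  -- Illustrative sketch: csvSource and csvData are assumed names, with each
  -- input row carrying one set of CSV data. With no custom clauses, the
  -- operator returns one JSON instance in the output column "data" for each
  -- CSV record of each input set.
  SELECT data
  FROM CSV_TO_JSON (
      ON (SELECT csvData FROM csvSource)
  ) AS jt;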
CSV_TO_JSON produces one output column called 'data'. You can tune the output based on the following custom clauses:
  • Specify SCHEMA to explicitly define the published data structure. Use the clause with a character string representing an ad-hoc schema specification; any other value results in an error. If an ad-hoc schema is specified, the structure must conform to the CSV format rules. If the clause is not specified, the input CSV data is assumed to conform to the defaults.
  • Specify DO_AGGREGATE to aggregate the JSON instances composed from the input data into one output row. The clause accepts either Y (the result is aggregated) or N (the result is not aggregated, which is the default); neither value is case sensitive. If the clause is omitted, the table operator returns one JSON instance in one output row for each record of each set of input CSV data. If the aggregated data exceeds the specified maximum size, an error occurs. (Both custom clauses are shown in the sketch following this list.)
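
The sketch below shows where the two custom clauses go, reusing the illustrative csvSource table from the earlier sketch. The SCHEMA value is only a placeholder for an ad-hoc schema specification (its exact string format is not described in this section); DO_AGGREGATE('Y') and its default of 'N' come from the list above.

  -- Illustrative sketch; the SCHEMA value is a placeholder, not a documented format.
  SELECT data
  FROM CSV_TO_JSON (
      ON (SELECT csvData FROM csvSource)
      USING
          SCHEMA('<ad-hoc CSV schema specification>')  -- must conform to the CSV format rules
          DO_AGGREGATE('Y')                            -- 'Y' aggregates all records into one row; default is 'N'
  ) AS jt;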

The table operator uses the RETURNS clause to compose data into a JSON data type of any character set or storage format. The default is to return maximum size JSON CHARACTER SET LATIN instances. All values are treated as strings, except NULL fields, which are converted to null JSON values.
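
A sketch of overriding that default follows. The maximum-size JSON CHARACTER SET LATIN default comes from this section; the UNICODE character set, the 64000 size, and the placement of the RETURNS clause after the ON clause are assumptions made only to show where such an override might go.

  -- Illustrative only: return UNICODE JSON instances of an assumed fixed size
  -- instead of the default maximum-size LATIN instances.
  SELECT data
  FROM CSV_TO_JSON (
      ON (SELECT csvData FROM csvSource)   -- csvSource/csvData are assumed names
      RETURNS (data JSON(64000) CHARACTER SET UNICODE)
  ) AS jt;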

Fields in CSV data may be wrapped in double quotes, typically when a field contains a character used as a field or record delimiter. When CSV_TO_JSON encounters a double-quoted field, it outputs the key or value without the enclosing quotes. See Example: Using CSV Data in Double Quotes (CSV to JSON), which shows a CSV field name containing a record delimiter and shows that the extra quotes are removed before the JSON document is constructed.
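
As a small constructed illustration of the quoting rule (not the referenced example itself), the quoted value below contains the field delimiter, and the wrapping quotes are not carried into the published JSON:

  -- One CSV record, shown as a plain literal; "Smith, John" is quoted only
  -- because it contains the field delimiter.
  SELECT '123,"Smith, John",Austin' AS csv_record;
  -- Per this section, the corresponding JSON value is published as
  -- Smith, John with the enclosing double quotes removed.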

Note that commas at the end of a record, with no data (other than the record delimiter) following them, are ignored; they are not interpreted as null values.
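
A constructed record illustrating the trailing-comma rule:

  -- The two commas at the end of this record have no data after them before
  -- the record delimiter, so CSV_TO_JSON ignores them rather than publishing
  -- two null values.
  SELECT '123,Smith,Austin,,' AS csv_record;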