Targets supporting a database column structure map fields from Teradata Listener to specific columns. All targets, except HDFS types, support mapping fields. You can map data to columns using passthrough, shredding through user-provided mapping, or automatic mapping.
Property | Type | Description | Example |
---|---|---|---|
target_subtype | string | Target subtype. Only value supported is querygrid. | querygrid |
foreign_server | string | Foreign server previously created using a QueryGrid link between Listener and Teradata nodes. Required field if target_subtype is querygrid. | your_foreign_server
Passthrough Mapping
Listener uses a data_map.mapping array to wrap messages with the following metadata. The metadata you map is stored in the target. The array requires an object with three properties: column, field, and type.
You can use the following fields in the Listener interface to map metadata.
Field | Description |
---|---|
data | Field with the raw data sent to Ingest. If not using QueryGrid, CLOB is also supported. |
source_id | Source ID to which the packet was sent. |
time | Time the packet was inserted into the target system. |
uuid | UUID gets assigned to the packet upon arrival. |
The data_map.mapping array requires that you link the fields described above to any of the Teradata database columns described below. You must also provide the type.
Column | Type |
---|---|
records | json |
stream | varchar |
timestamp | timestamp |
id | varchar |
{
  "data_map": {
    "mapping": [
      { "column": "customer_data", "field": "data", "type": "json" },
      { "column": "customer_source_id", "field": "source_id", "type": "varchar" },
      { "column": "timestamp", "field": "time", "type": "timestamp" },
      { "column": "id", "field": "uuid", "type": "varchar" }
    ]
  }
}
- NOT NULL columns with default values do not have to be mapped.
Shredding with User-Provided Mapping
Listener accepts a custom mapping of JSON data fields to Teradata Database columns and shreds data to the appropriate columns based on the mapping provided with the create target call.
Shredding is supported only for Teradata targets.
- Column: Teradata Database column
- Field: JSON field name that occurs in the data
- Type: Type of data
{
  "bundle": false,
  "bundle_interval": 0,
  "bundle_type": "records",
  "data_map": {
    "mapping": [
      { "column": "thingId", "field": "idField", "type": "BIGINT" },
      { "column": "aspectName", "field": "aname", "type": "VARCHAR" },
      { "column": "time_stamp", "field": "created", "type": "TIMESTAMP" },
      { "column": "value_num", "field": "num", "type": "INTEGER" },
      { "column": "status", "field": "state", "type": "FLOAT" }
    ],
    "mapping_type": "shredding"
  },
  "data_path": {
    "schema": <teradata-schema-name>,
    "table": <teradata-table-name>
  },
  "description": "TestTeradataTarget",
  "name": <target-name>,
  "source_id": <source-id>,
  "state": "1",
  "system_id": <system-id>,
  "system_info": {
    "password": <teradata-db-password>,
    "table": <teradata-table>,
    "username": <teradata-db-username>
  },
  "target_type": "teradata"
}
Teradata Column Type | Corresponding "type" in Target Payload | Format | Example | Limitations |
---|---|---|---|---|
BIGINT | BIGINT | 123456789 | 123456789 | Error occurs for invalid values. |
123456789.565865 | 123456789 | |||
-9223372036854770000 (minimum value) | -9223372036854770000 | |||
9223372036854775807 (maximum value) | 9223372036854775807 | |||
"hyderabad" | Error (Bad character in format or data) | |||
9923372036854775809 (beyond maximum value) | Error (Numeric value <input-value> out of long range (-9223372036854775808 - 9223372036854775807)) | |||
BLOB | BLOB | 111100 | 111100 | |
Value exceeding column length | Trimmed to column length | |||
1B29MX (invalid character) | Error (Failed parsing hex json string) | |||
1B2 (odd number of characters) | Error (Failed parsing hex json string) | |||
64K bytes | Stored to DB. | |||
Over 64K bytes | Error | Autoshred with BLOB data type does not support more than 64K bytes. | ||
BYTE | BYTE | For BYTE(3): | ||
122BCD | 122BCD | 64K Bytes and Invalid 64K Bytes cases will fail and cause an error denoting invalid input. | ||
ABCD | ABCD00 | |||
1B29BCA9 (beyond limit) | 1B29BC | |||
empty | 000000 | |||
1B2 (odd hex) | Failed parsing hex JSON string | |||
1B29MX (invalid) | Failed parsing hex JSON string | |||
Over 64K bytes | Parameter length is xxxxxxx bytes, which is greater than the maximum 64000 bytes that can be set. ||||
64K bytes | ||||
Invalid 64K bytes | ||||
BYTEINT | BYTEINT | 127 | 127 | |
-128 | -128 | |||
21.83 | 21 | |||
32769 (Beyond Max value) | Error (Overflow occurred computing expression involving TABLE_NAME.COLUMN_NAME) | |||
CHAR | CHAR | For CHAR(5): | ||
BOBBY | BOBBY | |||
A | A | |||
SAM | SAM | |||
123 | 123 | |||
ROCKSTAR (beyond limit) | ROCKS (Truncates the additional characters) | |||
DATE | DATE | YYYY-MM-DD | 2017-12-26 | |
YYYY/MM/DD | 2017/12/26 |||
Invalid date formats | Error: Invalid date supplied for <table-name>.<column-name>. For example: Invalid date supplied for doc_shred.i_am_a_date. |||
DECIMAL(N,M) | DECIMAL | For DECIMAL(15,5): | ||
78945.326 | 78945.326 | |||
-78945.326 | -78945.326 | |||
9999999999.99999 | 9999999999.99999 | |||
999999999965.56565 (beyond limit) | Error: Overflow occurred computing an expression involving <tablename>.<columnname> | |||
invalid | Error: Bad character in format or data of <table-name>.<columnname> | |||
FLOAT | FLOAT | 791E+308 | 791E+308 | |
26E-308 | 26E-308 | |||
14E-10 | 14E-10 | |||
1.791E+310 (Beyond Limit) | Error (Bad character in format or data) | |||
invalid (Invalid) | Error (Bad character in format or data) | |||
INT | INT | 123456789 | 123456789 | |
123456789.565865 | 123456789 | |||
-2147483648 (minimum value) | -2147483648 | |||
2147483646 (maximum value) | 2147483646 | |||
2147483700 (beyond maximum value) | Error (Overflow occurred computing an expression involving <table-name>.<column-name>) | |||
"hyderabad" | Error (Overflow occurred computing an expression involving <table-name>.<column-name>) | |||
SMALLINT | SMALLINT | 16000 | 16000 | |
21474.83646 | 21474 | |||
-32768 (minimum value) | -32768 | |||
32767 (maximum value) | 32767 | |||
"hyderabad" | Error (Overflow occurred computing an expression involving <table-name>.<column-name>) | |||
32769 (beyond maximum value) | Error (Overflow occurred computing an expression involving <table-name>.<column-name>) | |||
TIMESTAMP(6) | TIMESTAMP | YYYY-MM-DD hh:mi:ss.ssssss | 2006-11-23 15:30:23.356899 | When timestamp (or timestamp with time zone) values provided at ingest use different formats in separate payloads, the batch is discarded (marked as error) or moved to the dead-letter queue. For example: "1530626266000" and "2018-07-03 08:58:00.123456+03:00". |
000Y-0M-0D hh:mi:ss.ssssss | 0001-01-03 15:30:23.356558 | |||
YYYY-MM-DD hh:mi:ss | 2006-11-23 15:30:23.35 | |||
YYYY-MM-DD hh:mi:ss sign hh:mi | 2006-11-23 15:30:23.35+13:00 | |||
YYYY-MM-DD hh:mi:ss.ssssss sign hh:mi | 2006-11-23 15:30:23.356899+12:30 |||
YYYY-MM-DD T hh:mi:ss.ssssss | 0001-01-03T15:30:23.356558 | |||
YYYY-MM-DD hh:mi:ss.ssss | 2006-11-23 15:30:23.3568 | |||
YYYY/MM/DD hh:mi:ss.ssssss | 2006/11/23 15:30:23.356558 | |||
Upper and Lower Bound: | ||||
YYYY-MM-DD hh:mi:ss.ssssss | 0001-01-01 03:04:05.123456 | |||
YYYY-MM-DD hh:mi:ss.ssssss | 9999-12-31 12:59:59.999999 | |||
Invalid: | ||||
2006-11-23 15:30:23.3598968 | Invalid date formats cause the following error message: Failed to parse timestamp ||||
2006-1-3 15:30:23.356558 | ||||
test | ||||
TIMESTAMP(6) WITH TIMEZONE | TIMESTAMP WITH TIMEZONE | YYYY-MM-DD hh:mi:ss | 2006-11-23 15:30:23 | |
YYYY-MM-DD hh:mi:ss sign hh:mi | 2006-11-23 15:30:23+13:00 |||
YYYY-MM-DD hh:mi:ss.ssssss | 2006-11-23 15:30:23.356899 |||
YYYY-MM-DD hh:mi:ss.ssssss sign hh:mi | 2006-11-23 15:30:23.356899+12:30 |||
YYYY-MM-DD hh:mi:ss.ssssss sign hh:mi | 2006-11-23 15:30:23.356899-12:30 |||
YYYY-MM-DD(T)hh:mi:ss.ssssss sign hh:mi | 2006-11-23T15:30:23.356899+12:30 |||
YYYY-MM-DD hh:mi:ss.ssss sign hh:mi | 2006-11-23T15:30:23.3568+12:30 |||
YYYY/MM/DD hh:mi:ss.ssssss sign hh:mi | 2006/11/23 15:30:23.356899+12:30 |||
Upper and Lower Bound: | ||||
YYYY-MM-DD hh:mi:ss.ssssss sign hh:mi | 9999-12-31 12:59:59.999999+03:00 |||
YYYY-MM-DD(T)hh:mi:ss.ssssss sign hh:mi | 0001-01-01 03:04:05.123456+03:00 |||
Invalid: | ||||
2006-1-3 15:30:23.3568999+12:30 | Invalid date formats cause the following error message: Could not be parsed, unparsed text found ||||
InvalidTimeStamp | ||||
2006-11-23 15:30:23.3568999+12:30 | ||||
2006-11-23 15:30:23.3567899+12:30 | ||||
VARBYTE | VARBYTE | For VARBYTE(3): | 64K VarBytes and Invalid 64K VarBytes cases will fail and cause an error denoting invalid input. | |
122BCD | 122BCD | |||
ABCD | ABCD | |||
1B29BCA9 (Beyond) | 1B29BC | |||
empty | 000000 | |||
1B2 (odd hex) | Failed parsing hex json string | |||
1B29MX (invalid) | Failed parsing hex json string | |||
Over 64K VarBytes | Parameter length is xxxxxxx bytes, which is greater than the maximum 64000 bytes that can be set. | |||
64K VarBytes | ||||
Invalid 64K VarBytes | ||||
VARCHAR(N) | VARCHAR | Variable-length character string of length 0 to N | | String lengths greater than the defined value are trimmed to column length. |
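The "Failed parsing hex json string" errors in the BYTE and VARBYTE rows above come from hex decoding, which requires an even number of valid hex digits. The following Python sketch is illustrative only (it is not Listener's code) and mimics that validation plus the BYTE(n) trim/zero-pad behavior shown in the table:

```python
# Illustrative sketch, not Listener's actual writer code.
def parse_hex_field(value: str, column_length: int) -> bytes:
    try:
        raw = bytes.fromhex(value)  # raises ValueError on odd length or non-hex chars
    except ValueError:
        raise ValueError("Failed parsing hex json string")
    # BYTE(n) semantics shown above: trim values beyond the limit, zero-pad short ones
    return raw[:column_length].ljust(column_length, b"\x00")

print(parse_hex_field("ABCD", 3).hex().upper())      # ABCD00 (zero-padded)
print(parse_hex_field("1B29BCA9", 3).hex().upper())  # 1B29BC (trimmed)
```

Inputs such as "1B2" (odd number of digits) or "1B29MX" (invalid characters) fail the decode step and surface the parsing error.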
curl \
-H "Content-Type: application/json" \
-H "Authorization: Bearer TOKEN" \
-X POST \
-d '{
"source_id": "4d63e2c7-066e-45b3-a928-061bed772295",
"system_id": "1489789b-a4e3-4d86-9761-b41503ab5d6a",
"name": "Sample Teradata QueryGrid Target with Shredding",
"description": "",
"target_type": "teradata",
"data_path": {
"foreign_server": "Listener",
"schema": "listener",
"table": "test_varchar"
},
"system_info": {
"host": "my_host",
"password": "my_password",
"port": "my_port",
"system_type": "teradata",
"table": "test_varchar",
"target_subtype": "querygrid",
"username": "my_username"
},
"production": false,
"bundle": false,
"bundle_type": "records",
"bundle_interval": 0,
"data_map": {
"mapping": [
{
"column": "data",
"field": "data",
"type": "VARCHAR"
},
{
"column": "source_id",
"field": "source_id",
"type": "VARCHAR"
},
{
"column": "time_stamp",
"field": "time_stamp",
"type": "TIMESTAMP WITH TIME ZONE"
},
{
"column": "uuid",
"field": "uuid",
"type": "VARCHAR"
}
],
"mapping_type": "shredding"
},
"properties": null,
"use_dead_letter_queue": false,
"dead_letter_queue": ""
}
' \
-i \
https://CLUSTER.DOMAIN/listener/appservices/targets
Teradata Column Type | Corresponding "type" in Target Payload | Format | Example | Limitations |
---|---|---|---|---|
BIGINT | BIGINT | 1232516 | 1232516 | |
"hyderabad" | no record in db |||
123.12 | null |||
9923372036854775809 (beyond maximum value) | no record in db |||
-9223372036854770000 (minimum value) | -9223372036854770000 |||
9223372036854775807 (maximum value) | 9223372036854775807 |||
Date | Date DATE FORMAT 'yyyy-mm-dd' | YYYY-MM-DD | 2017-12-08 | Invalid values are inserted as null. For example: 18/1/2, abcd |
YYYY/MM/DD | 2017/02/08 |||
DECIMAL(N,M) | DECIMAL | For DECIMAL(15,5): | | Invalid values are inserted as null. Valid values beyond range are also inserted. |
78945.326 | 78945.326 |||
-78945.326 | -78945.326 |||
9999999999.99999 | 9999999999.99999 |||
999999999965.56565 (beyond limit) | |||
invalid | |||
FLOAT | FLOAT | 2.26E-308 | 2.26E-308 | Invalid values (string that cannot be parsed) are inserted as null. Values beyond range are inserted as "∞" (infinity). For example: 1.791E+310 |
1.791E+308 | 1.791E+308 | |||
INT | INT | 1232 | 1232 | |
123.22 | null | |||
2147483700 (beyond max value) | -2147483596 | |||
"hyderabad" | no record in db | |||
-2147483648 (minimum value) | -2147483648 | |||
2147483646 (maximum value) | 2147483646 | |||
SMALLINT | SMALLINT | 14788 | 14788 | |
123.2 | null | |||
"hyderabad" | no record in db | |||
32769 (beyond maximum value) | -32767 | |||
-32768 (minimum value) | -32768 | |||
32767 (maximum value) | 32767 | |||
TIMESTAMP(6) | TIMESTAMP | yyyy-MM-dd HH:mm:ss[.N{0,6}] | 2018-01-01 01:02:03 | Invalid values are inserted as null. TimeZone is ignored during insert. Value of ingest data "2018-01-02 04:05:06.123456+04:00" is stored as: "2018-01-02 04:05:06.123456+00:00" |
yyyy-MM-dd'T'HH:mm:ss[.N{0,6}] | 2018-01-01T01:02:03.123 |||
yyyy/MM/dd HH:mm:ss[.N{0,6}] | 2018/01/01 01:02:03.123 |||
NNNNNNNNNNNNN (milliseconds since January 1, 1970, 00:00:00 GMT) | 1530823567000 | |||
TIMESTAMP(6) WITH TIMEZONE | TIMESTAMP WITH TIMEZONE | yyyy-MM-dd HH:mm:ss[.N{0,6}][XXX] | 2018-01-01 01:02:03 | For QueryGrid implementation, when table column type is "timestamp(6) with timezone" timezone is ignored. For example: Value of ingest data "2018-01-02 04:05:06.123456+04:00" will be stored as "2018-01-02 04:05:06.123456+00:00" in table column of type "timestamp(6) with timezone". Invalid values are inserted as null. |
yyyy-MM-dd'T'HH:mm:ss[.N{0,6}][XXX] | 2018-01-01T01:02:03.123 |||
yyyy/MM/dd HH:mm:ss[.N{0,6}][XXX] | 2018/01/01 01:02:03.123+03:00 |||
NNNNNNNNNNNNN (milliseconds since January 1, 1970, 00:00:00 GMT) | 1530823567000 |||
VARCHAR(N) | VARCHAR | Variable-length character string of length 0 to N | | String lengths beyond the defined value are trimmed to column length. |
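The timestamp rows above accept a raw millisecond epoch value such as 1530823567000. As a rough illustration (Listener's internal conversion may differ), Python's standard library shows how such a value maps to a UTC timestamp string:

```python
from datetime import datetime, timezone

# Convert the millisecond-epoch example from the table above to a UTC timestamp.
# Illustrative only; not Listener's actual conversion code.
millis = 1530823567000
ts = datetime.fromtimestamp(millis / 1000, tz=timezone.utc)
print(ts.strftime("%Y-%m-%d %H:%M:%S"))  # 2018-07-05 20:46:07
```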
{ "data_map": { "mapping_type": "shredding" } }
{ "idField": "5020", "aname": "aspect_name15", "created": "2018-01-03 22:35:50", "num": "115", "state": "1" }
Field (json) | Teradata Database Column | Teradata Database Column Type | Content |
---|---|---|---|
idField | thingId | bigint | 5020 |
aname | aspectName | varchar | aspect_name15 |
created | time_stamp | timestamp | 2018-01-03 22:35:50 |
num | value_num | integer | 115 |
state | status | float | 1 |
- NOT NULL columns can be mapped to metadata fields.
- NOT NULL columns with default values can be left unmapped.
- NOT NULL columns cannot be mapped to non-metadata fields, even if the column has default values.
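The shredding step described above can be sketched in a few lines of Python: each entry in data_map.mapping routes one JSON field to one Teradata column, and unmapped fields are dropped. This is an illustration of the mapping semantics, not Listener's implementation:

```python
import json

# A subset of the mapping from the target payload shown earlier.
mapping = [
    {"column": "thingId",    "field": "idField", "type": "BIGINT"},
    {"column": "aspectName", "field": "aname",   "type": "VARCHAR"},
    {"column": "time_stamp", "field": "created", "type": "TIMESTAMP"},
]

# One ingested message, as in the sample above.
message = json.loads(
    '{"idField": "5020", "aname": "aspect_name15", "created": "2018-01-03 22:35:50"}'
)

# Each mapped JSON field lands in its Teradata column; everything else is dropped.
row = {m["column"]: message.get(m["field"]) for m in mapping}
print(row)
```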
Automatic Mapping
Automatic mapping binds JSON values to specific data types in Teradata Database columns. Listener parses each message as a JSON document and maps the parsed fields to Teradata Database columns.
Automatic mapping is exposed in Listener through the data_map.mapping_type using auto_shred. The writer parses the JSON and makes the following fields and values available for binding to the target database table.
{ "data_map": { "mapping_type" : "auto_shred" } }
{ "country" : "France", "population" : 66991000, "famous_for" : ["Wine", "Cheese"], "restaurants" : { "French" : 63, "Italian" : 25, "Vegan" : 9, "Other" : 3 }, "monarch" : null }
Field | Type | Contents |
---|---|---|
country | Parsed | France |
population | Parsed | 66991000 |
famous_for | Parsed | ["Wine","Cheese"] |
restaurants | Parsed | {"French": 63, "Italian": 25, "Vegan": 9, "Other": 3} |
monarch | Parsed | null |
Table Column | Value | Description |
---|---|---|
COUNTRY | France | Matches country field through case insensitive comparison. |
UUID | UUID associated with packet | Matches uuid field through case insensitive comparison. |
TIMEZONE | null | No value; defaults to null. |
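The case-insensitive binding described above can be sketched as follows. The column names are taken from the table (COUNTRY, TIMEZONE) plus a hypothetical POPULATION column for illustration; this is not Listener's code:

```python
# Sketch of auto_shred binding: parsed JSON fields are matched to table columns
# case-insensitively; columns with no matching field default to null (None).
parsed = {"country": "France", "population": 66991000, "monarch": None}
table_columns = ["COUNTRY", "POPULATION", "TIMEZONE"]  # TIMEZONE has no JSON field

lowered = {k.lower(): v for k, v in parsed.items()}
row = {col: lowered.get(col.lower()) for col in table_columns}
print(row)  # {'COUNTRY': 'France', 'POPULATION': 66991000, 'TIMEZONE': None}
```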