In addition to physical database design choices, several application-level techniques can improve tactical query performance. The main areas of interest are multistatement requests and macros.
Multistatement Requests
Multistatement requests are a database feature in which multiple SQL statements are bundled together and treated as a single unit for parsing and recovery.
Because their component statements run in parallel, single-AMP multistatement requests are processed very efficiently. Multistatement requests are application-independent and can improve performance in a variety of ways. In particular, they improve response times for the combined statements because:
- Communication overhead is reduced
- One parser-optimization process is performed instead of several
- Greater inter-request parallelism is possible
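For example, a multistatement request can be submitted from BTEQ by beginning each continuation statement with a semicolon. The following is a minimal sketch, assuming a hypothetical table_a with a primary index on cust_id:

    /* table_a and cust_id are hypothetical; a leading semicolon bundles the
       statement into the same request as the preceding statement */
    SELECT acct_balance FROM table_a WHERE cust_id = 1001
    ;SELECT acct_balance FROM table_a WHERE cust_id = 1002
    ;SELECT acct_balance FROM table_a WHERE cust_id = 1003;

All three single-AMP SELECT statements are parsed and optimized as one request, and their steps can be dispatched in parallel.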
Coding Multistatement Requests
Although multistatement requests are application-independent, their behavior can differ depending on whether they are specified as an implicit transaction or as part of an explicit transaction, and on whether they are submitted in Teradata or ANSI/ISO session mode.
Vantage treats multistatement requests as single implicit transactions in Teradata session mode when no open BEGIN TRANSACTION statement precedes them.
With the exception of multistatement INSERT requests, Vantage treats multistatement requests as single recovery units only if they are executed either as implicit transactions in Teradata session mode or in ANSI/ISO session mode. Depending on the errors generated by the statements in the transaction, either the entire transaction or only the request that caused the error is rolled back. For example, in Teradata session mode within an explicit transaction, Vantage rolls the entire transaction back to the BEGIN TRANSACTION statement, so in this case the complete transaction is the recovery unit.
If several UPDATE requests are specified within one multistatement request inside an explicit transaction in Teradata session mode, and one of them fails, then all are rolled back. The fewer UPDATE statements in the multistatement request, the lower the impact of the rollback. However, the more statements included in the request, the higher the degree of parallelism among them.
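As a sketch, assuming a hypothetical accounts table, the following Teradata session mode transaction bundles two UPDATE statements into one multistatement request between BEGIN TRANSACTION (BT) and END TRANSACTION (ET); if either UPDATE fails, the work is rolled back to the BT statement:

    /* accounts, acct_id, and balance are hypothetical */
    BT;
    UPDATE accounts SET balance = balance - 50 WHERE acct_id = 1001
    ;UPDATE accounts SET balance = balance + 50 WHERE acct_id = 2002;
    ET;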
The rollback issue for UPDATE requests does not apply to multistatement INSERT requests, where the statement independence feature frequently enables Vantage to roll back only the statements that fail within an explicit transaction or multistatement request, and not the entire transaction or request. Statement independence applies to INSERT statements that fail with errors such as the following:
- Column-level CHECK constraint violations
- Data translation errors
- Duplicate row errors for SET tables
- Primary index uniqueness violations
- Referential integrity violations
- Secondary index uniqueness violations
Statement independence is not supported when the target table is defined with any of the following features:
- Triggers
- Hash indexes
- Join indexes
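As an illustration, assume a hypothetical orders table with a unique primary index on order_id. In the following multistatement INSERT request, only the second statement violates primary index uniqueness, so statement independence allows the first and third rows to be inserted while only the failing statement is rolled back:

    /* orders, order_id, and amount are hypothetical; order_id is the UPI */
    INSERT INTO orders (order_id, amount) VALUES (100, 25.00)
    ;INSERT INTO orders (order_id, amount) VALUES (100, 30.00)  /* fails: duplicate UPI value */
    ;INSERT INTO orders (order_id, amount) VALUES (101, 12.50);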
See the information about INSERT and INSERT SELECT in Teradata Vantage™ - SQL Data Manipulation Language, B035-1146 for more information about statement independence. Note that various client data loading utilities also support statement independence. Consult the appropriate Teradata Tools and Utilities documentation for information about which load utilities and APIs support statement independence and what level of support they offer for the feature.
ACCESS Locking and Multistatement Requests
If you want each statement in a multistatement request to use row hash-level ACCESS locking, specify the LOCKING ROW FOR ACCESS modifier preceding each individual SELECT statement in the multistatement request. See Coding Multistatement Requests for additional details about how Vantage handles transactions, particularly those containing multistatement requests.
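For example, assuming the hypothetical table_a used earlier, each SELECT statement in the multistatement request carries its own LOCKING ROW FOR ACCESS modifier:

    /* table_a and cust_id are hypothetical */
    LOCKING ROW FOR ACCESS
    SELECT acct_balance FROM table_a WHERE cust_id = 1001
    ;LOCKING ROW FOR ACCESS
    SELECT acct_balance FROM table_a WHERE cust_id = 1002;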
Macros and Tactical Queries
If you write multiple SQL statements within a macro, you automatically get a multistatement request without coding one explicitly, because macros are always performed as a single request. The macro code can also specify parameters that are replaced by data each time the macro is performed. The most common way of substituting for the parameters is to specify a USING request modifier (see Teradata Vantage™ - SQL Data Manipulation Language, B035-1146).
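The following is a minimal sketch of such a macro, assuming the hypothetical table_a used earlier. The parameter in_cust_id is referenced in the macro body with a leading colon, and its value is supplied either as a literal or through a USING request modifier whose data arrives in the client data parcel:

    /* get_balance, table_a, and cust_id are hypothetical */
    CREATE MACRO get_balance (in_cust_id INTEGER) AS (
      SELECT acct_balance
      FROM table_a
      WHERE cust_id = :in_cust_id;
    );

    /* Execute with a literal value... */
    EXEC get_balance (1001);

    /* ...or with a USING request modifier */
    USING (custno INTEGER)
    EXEC get_balance (:custno);

Because the macro body is performed as a single request, adding a second SELECT statement inside the parentheses automatically produces a multistatement request.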
Macros are usually more efficient for repetitive queries than single DML statements. Unlike procedures, macros are not designed to return parameters to the requestor: they return only the answer set for the SQL statements they contain. Macros offer the following benefits for tactical queries:
- Network and channel traffic are reduced.
- Execution plans can be cached, reducing parsing engine overhead.
- They ensure the efficient performance of their component SQL statements.
- Reusable database components save client resources.
- They can be used to force data integrity and locking modifiers (see the sketch after this list).
- They can be used to enforce security.
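For example, a macro can embed a locking modifier so that every application that performs it automatically uses row hash-level ACCESS locking. This sketch reuses the hypothetical get_balance macro and table_a defined earlier:

    /* get_balance and table_a are hypothetical */
    REPLACE MACRO get_balance (in_cust_id INTEGER) AS (
      LOCKING ROW FOR ACCESS
      SELECT acct_balance
      FROM table_a
      WHERE cust_id = :in_cust_id;
    );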
Cached Plans
Caching of repeatable requests in the request-to-steps cache improves performance because parsing and optimizing do not have to be repeated when a cached request is resubmitted. For subsecond queries, cached plans significantly reduce query time and enhance throughput. The following figure is a high-level flow chart for the request-to-steps cache that resides in each parsing engine on a Teradata system. A request matches a cached entry only when all of the following are identical:
- Host, workstation, or LAN type
- Default database name
- National character set
- Request text
Parameter variables in the macro support repeatability. These parameters must be specified in both the SQL code itself and in the USING request modifier that precedes the EXEC statement that invokes the macro.
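The following is a brief sketch of the difference, again using the hypothetical table_a. The hard-coded form produces a different request text for each customer number, so it cannot match a cached entry on request text, while the parameterized form keeps the request text constant so its cached plan can be reused:

    /* table_a and cust_id are hypothetical */
    /* Request text changes with every value */
    SELECT acct_balance FROM table_a WHERE cust_id = 1001;

    /* Request text stays the same; only the data parcel changes */
    USING (custno INTEGER)
    SELECT acct_balance FROM table_a WHERE cust_id = :custno;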
The following picture illustrates the conditions required for caching the SQL statements for a macro. In this example, the request is made using BTEQ from a mainframe client across a block multiplexor channel. Note that the SQL code is in a request parcel, while the data values are transmitted in a data parcel. Both a request and a data parcel must be transmitted for the SQL to be cached immediately.
The data values are passed from the application to the macro code where they are then processed. Their format depends on the programming conventions for the application. For example, in BTEQ you must use the .IMPORT FILE = command when you perform the macro, and the import file specified in the command contains records with the data values to be passed to the macro. The values in the file are positionally associated, from left to right, with the parameters in the USING request modifier.
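The following is a sketch of how this might look in a BTEQ script, assuming a hypothetical data file custlist.dat whose records each contain one integer customer number in DATA format, and the hypothetical get_balance macro defined earlier. The .REPEAT command performs the EXEC once for each record in the import file:

    .IMPORT DATA FILE = custlist.dat
    .REPEAT *
    USING (custno INTEGER)
    EXEC get_balance (:custno);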