17.10 - When Workload Throttles Prevent Full Resource Usage - Advanced SQL Engine - Teradata Workload Management

Teradata Vantage™ - Workload Management User Guide

Advanced SQL Engine
Teradata Workload Management
Release Number: 17.10
Release Date: July 2021
Content Type: User Guide
Publication ID
Language: English (United States)

Workload throttles limit the number of requests that can run concurrently in a workload to prevent overutilization of system resources. However, in some cases, workload throttles can prevent full resource utilization. The Enable Flex Throttles option, available for SLES 11 systems with TASM for Enterprise Data Warehouse platforms, can detect situations in which workload throttle limits are preventing full resource utilization, and release work from the delay queue to use those resources. You can enable the flex throttles option for all states or enable and disable it by state.

When setting flex throttles, you define the following:

  • The Flex Throttle Action Interval (how long TASM waits, while resources remain available, before repeating a release of requests from the delay queue)
  • The events that trigger flex throttle actions
  • What happens when the triggering events occur

You can enable the flex throttles option in evaluation mode before activating it. In evaluation mode, no requests are released when the triggering event occurs. Instead, the TDWM event log lists which requests would have been released if the option had been activated.
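The cycle described above, including evaluation mode, can be sketched as a simplified simulation. This is not Teradata's implementation; the class, field, and method names (`FlexThrottle`, `on_trigger`, and so on) are illustrative only, and the in-memory `event_log` merely stands in for the TDWM event log:

```python
# Illustrative simulation of the flex-throttle cycle described above.
# All names here are hypothetical; they are not Teradata APIs.

class FlexThrottle:
    def __init__(self, action_interval_s, evaluation_mode=False):
        self.action_interval_s = action_interval_s  # Flex Throttle Action Interval
        self.evaluation_mode = evaluation_mode      # log instead of release
        self.event_log = []                         # stands in for the TDWM event log

    def on_trigger(self, delay_queue, resources_available):
        """Called when a triggering event fires (e.g., resources sitting idle
        while a workload throttle holds requests in the delay queue)."""
        if not resources_available or not delay_queue:
            return []
        request = delay_queue[0]
        if self.evaluation_mode:
            # Evaluation mode: nothing is released; record what *would*
            # have been released so the behavior can be reviewed first.
            self.event_log.append(f"would release {request}")
            return []
        # Active mode: release the request past its workload throttle.
        # TASM then waits action_interval_s before acting again (not shown).
        return [delay_queue.pop(0)]
```

In evaluation mode the delay queue is left untouched and only the log grows, which is what makes it safe to try before activating the option for a state.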

The flex throttles option overrides only workload throttles. It does not override any of the following constraints:
  • Object or system throttles
  • Group throttles
  • Utility throttle limits or AWT resource limits
  • A workload throttle limit of 0
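The override rules above amount to a simple eligibility test: a delayed request qualifies for flex release only when the rule holding it back is a workload throttle with a nonzero limit. A minimal sketch, assuming a hypothetical `throttle` record (not a Teradata structure):

```python
# Hypothetical eligibility check mirroring the constraints above.
# The "kind"/"limit" record is illustrative, not a Teradata API.

def eligible_for_flex_release(throttle):
    """Return True only if this rule is one flex throttles may override."""
    # Flex throttles override only workload throttles, never object,
    # system, group, or utility throttles (or AWT resource limits).
    if throttle["kind"] != "workload":
        return False
    # A workload throttle limit of 0 is also never overridden.
    if throttle["limit"] == 0:
        return False
    return True
```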

If you enable flex throttles, choose the default option for Define Available AWTs on the Other tab of the General view: AWTs available for the WorkNew (Work00) work type. This option ensures that enough reserved AWTs are available to start the work that flex throttles release from the delay queue.