Inserting Into Queue Tables Using Iterated Requests
If you use an INSERT request in an iterated request to insert rows into a queue table, you might have to limit the number of data records packed with each request. Packing fewer records minimizes the number of rowhash-level WRITE locks placed on the table, which in turn reduces the likelihood of deadlocks caused by conflicts between those locks and the all-AMPs table-level READ lock taken by the internal row collection processing that queue tables use to update the internal queue table cache.
IF you use an INSERT request in an iterated request to insert rows into a queue table and all of the following conditions are true:
THEN the number of data records that you pack with each request is not an issue.

IF you use an INSERT request in an iterated request to insert rows into a queue table and any of the following conditions are true:
THEN pack a maximum of four data records with each request. For example, if you use BTEQ to import rows of data into a queue table, specify a maximum value of 4 with the BTEQ .SET PACK command. These conditions trigger the internal row collection processing used by queue tables to update the internal queue table cache.
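The BTEQ usage described above can be sketched as follows. This is an illustrative fragment, not a definitive script: the database, table, and column names and the data file are hypothetical, and the queue table definition (whose first column must be the Queue Insertion TimeStamp, or QITS, column) is shown only for context.

```sql
/* Hypothetical queue table. The first column of a queue table must be the
   QITS column, typed TIMESTAMP(6) with a CURRENT_TIMESTAMP(6) default. */
CREATE TABLE sandbox.msg_queue, QUEUE (
    qits    TIMESTAMP(6) NOT NULL DEFAULT CURRENT_TIMESTAMP(6),
    msg_id  INTEGER,
    payload VARCHAR(100));

/* BTEQ import script: limit each iterated request to at most
   4 packed data records to reduce rowhash-level WRITE locks. */
.SET PACK 4
.IMPORT DATA FILE = msgs.dat
.REPEAT *
USING (m_id INTEGER, m_text VARCHAR(100))
INSERT INTO sandbox.msg_queue (msg_id, payload)
VALUES (:m_id, :m_text);
.QUIT
```

With .SET PACK 4, BTEQ bundles up to four data records from msgs.dat into each iterated INSERT request, keeping the number of rowhash-level WRITE locks per request small.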
For details on queue tables and the queue table cache, see “CREATE TABLE” in SQL Data Definition Language.