Number of Reads for Inserting 10, 100, and 1,000 NUPI Duplicate Rows
Note that a single order of magnitude increase in the number of duplicate NUPI row inserts into a table, from 10 to 100, results in an increase of two orders of magnitude in the cumulative number of data block reads that must be performed.
When the number of duplicate NUPI inserts increases by another order of magnitude, from 100 to 1,000, the cumulative number of reads grows to roughly four orders of magnitude more than the number required to insert 10 duplicate NUPI rows.
The count of cumulative read operations in the following table is a measure of data block reads, not individual row reads. For example, if there are 100 rows in a data block, then only one block read is required to retrieve all 100 rows.
| To insert this many duplicate NUPI rows … (N) | log₁₀N | The system must perform this many cumulative data block read operations … (N) | log₁₀N |
|---|---|---|---|
| 10 | 1 | 45 | 1.6532 |
| 100 | 2 | 4,950 | 3.6946 |
| 1,000 | 3 | 499,500 | 5.6985 |
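The cumulative read counts in the table grow quadratically: they match the triangular-number sum 0 + 1 + … + (N − 1) = N(N − 1)/2, which is what you would expect if each new duplicate insert must re-read the blocks holding all previously inserted duplicates. The following Python sketch reproduces the table values under that assumption; the helper name `cumulative_block_reads` is illustrative only and is not part of any Teradata interface.

```python
import math

def cumulative_block_reads(n_duplicates: int) -> int:
    """Cumulative data block reads to insert n_duplicates duplicate NUPI rows,
    assuming the k-th insert re-reads the k-1 blocks of existing duplicates."""
    return n_duplicates * (n_duplicates - 1) // 2

# Reproduce the three table rows: 10, 100, and 1,000 duplicate NUPI inserts.
for n in (10, 100, 1_000):
    reads = cumulative_block_reads(n)
    print(f"{n:>6,} inserts -> {reads:>9,} reads (log10 = {math.log10(reads):.4f})")
```

Running the sketch prints 45, 4,950, and 499,500 reads with log₁₀ values of 1.6532, 3.6946, and 5.6985, matching the table above.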