2.5 Processing batches
Processing large batches of data has its own challenges. Inserting a large amount of data into a table can hurt performance in several ways. First, a write lock is almost always held on the table while the data is inserted, which blocks or slows concurrent searches. Second, the disk cache for the table and its indexes is continually being invalidated. Third, until the update finishes and the Metamorph index has been brought up to date, the number of new, unindexed records that must be scanned linearly to satisfy Metamorph searches grows steadily, slowing those searches.
It is important to ensure that Metamorph indexes are created or updated after the data is loaded. Although the chkind portion of the database monitor will update the index for you, when inserting or updating a large batch of data it may make more sense to disable the automatic update and perform a manual update once the load is complete.
If you are importing into a new table, wait until the table is populated before creating any indexes, except those needed during the load itself. Typically the only indexes needed during the load are unique indexes, used to enforce uniqueness when the incoming data is not known to be unique.
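The load-then-index pattern above is generic to most SQL engines, not just Texis. As an illustrative sketch only (using Python's sqlite3 module and a hypothetical docs table rather than a Texis database), the following compares inserting into a table whose index already exists against loading first and building the index afterward:

```python
import sqlite3
import time

def load_rows(conn, n=50_000):
    """Bulk-insert n synthetic rows into the hypothetical docs table."""
    conn.executemany(
        "INSERT INTO docs(id, body) VALUES (?, ?)",
        ((i, f"document body {i}") for i in range(n)),
    )
    conn.commit()

# Case 1: index created BEFORE the load.
# Every insert must also update the index as it goes.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs(id INTEGER, body TEXT)")
conn.execute("CREATE UNIQUE INDEX docs_id ON docs(id)")
start = time.perf_counter()
load_rows(conn)
indexed_first = time.perf_counter() - start
conn.close()

# Case 2: index created AFTER the load.
# Inserts are cheaper, and the index is built once over the full table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs(id INTEGER, body TEXT)")
start = time.perf_counter()
load_rows(conn)
conn.execute("CREATE UNIQUE INDEX docs_id ON docs(id)")
conn.commit()
deferred = time.perf_counter() - start
conn.close()

print(f"index before load: {indexed_first:.3f}s, index after load: {deferred:.3f}s")
```

Note that the unique index in case 1 is the exception the text allows: if the incoming data must be checked for uniqueness as it arrives, that one index has to exist during the load, and the cost of maintaining it is the price of the check.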
Back: Other Import Methods | Next: Handling continuous streams