Optimizing Database Imports
Follow these guidelines to optimize your imports:
- Use a large buffer size. The BUFFER parameter determines the number of rows in the array that Import inserts at a time; the larger the buffer, the better the performance. The practical limit is the amount of memory on your system, so be careful not to set BUFFER so high that you introduce paging.
- Don't use COMMIT=Y. With COMMIT=N, the commit takes place after each object is loaded instead of after each array insert. This improves performance but requires rollback segments large enough to hold the undo for an entire table.
- Use a large rollback segment, and take the other rollback segments offline so that the import is forced to use it. A large rollback segment with about 20 extents minimizes dynamic space management.
- Take a backup and disable archiving. Import generates a large amount of redo that doesn't need to be recorded, so running the database in NOARCHIVELOG mode during the load is a good idea. Take the backup first, because you won't have archived logs to recover with while archiving is disabled.
- Use large online redo logs, and reduce the checkpoint frequency by increasing the value of the LOG_CHECKPOINT_INTERVAL parameter. Fewer log switches and checkpoints conserve system resources during the load.
- Pre-create the tables without indexes and constraints, load the data, and then create the indexes and constraints. The INDEXFILE parameter writes the index-creation DDL to a script you can run after the load completes.
- Minimize I/O contention by placing the export file on a disk other than the one holding the data files.
- Schedule the import during periods of the least database activity.
- Import from disk rather than directly from tape.
- Use a large value for the SORT_AREA_SIZE parameter to speed up the index-creation process.
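The client-side settings above can be collected into an Import parameter file. The following is a sketch only; the username, dump file name, and buffer size are assumptions you would adapt to your own system:

```
# import.par -- hypothetical parameter file, run as: imp parfile=import.par
USERID=system/password
FILE=expdat.dmp          # export file placed on a disk separate from the data files
LOG=import.log
BUFFER=10485760          # 10 MB array buffer; size to available memory, avoid paging
COMMIT=N                 # commit after each object load, not after each array insert
INDEXES=N                # load data only; build indexes afterward
```

Note that INDEXFILE is used in a separate preliminary run (for example, `imp parfile=import.par indexfile=indexes.sql`): when INDEXFILE is specified, Import writes the index-creation DDL to that script instead of importing data, and you run the script yourself after the load.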
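The server-side preparations (rollback segment, archiving, checkpoints, sort area) can be sketched as SQL and init.ora settings. Segment names, tablespace names, and sizes here are illustrative assumptions:

```
-- Take the small rollback segments offline and bring a large one online,
-- forcing the import to use it
ALTER ROLLBACK SEGMENT rbs01 OFFLINE;
ALTER ROLLBACK SEGMENT rbs02 OFFLINE;
CREATE ROLLBACK SEGMENT rbs_import
  TABLESPACE rbs
  STORAGE (INITIAL 10M NEXT 10M MINEXTENTS 20);   -- about 20 extents up front
ALTER ROLLBACK SEGMENT rbs_import ONLINE;

-- Disable archiving for the duration of the import (take a backup first)
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
ALTER DATABASE NOARCHIVELOG;
ALTER DATABASE OPEN;

-- init.ora settings for the load (instance restart required):
--   LOG_CHECKPOINT_INTERVAL = <large value>   -- fewer checkpoints
--   SORT_AREA_SIZE          = 10485760        -- faster index builds
```

Reverse these changes after the import: re-enable archiving, bring the regular rollback segments back online, and restore the normal parameter values.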
Export and Import can be an important part of your overall strategy for transferring and protecting data in Oracle databases. This article has discussed several techniques for speeding up the entire process.