Beyond its command-line and performance features, Oracle Data Pump offers new capabilities that DBAs will find invaluable. A couple of the most prominent are described here.
Interactive Command-Line Mode
Interactive command-line mode gives you much more control over monitoring and managing Data Pump jobs. Because Data Pump jobs run entirely on the server, you can start an export or import job, detach from it, and later reconnect to monitor its progress. Here are some of the things you can do in this mode (a sample session follows the list):
- See the status of the job. All of the information needed to monitor the job’s execution is available.
- Add more dump files if there is insufficient disk space for an export file. Change the default size of the dump files.
- Stop the job (perhaps it is consuming too many resources) and later restart it (when more resources become available).
- Restart the job. If a job was stopped for any reason (system failure, power outage), you can attach to the job and then restart it.
- Increase or decrease the number of active worker processes for the job. (Enterprise Edition only.)
- Attach to a job from a remote site (such as from home) to monitor status.
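For example, you can press Ctrl+C in a running Export client to drop to the interactive Export> prompt without stopping the job, or start a fresh client and reconnect with the ATTACH parameter. The following is a minimal sketch; the job name SYS_EXPORT_SCHEMA_01 is the typical system-generated default, and the credentials are placeholders:

>expdp username/password ATTACH=SYS_EXPORT_SCHEMA_01

Export> STATUS
Export> ADD_FILE=expfull2.dmp
Export> PARALLEL=4
Export> STOP_JOB=IMMEDIATE

STATUS reports the job state and worker activity, ADD_FILE adds a dump file to the dump file set, PARALLEL changes the number of active worker processes, and STOP_JOB stops the job while preserving its state. Reattaching later and issuing START_JOB resumes the job where it left off:

>expdp username/password ATTACH=SYS_EXPORT_SCHEMA_01
Export> START_JOB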
Network Mode
Data Pump gives you the ability to pass data between two databases over a network (via a database link), without creating a dump file on disk. This is very useful when you are moving data between databases, such as from data marts to a data warehouse, and disk space is not readily available. Note that if you are moving large volumes of data, network mode will probably be slower than file mode. A network export creates the dump file set on the instance where the Data Pump job is running and extracts the metadata and data from the remote instance. Network export also gives you the ability to export read-only databases: Data Pump Export cannot run locally on a read-only instance because the job requires write operations on the instance, so this is useful when you need to export data from a standby database.
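As a sketch of network mode, a network import pulls rows directly over a database link with no intermediate dump file; the link name source_db and the connect identifier source_tns below are assumptions for illustration. First create the link on the importing database, then point impdp at it with the NETWORK_LINK parameter:

SQL> CREATE DATABASE LINK source_db CONNECT TO username IDENTIFIED BY password USING 'source_tns';

>impdp username/password DIRECTORY=dpump_dir1 NETWORK_LINK=source_db TABLES=employees LOGFILE=net_import.log

A network export (expdp with the same NETWORK_LINK parameter) works similarly but writes the dump file set locally while reading the metadata and data from the remote instance, which is what makes it possible to export a read-only or standby database.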
Generating SQLFILEs
In original Import, the INDEXFILE parameter generated a text file containing the SQL commands needed to recreate tables and indexes, which you could then edit into a workable DDL script. With Data Pump, getting a workable DDL script is much easier. When you run Data Pump Import with the SQLFILE parameter, it generates a text file containing the DDL (Data Definition Language) needed to recreate all object types, not just tables and indexes. Although this output file is ready for execution, the DDL statements are not actually executed, so the target system is not changed. SQLFILEs are particularly useful for pre-creating tables and objects in a new database. Note that the INCLUDE and EXCLUDE parameters can be used to tailor the SQLFILE output. For example, if you want to create a database that contains all the tables and indexes of the source database, but not its constraints, grants, and other metadata, you would issue a command such as the following:
>impdp username/password DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp SQLFILE=dpump_dir2:expfull.sql INCLUDE=TABLE,INDEX
The SQL file expfull.sql is written to dpump_dir2 and contains the DDL that could be executed in another database to create the tables and indexes as desired.
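Once generated, the script can be reviewed, edited, and then run against the target database, for example from SQL*Plus. The OS path shown is hypothetical; in practice it is whatever path the dpump_dir2 directory object maps to:

>sqlplus username/password
SQL> @/u01/app/oracle/dpump_dir2/expfull.sql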
Interested in learning more about Data Pump? Then go through these two topics:
1. Data Pump Export
2. Data Pump Import