Import can also be used to load a target database directly from a source database with no intervening dump files. This allows export and import operations to run concurrently, minimizing total elapsed time. This is known as a network import.
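For example, assuming a database link named source_db and a directory object named dpump_dir1 already exist on the target (both names here are illustrative), a network import of the HR schema could be started like this:

impdp hr DIRECTORY=dpump_dir1 NETWORK_LINK=source_db SCHEMAS=hr LOGFILE=hr_net.log

Because NETWORK_LINK is specified, no DUMPFILE parameter is needed; the data is moved directly over the database link.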
Data Pump Import enables you to specify whether a job should move a subset of the data and metadata from the dump file set or from the source database (in the case of a network import), as determined by the import mode. This is done using data filters and metadata filters, which are implemented through Import parameters.
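As a sketch of both filter types (the schema, table, and file names are hypothetical), the following command uses a metadata filter (EXCLUDE) to skip index definitions and a data filter (QUERY) to restrict the rows loaded into the employees table:

impdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp EXCLUDE=INDEX QUERY=employees:"WHERE department_id > 50"

Note that on most operating-system shells the quotation marks in QUERY need escaping, so filters like this are usually placed in a parameter file instead.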
Invoking Data Pump Import
The Data Pump Import utility is invoked using the impdp command. The characteristics of the import operation are determined by the import parameters you specify. These parameters can be specified either on the command line or in a parameter file.
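For instance, a minimal parameter file (the file name and values are illustrative) might contain the following lines:

Contents of hr_imp.par:
schemas=hr
directory=dpump_dir1
dumpfile=hr.dmp
logfile=hr_imp.log

It would then be passed to the utility with:

impdp hr PARFILE=hr_imp.par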
Data Pump Import Interfaces
You can interact with Data Pump Import by using a command line, a parameter file, or an interactive-command mode.
1. Command-Line Interface: Enables you to specify the Import parameters directly on the command line. For a complete description of the parameters available in the command-line interface, see Parameters Available in Import's Command-Line Mode. The most commonly used parameters are summarized in the table below, with a combined example after the table.
| Parameter | Description | Default |
| --- | --- | --- |
| attach | Connects a client session to a currently running Data Pump import job. | |
| content | Filters what is imported: data_only, metadata_only, or all. | all |
| directory | Directory object for the destination of log and dump files. | data_pump_dir |
| dumpfile | Name of the dump file. | |
| estimate | Method used to estimate dump file size: blocks or statistics. | blocks |
| estimate_only | Y/N; instructs Data Pump to only estimate the space the data would consume rather than import it. | N |
| exclude | Excludes objects and data from being imported. | |
| flashback_scn | SCN of the source database to flash back to during the import (used with network_link). | |
| full | Y/N; imports all data and metadata in a full-mode import. | N |
| help | Y/N; displays a list of available commands and options. | N |
| include | Specifies which objects and data will be imported. | |
| job_name | Name of the job. | system generated |
| logfile | Name of the log file. | import.log |
| network_link | Source database link for a Data Pump job importing from a remote database. | |
| nologfile | Y/N flag used to suppress log file creation. | N |
| parallel | Sets the number of workers for the import job. | 1 |
| parfile | Names the parameter file to use. | |
| query | Filters rows from tables during the import. | |
| schemas | Names the schemas to be imported in a schema-mode import. | |
| remap_datafile | Changes the names of source datafiles. | |
| remap_tablespace | Changes the names of source tablespaces. | |
| remap_schema | Imports data into a different schema. | |
| skip_unusable_indexes | Y/N; specifies whether to skip loading tables whose indexes are in an unusable state. | |
| sqlfile | Name of the file to which the DDL of the import is written instead of being executed. | |
| table_exists_action | Instructs Import how to proceed if a table being imported already exists; values are skip, append, truncate, and replace. | skip |
| status | Displays detailed status of the Data Pump job. | |
| tables | List of tables and partitions to be imported in a table-mode import. | |
| tablespaces | List of tablespaces to import in tablespace mode. | |
| transport_full_check | Specifies whether the tablespaces being imported should be verified as a self-contained set. | |
| transport_tablespaces | Specifies a transportable-tablespace-mode import. | |
| transport_datafiles | List of datafiles to be imported during a transportable-tablespace-mode import. | |
| version | Specifies the version of database objects to be imported. | compatible |
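To tie several of these parameters together, here is an illustrative schema-mode import (all schema, file, and directory names are made up) that remaps the scott schema to scott_copy, appends rows to any tables that already exist, and runs two parallel workers:

impdp system DIRECTORY=data_pump_dir DUMPFILE=scott.dmp REMAP_SCHEMA=scott:scott_copy TABLE_EXISTS_ACTION=append PARALLEL=2 LOGFILE=scott_imp.log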
Continued in part 2.