Common parameters of EXPDP/IMPDP in ORACLE database

2025-03-30 Update From: SLTechnology News&Howtos


This article introduces how to export and import an Oracle database with EXPDP/IMPDP, the Data Pump utilities.

Import and export privilege management

To back up and restore a database with a dedicated user, grant that user the export privilege exp_full_database and the import privilege imp_full_database. (The rest of this article uses system, which has these privileges by default.)

Check whether scott has the import and export privileges

SELECT * FROM dba_role_privs WHERE grantee = 'SCOTT';

Grant scott the import and export privileges

GRANT exp_full_database, imp_full_database TO scott;

Revoke scott's import and export privileges

REVOKE exp_full_database, imp_full_database FROM scott;

Directory management

Before covering Oracle import and export, let's introduce the directory object. A directory mainly specifies a storage path; query the DBA_DIRECTORIES view to see directory information. Below are the basic administrative commands for directory objects.

Create a directory named zhanky, corresponding to the path G:\zhanky

CREATE OR REPLACE DIRECTORY zhanky AS 'G:\zhanky';

View all directories

SELECT * FROM dba_directories;

Delete the specified directory

DROP DIRECTORY zhanky;

To make this easier to understand, think of the relationship between DIRECTORY and EXPDP/IMPDP this way: the directory object is created once, then referenced by every import and export.

Make sure the export user has the exp_full_database privilege (system and sys have it by default) and the import user has the imp_full_database privilege. Then create a directory object to hold the backup files.
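Putting the prerequisites together, the preparation might look like the sketch below, run as a DBA. The grants on the directory object are needed when a non-privileged user such as scott runs the export; the path is an assumption, not a value from this article.

```sql
-- Grant the Data Pump roles to the exporting/importing user.
GRANT exp_full_database, imp_full_database TO scott;

-- Create the directory object that expdp/impdp will reference.
CREATE OR REPLACE DIRECTORY zhanky AS 'G:\zhanky';

-- A non-privileged user also needs read/write on the directory object.
GRANT READ, WRITE ON DIRECTORY zhanky TO scott;
```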

Now let's get down to business. The import and export parameters are best understood through testing.

EXPDP export

A common daily backup: export the whole database to the zhanky directory, with the backup file named zhanky.dmp and the log file named zhanky_out.log.

expdp system/manager@zhanky directory=zhanky dumpfile=zhanky.dmp logfile=zhanky_out.log full=y

DIRECTORY: the directory object used to dump files and log files.

DUMPFILE: specifies the name of the exported backup file.

LOGFILE: specifies the name of the exported backup log. The information in the backup is recorded in it.

FULL: export the entire database (default N, which exports only the logged-in user's data).

Export 40% of the data under the sh and scott users, excluding the sales and emp tables, to the zhanky directory; the backup file is named zhanky2.dmp.

expdp system/manager@zhanky directory=zhanky dumpfile=zhanky2.dmp schemas=sh,scott exclude=TABLE:"IN ('SALES','EMP')" sample=40

(Depending on your shell, the quotes and parentheses in EXCLUDE may need to be escaped with backslashes.)

SCHEMAS: list of schemas to export (defaults to the login schema).

EXCLUDE: excludes specific object types (table names must be uppercase).

SAMPLE: the percentage of data to be exported.

Export the users and example tablespaces to the zhanky directory as version 11.2.0.1, using two parallel processes; the files are named zhanky1.dmp and zhanky2.dmp. If a target dump file already exists, it is overwritten.

expdp system/manager@zhanky directory=zhanky dumpfile=zhanky1.dmp,zhanky2.dmp tablespaces=users,example version=11.2.0.1 parallel=2 reuse_dumpfiles=y

TABLESPACES: identifies the list of tablespaces to export.

VERSION: specifies the version of the exported objects; generally used when a dump from a higher-version database will be imported into a lower-version one.

PARALLEL: changes the number of active workers for the current job.

REUSE_DUMPFILES: overwrite the target dump file if it exists (default: N).

Export the rows of the sh.sales table whose cust_id column is 987 to the zhanky directory; name the backup file zhanky3.dmp and the backup job zhankycs.

expdp system/manager@zhanky directory=zhanky dumpfile=zhanky3.dmp tables=sh.sales query=sales:"WHERE cust_id=987" job_name=zhankycs

TABLES: identifies the list of tables to export. For example, TABLES=HR.EMPLOYEES,SH.SALES:SALES_1995.

QUERY: predicate clause used to export a subset of a table. For example, QUERY=employees: "WHERE department_id > 10".

JOB_NAME: the name of the export job to be created.
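Naming a job with JOB_NAME also lets you reattach to it later from another session to monitor or control it. A minimal sketch, assuming the zhankycs job above is still running:

```shell
# Reattach to the running export job named zhankycs
expdp system/manager@zhanky attach=zhankycs

# At the Export> prompt, interactive commands such as STATUS,
# STOP_JOB, START_JOB, KILL_JOB, and CONTINUE_CLIENT manage the job.
```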

More export parameters can be listed with expdp -help.

IMPDP Import

The general IMPDP flow: first validate the user name, password, database, and other parameters; then create tablespaces, users, roles, and so on; finally import tables, indexes, and other objects. The critical step is creating the tablespaces and users. If a tablespace or user cannot be created, that tablespace's or user's data is lost in the new database, and the usual reason a user cannot be created is that its tablespace does not exist, so successful tablespace creation matters most.

When tablespaces are created automatically during import, they follow the parameters recorded in the backup file (that is, from the source database). If all those parameters are valid, the only things that can go wrong are storage space and file paths, so make sure the server has enough space and that the source database's data-file paths exist on the target server. When importing from Windows to Linux in a single-instance environment, the paths cannot be guaranteed to match, so it is recommended to create the tablespaces before importing. In a production environment, the company will usually have a script for creating tablespaces.
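If you pre-create the tablespace and user before importing, a minimal sketch looks like this. The tablespace name, data-file path, password, and quota below are illustrative assumptions, not values from this article:

```sql
-- Create the tablespace first so the import does not try to
-- recreate it with the source database's (Windows) file path.
CREATE TABLESPACE users2
  DATAFILE '/u01/app/oracle/oradata/zhanky/users2_01.dbf'
  SIZE 100M AUTOEXTEND ON NEXT 50M MAXSIZE UNLIMITED;

-- Create the owning user with that tablespace as its default,
-- so importing its objects succeeds.
CREATE USER zhanky IDENTIFIED BY zhanky
  DEFAULT TABLESPACE users2
  QUOTA UNLIMITED ON users2;

GRANT CONNECT, RESOURCE TO zhanky;
```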

For daily recovery, import all the data from the zhanky.dmp backup file in the zhanky directory into the zhanky library, writing the import log to zhanky_in.log.

impdp system/manager@zhanky directory=zhanky dumpfile=zhanky.dmp logfile=zhanky_in.log full=y

DIRECTORY: the directory object used for dump files, log files, and SQL files.

DUMPFILE: list of dump files to import from (default: expdat.dmp).

LOGFILE: log file name (default: import.log).

FULL: import all objects from the source (default: Y).

Import all data except the sales and emp tables of the sh and scott users from the zhanky.dmp backup file under the zhanky directory into the zhanky library, and name the import job zhankycs_in.

impdp system/manager@zhanky directory=zhanky dumpfile=zhanky.dmp schemas=sh,scott exclude=TABLE:"IN ('SALES','EMP')" job_name=zhankycs_in

SCHEMAS: list of schemas to import.

EXCLUDE: excludes specific object types, such as EXCLUDE=TABLE:EMP.

JOB_NAME: the name of the import job to be created.

Import the users and example tablespaces from the zhanky1.dmp and zhanky2.dmp backup files under the zhanky directory into the zhanky library, using two processes to import.

impdp system/manager@zhanky directory=zhanky dumpfile=zhanky1.dmp,zhanky2.dmp tablespaces=users,example reuse_datafiles=y parallel=2

TABLESPACES: identifies the list of tablespaces to import.

REUSE_DATAFILES: initialize the tablespace's data files if they already exist (default: N).

PARALLEL: changes the number of active workers for the current job.

Import the rows of the sh.sales table whose cust_id column value is 987 from the zhanky.dmp backup file under the zhanky directory into the zhanky library as version 10.2.0.4, replacing the sh.sales table if it already exists.

impdp system/manager@zhanky directory=zhanky dumpfile=zhanky.dmp tables=sh.sales query=sales:"WHERE cust_id=987" version=10.2.0.4 table_exists_action=replace

QUERY: predicate clause used to import a subset of a table.

VERSION: the version of objects to import; generally used when a dump from a higher-version database is imported into a lower-version one.

TABLES: identifies the list of tables to import.

TABLE_EXISTS_ACTION: the action taken when the imported object already exists. Valid keywords: SKIP (default, skip the table), APPEND (append rows), REPLACE (drop and recreate the table), and TRUNCATE (truncate the table, then append).

Import all the data from the zhanky.dmp backup file under the zhanky directory into the zhanky library, but map the users tablespace to the cs tablespace, the scott user's data to the zhanky user, and the sh.sales table to the cstable table.

impdp system/manager@zhanky directory=zhanky dumpfile=zhanky.dmp remap_tablespace=users:cs remap_schema=scott:zhanky remap_table=sh.sales:cstable full=y

REMAP_TABLESPACE: remaps tablespace objects to another tablespace.

REMAP_SCHEMA: loads objects from one schema into another.

REMAP_TABLE: remaps table names to another table. For example, REMAP_TABLE=EMP.EMPNO:REMAPPKG.EMPNO.

In fact, Data Pump import and export is quite simple: the key is understanding each parameter and then combining them as needed. That's all for now. If you have any questions, please feel free to leave a message.
