2025-03-30 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/01 Report
What are the three tips for using Data Pump in Oracle? This article works through the corresponding analysis and solutions in detail, in the hope of helping readers who want to solve this problem find a simple, workable approach.
Data Pump is the data migration tool that Oracle introduced in 10g to replace the traditional exp/imp client utilities. Compared with exp/imp, Data Pump is a major improvement in both functionality and efficiency. This article records a few tips for using Data Pump, for reference by anyone who needs them.
1. Environment introduction and setup
We use Oracle 11gR2 for testing. Unlike the exp/imp tools, Data Pump can only run on the server side, and a directory object must additionally be created in the database.
[root@bspdev ~]# cd /
[root@bspdev /]# mkdir dumptest
[root@bspdev /]# ls -l | grep dumptest
drwxr-xr-x 2 root root 4096 Aug 12 03:25 dumptest
[root@bspdev /]# chown -R oracle:oinstall dumptest/
[root@bspdev /]# ls -l | grep dumptest
drwxr-xr-x 2 oracle oinstall 4096 Aug 12 03:25 dumptest
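The directory preparation above can be sketched as follows for a quick local dry run. The /tmp path is a hypothetical stand-in for /dumptest, and the chown to oracle:oinstall is left commented out since it requires root:

```shell
# Stand-in for the article's /dumptest directory (hypothetical path)
DUMP_DIR=/tmp/dumptest_demo
mkdir -p "$DUMP_DIR"
chmod 755 "$DUMP_DIR"                  # same rwxr-xr-x mode as in the listing
# chown -R oracle:oinstall "$DUMP_DIR" # requires root; the oracle OS user must own it
ls -ld "$DUMP_DIR"
```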
Create a directory object in Oracle.
SQL> show user
User is "SYS"
SQL> create directory dumptest as '/dumptest';
Directory created
First use expdp to create a dump file as the experimental object.
[oracle@bspdev dumptest]$ expdp \"/ as sysdba\" directory=dumptest dumpfile=scott.dmp schemas=scott
Export: Release 11.2.0.1.0-Production on Mon Aug 12 03:33:36 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0-Production
(output omitted for brevity)
Master table "SYS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYS.SYS_EXPORT_SCHEMA_01 is:
  /dumptest/scott.dmp
Job "SYS"."SYS_EXPORT_SCHEMA_01" successfully completed at 03:34:46
[oracle@bspdev dumptest]$ ls -l
total 107240
-rw-r--r-- 1 oracle oinstall      2501 Aug 12 03:34 export.log
-rw-r----- 1 oracle oinstall 109694976 Aug 12 03:34 scott.dmp
2. Use the SQLFILE parameter to extract the DDL of data objects
SQLFILE
Write all the SQL DDL to a specified file.
If we set the sqlfile parameter in impdp, the contents of the dump file are not imported into the database; instead, they are written out to a text file as SQL statements. Our test is as follows:
[oracle@bspdev dumptest]$ impdp \"/ as sysdba\" dumpfile=scott.dmp content=metadata_only sqlfile=scott_ddl.sql directory=dumptest
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0-Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "SYS"."SYS_SQL_FILE_FULL_01" successfully loaded/unloaded
(output omitted for brevity)
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "SYS"."SYS_SQL_FILE_FULL_01" successfully completed at 03:39:26
In the directory, we can see the generated SQL file.
[oracle@bspdev dumptest]$ ls -l
total 107320
-rw-r--r-- 1 oracle oinstall      2501 Aug 12 03:34 export.log
-rw-r--r-- 1 oracle oinstall      1290 Aug 12 03:39 import.log
-rw-r--r-- 1 oracle oinstall     72209 Aug 12 03:39 scott_ddl.sql
-rw-r----- 1 oracle oinstall 109694976 Aug 12 03:34 scott.dmp
A segment of the file is as follows:
-- CONNECT SYS
ALTER SESSION SET EVENTS '10150 TRACE NAME CONTEXT FOREVER, LEVEL 1';
ALTER SESSION SET EVENTS '10904 TRACE NAME CONTEXT FOREVER, LEVEL 1';
ALTER SESSION SET EVENTS '25475 TRACE NAME CONTEXT FOREVER, LEVEL 1';
ALTER SESSION SET EVENTS '10407 TRACE NAME CONTEXT FOREVER, LEVEL 1';
ALTER SESSION SET EVENTS '10851 TRACE NAME CONTEXT FOREVER, LEVEL 1';
ALTER SESSION SET EVENTS '22830 TRACE NAME CONTEXT FOREVER, LEVEL 192';
-- new object type path: SCHEMA_EXPORT/USER
-- CONNECT SYSTEM
CREATE USER "SCOTT" IDENTIFIED BY VALUES 'sCB685EC52372362B49B7EC43AB0B04BCAF2C71F283C5A558FF8E430F5365There F894844C34402B67'
DEFAULT TABLESPACE "USERS"
TEMPORARY TABLESPACE "TEMP"
-- new object type path: SCHEMA_EXPORT/SYSTEM_GRANT
GRANT EXECUTE ANY PROCEDURE TO "SCOTT";
GRANT CREATE VIEW TO "SCOTT";
In the SQL text we can find all the corresponding information and SQL statements. Of course, we can also extract the DDL for specific table objects only.
[oracle@bspdev dumptest]$ impdp \"/ as sysdba\" dumpfile=scott.dmp content=metadata_only sqlfile=scott_ddl_single_tb.sql directory=dumptest tables=scott.emp,scott.dept
Import: Release 11.2.0.1.0-Production on Mon Aug 12 03:49:07 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0-Production
(output omitted for brevity)
Job "SYS"."SYS_SQL_FILE_TABLE_01" successfully completed at 03:49:09
[oracle@bspdev dumptest]$ ls -l
total 107332
-rw-r--r-- 1 oracle oinstall      2501 Aug 12 03:34 export.log
-rw-r--r-- 1 oracle oinstall      1031 Aug 12 03:49 import.log
-rw-r--r-- 1 oracle oinstall      9031 Aug 12 03:49 scott_ddl_single_tb.sql
-rw-r--r-- 1 oracle oinstall     72209 Aug 12 03:39 scott_ddl.sql
-rw-r----- 1 oracle oinstall 109694976 Aug 12 03:34 scott.dmp
In the text, we can easily find the CREATE TABLE statements.
-- new object type path: SCHEMA_EXPORT/TABLE/TABLE
CREATE TABLE "SCOTT"."DEPT"
   (    "DEPTNO" NUMBER(2,0),
        "DNAME" VARCHAR2(14 BYTE),
        "LOC" VARCHAR2(13 BYTE)
   ) SEGMENT CREATION IMMEDIATE
  PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
  STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
  PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)
  TABLESPACE "USERS" ;
CREATE TABLE "SCOTT"."EMP"
   (    "EMPNO" NUMBER(4,0),
        "ENAME" VARCHAR2(10 BYTE),
        "JOB" VARCHAR2(9 BYTE),
        "MGR" NUMBER(4,0),
        "HIREDATE" DATE,
        "SAL" NUMBER(7,2),
        "COMM" NUMBER(7,2),
        "DEPTNO" NUMBER(2,0)
   ) SEGMENT CREATION IMMEDIATE
  PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
  STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
  PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)
  TABLESPACE "USERS" ;
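A single statement can also be pulled out of an already generated sqlfile with plain text tools, without rerunning impdp. A minimal sketch, using a small synthetic file in place of the real scott_ddl.sql (file names and contents are illustrative):

```shell
# Synthetic stand-in for scott_ddl.sql (hypothetical content)
cat > /tmp/sample_ddl.sql <<'EOF'
CREATE TABLE "SCOTT"."DEPT"
   (    "DEPTNO" NUMBER(2,0),
        "DNAME" VARCHAR2(14 BYTE)
   ) TABLESPACE "USERS" ;
CREATE TABLE "SCOTT"."EMP"
   (    "EMPNO" NUMBER(4,0),
        "ENAME" VARCHAR2(10 BYTE)
   ) TABLESPACE "USERS" ;
EOF
# Print only the EMP statement: start at its CREATE TABLE line,
# stop after the line that terminates the statement with ';'
awk '/CREATE TABLE "SCOTT"\."EMP"/{p=1} p{print} p && /;[[:space:]]*$/{p=0}' \
    /tmp/sample_ddl.sql > /tmp/emp_only.sql
cat /tmp/emp_only.sql
```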
So in what scenarios do we use sqlfile? In the author's experience, two scenarios are a very good fit.
The first is character set conversion problems. Because Data Pump runs on the server side, character set issues are rare in principle, unlike exp/imp, which requires setting the character set environment variable during processing. However, when exporting and importing between single-byte and multibyte character sets, the column lengths of the data tables may no longer fit the data. In that case we need to create the tables manually with adjusted column lengths. Using SQLFILE to generate the DDL statements makes this convenient: adjust the statements, then create the objects directly.
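As an illustration of the length adjustment, one common fix when moving from a single-byte to a multibyte character set is to switch VARCHAR2 columns from byte to character semantics. A sketch with sed on a synthetic DDL fragment (file names are hypothetical, and whether BYTE-to-CHAR is the right fix depends on your schema):

```shell
# Synthetic DDL fragment standing in for the extracted sqlfile
cat > /tmp/ddl_charset.sql <<'EOF'
CREATE TABLE "SCOTT"."DEPT"
   (    "DNAME" VARCHAR2(14 BYTE),
        "LOC" VARCHAR2(13 BYTE)
   ) TABLESPACE "USERS" ;
EOF
# Rewrite byte semantics to character semantics so multibyte strings still fit
sed 's/VARCHAR2(\([0-9][0-9]*\) BYTE)/VARCHAR2(\1 CHAR)/g' \
    /tmp/ddl_charset.sql > /tmp/ddl_charset_fixed.sql
cat /tmp/ddl_charset_fixed.sql
```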
The second is differences in version features. In the table DDL we can see many table annotations, which are the property values of the table's attributes. Different Oracle versions and editions, such as Enterprise and Standard, differ in the features they support. So if a feature used in the source database is not available in the target database, we can use this parameter to output the DDL statements and then adjust the table DDL to fit the target version.
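As a sketch of the feature-mismatch case: storage attributes the target may not understand, such as the flash cache clauses, can be stripped from the extracted DDL before running it. The file and the clause list below are illustrative; check your own target version for what actually needs removing:

```shell
# Synthetic DDL fragment with attributes a target edition may reject
cat > /tmp/ddl_feat.sql <<'EOF'
   ) SEGMENT CREATION IMMEDIATE
  PCTFREE 10 NOCOMPRESS LOGGING
  STORAGE(INITIAL 65536 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)
  TABLESPACE "USERS" ;
EOF
# Drop attributes the target version/edition may not support
sed -e 's/ FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT//' \
    -e 's/SEGMENT CREATION IMMEDIATE//' \
    /tmp/ddl_feat.sql > /tmp/ddl_feat_fixed.sql
cat /tmp/ddl_feat_fixed.sql
```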
SQLFILE is a very convenient tool parameter.
3. Parfile parameter file
Like exp/imp, Data Pump supports three working modes: full database mode (Full), schema mode (Schemas), and table mode (Tables).
The available parameter combinations differ among the modes. Moreover, with complex parameter conditions, such as the query, include, and exclude parameters, the expdp/impdp command line becomes very long. In such cases it is convenient to put the parameters in a key=value parameter file.
In the official document, the interpretation of parfile is as follows:
PARFILE
Specify parameter file name.
We define a text parameter file par.txt.
[oracle@bspdev dumptest]$ cat par.txt
directory=dumptest
schemas=scott
dumpfile=scott_par.dmp
query=scott.emp:"where 1=0"
The parameters are saved as key=value pairs. Notice the difference in the query section compared with the command line: in a parfile we do not need to worry about shell character escaping.
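Building the parfile from a heredoc keeps the query predicate free of shell escaping entirely; a sketch (the /tmp path is hypothetical, and the expdp call is shown only as a comment since it needs a database server):

```shell
# Write the parameter file; no backslash escaping is needed inside it
cat > /tmp/par.txt <<'EOF'
directory=dumptest
schemas=scott
dumpfile=scott_par.dmp
query=scott.emp:"where 1=0"
EOF
cat /tmp/par.txt
# Then, on the database server, run for example:
# expdp \"/ as sysdba\" parfile=/tmp/par.txt
```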
[oracle@bspdev dumptest]$ expdp \"/ as sysdba\" parfile=par.txt
Export: Release 11.2.0.1.0-Production on Mon Aug 12 05:31:36 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0-Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYS"."SYS_EXPORT_SCHEMA_01": "/******** AS SYSDBA" parfile=par.txt
(output omitted for brevity)
Master table "SYS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYS.SYS_EXPORT_SCHEMA_01 is:
  /dumptest/scott_par.dmp
Job "SYS"."SYS_EXPORT_SCHEMA_01" successfully completed at 05:32:05
When our Data Pump parameters become complex, we can consider using parfile to store some of them.
4. Query parameters
Tables and schemas define the scope of objects, and include and exclude define the types of objects to operate on, while query defines which rows of a data table (by setting conditions) will be imported or exported.
The following is the official definition of the query parameter.
QUERY
Predicate clause used to export a subset of a table.
For example, QUERY=employees:"WHERE department_id > 10".
We try to import the exported scott data into a user, test, that does not yet exist. Note: Data Pump creates the missing objects automatically.
[oracle@bspdev dumptest]$ impdp \"/ as sysdba\" directory=dumptest remap_schema=scott:test dumpfile=scott.dmp query=scott.t_com:\"where 1=0\"
Import: Release 11.2.0.1.0-Production on Mon Aug 12 04:07:58 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0-Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "SYS"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "SYS"."SYS_IMPORT_FULL_01": "/******** AS SYSDBA" directory=dumptest remap_schema=scott:test dumpfile=scott.dmp query=scott.t_com:"where 1=0"
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "TEST"."T_COM"                  103.3 MB  0 out of 1163088 rows
(output omitted for brevity)
Job "SYS"."SYS_IMPORT_FULL_01" completed with 1 error(s) at 04:08:03
Pay attention to a few points:
First, if we set parameters on the command line, we need to handle escape characters: some characters, such as quotation marks, must be escaped with \.
In addition, for data tables whose rows we do not want to import, we can set a condition such as 1=0 in query, which gives us a flexible processing scheme.
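Per-table query clauses combine naturally in a parfile: the 1=0 predicate skips one table's rows entirely while another table gets a real filter. A hypothetical sketch (the deptno filter and file names are illustrative, not from the article's run):

```shell
# Hypothetical import parfile mixing a row-skipping and a row-filtering clause
cat > /tmp/imp_par.txt <<'EOF'
directory=dumptest
dumpfile=scott.dmp
remap_schema=scott:test
query=scott.t_com:"where 1=0"
query=scott.emp:"where deptno = 10"
EOF
# Each query clause is scoped to one table by its schema.table: prefix
grep -c '^query=' /tmp/imp_par.txt
```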
Let's look at the actual result, which shows that t_com under test contains no data.
SQL> select count(*) from scott.t_com;

  COUNT(*)
----------
   1163088

SQL> select count(*) from test.t_com;

  COUNT(*)
----------
         0
Finally, consider a question: if we remap scott to test, how does the test user log in? Which password is used?
SQL> conn test/tiger@wilson
Connected to Oracle Database 11g Enterprise Edition Release 11.2.0.1.0
Connected as test
The password is the same as scott's: the import recreates the user with scott's password hash (the IDENTIFIED BY VALUES clause in the DDL), so the original password carries over.
Oracle's Data Pump is now the default data import and export tool. Understanding its parameter list and tool features can solve many problems in practical work.
This is the answer to the question of what the three tips for using Data Pump in Oracle are. I hope the above content is of some help to you.
© 2024 shulou.com SLNews company. All rights reserved.