I. Advantages of data pump technology
The original export and import utilities (exp/imp) are client-based, while data pump technology is server-based: by default, all dump, log, and other files are created on the server. The main advantages of data pump technology are:
1. Improved performance
2. The ability to restart jobs
3. The ability to execute jobs in parallel
4. The ability to attach to a running job (see the sketch after this list)
5. The ability to estimate space requirements
6. Network mode of operation
7. Fine-grained data import capability
8. Remapping capabilities
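For example, a long-running or stopped export job can be attached to and controlled from the interactive prompt. A minimal sketch, assuming the default system-generated job name SYS_EXPORT_TABLE_01 and the c##tom test account created later in section V:
expdp c##tom/abc123 attach=SYS_EXPORT_TABLE_01    # attach to a running or stopped job by name
Export> status                                    # show the job's current progress
Export> stop_job=immediate                        # pause the job; re-attach later and run start_job to resume it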
II. Uses of data pump export and import
1. Transfer data from the development environment to the test or production environment
2. Make a backup before modifying an important table
3. Back up the database
4. Move database objects from one tablespace to another
5. Migrate tablespaces directly between databases
6. Extract the DDL of a table or other object
Note: a data pump export is not a complete database backup, because in the event of a disaster the export file will not contain the most recent data. However, for smaller databases and individual tablespace exports, a data pump export is still a viable backup tool.
III. Components of the data pump
Data pump technology consists of three main components:
The DBMS_DATAPUMP package. This is the main engine that drives the loading and unloading of data and metadata. It contains the core of data pump technology in the form of procedures that actually perform the data loading and unloading (see the PL/SQL sketch after this list).
The DBMS_METADATA package. Oracle provides this package for extracting and modifying metadata.
The command-line clients. The two utilities expdp and impdp are used for export and import respectively.
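As an illustration of the DBMS_DATAPUMP engine, the following minimal PL/SQL sketch (not part of the original walkthrough) exports the two test tables created later in section V; the file name tom_api.dmp is an assumed example:
DECLARE
  h     NUMBER;
  state VARCHAR2(30);
BEGIN
  -- open a table-mode export job
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'TABLE');
  -- write the dump file into the dump_dir directory object
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'tom_api.dmp', directory => 'DUMP_DIR');
  -- restrict the job to the two test tables
  DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'NAME_EXPR', value => 'IN (''INFO'',''GRADE'')');
  DBMS_DATAPUMP.START_JOB(h);
  -- block until the job finishes and report its final state
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, state);
  DBMS_OUTPUT.PUT_LINE('job state: ' || state);
END;
/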
IV. Data pump files
A data pump job can work with three types of files:
Dump files: store the actual data and metadata being exported.
Log files: record the messages and results of data pump operations.
SQL files: using the special sqlfile parameter, the data pump writes all of the DDL statements that an import job would execute to a file. The data pump does not actually execute that SQL; it simply writes the DDL statements to the file specified by the SQLFILE parameter (see the example after this list).
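As an example of the sqlfile parameter, the following sketch assumes the tom.dmp dump file produced in section V and an output file name tom_ddl.sql chosen for illustration:
impdp c##tom/abc123 directory=dump_dir dumpfile=tom.dmp sqlfile=tom_ddl.sql
# nothing is imported; the DDL contained in tom.dmp is written to /opt/app/bak/tom_ddl.sql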
V. Operation
chmod -R 777 /opt                                     # as root, open up permissions on the /opt directory
su - oracle                                           # switch to the oracle user
mkdir /opt/app/bak                                    # create a directory for the data pump backup files
sqlplus / as sysdba                                   # log in as an administrator
SQL> create directory dump_dir as '/opt/app/bak';     # create a directory object dump_dir mapped to /opt/app/bak
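To confirm that the directory object exists and where it points, the dba_directories dictionary view can be queried (a check not shown in the original steps):
SQL> select directory_name, directory_path from dba_directories;    # verify the new directory object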
SQL> create user c##tom identified by abc123          # create the c##tom test user
  2  default tablespace users
  3  temporary tablespace temp
  4  quota unlimited on users;
SQL> grant connect,resource,dba to c##tom;                # grant the connect, resource, and dba roles to c##tom
SQL> grant read,write on directory dump_dir to c##tom;    # grant the user read and write on the directory object
SQL> conn c##tom/abc123                                   # connect as the c##tom user
Create test data as the c##tom user: two tables, info and grade.
create table info
(
id number(5),
name varchar(10)
);
insert into info values (1,'zhangsan');               # insert a row of test data
create table grade
(
id number(5),
name varchar(10),
score varchar(5)
);
insert into grade values (1,'zhangsan','90');         # insert a row of test data into the grade table
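The walkthrough does not show a commit; since expdp runs in its own session and exports only committed data, a commit is assumed here before exporting:
SQL> commit;    # make the test rows visible to the export session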
Export data
expdp c##tom/abc123 directory=dump_dir dumpfile=tom.dmp tables=info,grade
# export the info and grade tables under the c##tom user (run from the OS shell, not from SQL*Plus)
Parameter explanation:
directory    # specifies the directory object, i.e. the alias that maps to the path where the exported data is written
dumpfile     # specifies the name of the dump file; the default name is expdat.dmp
tables       # specifies a table-mode export, exporting only the listed tables
View the exported data files: ls /opt/app/bak/
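Besides the table-mode export above, an entire schema can be exported in one job. A sketch assuming the same account, with illustrative file names and the schemas, logfile, and parallel parameters (the %U substitution variable generates numbered dump files):
expdp c##tom/abc123 directory=dump_dir dumpfile=tom_schema_%U.dmp schemas=c##tom logfile=tom_schema.log parallel=2
# exports every object in the c##tom schema, writing the dump files and a log file to /opt/app/bak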
Import data
sqlplus c##tom/abc123                                 # log in as the c##tom user
SQL> drop table info;                                 # simulate a failure by dropping one of the tables
Restore data:
impdp c##tom/abc123 directory=dump_dir dumpfile=tom.dmp tables=info    # restore the info table and its data
sqlplus c##tom/abc123                                 # log in as the c##tom user again
SQL> select table_name from user_tables;              # check whether the recovery succeeded
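The remapping capability mentioned in section I can also be applied during import. A sketch assuming a hypothetical, already-existing target user c##jerry (importing into another schema also requires the imp_full_database privilege, which c##tom has through the dba role):
impdp c##tom/abc123 directory=dump_dir dumpfile=tom.dmp tables=info remap_schema=c##tom:c##jerry table_exists_action=replace
# loads the info table from the dump file into the c##jerry schema, replacing the table if it already exists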
VI. Summary
Points to note when using EXPDP and IMPDP with the data pump:
1. EXP and IMP are client-side tool programs that can be used either on the client side or on the server side.
2. EXPDP and IMPDP are server-side tools, which can only be used on the ORACLE server, not on the client side.
3. IMP only applies to EXP exported files, not EXPDP exported files; IMPDP only applies to EXPDP exported files, not EXP exported files.
4. When running the expdp or impdp command, you may omit the username/password@instance_name credentials on the command line and enter them when prompted instead, for example:
expdp schemas=scott dumpfile=expdp.dmp directory=dir