2025-04-04 Update From: SLTechnology News & Howtos
Shulou(Shulou.com)05/31 Report--
This article explains how to achieve parallel export and import in PostgreSQL without the dump landing on disk. The method is simple, fast, and practical.
Below version 9.4, migrate by using pg_dump for parallel export and pg_restore for parallel import.
(Use the source version's pg_dump for the export and the target version's pg_restore for the import. For PPAS, use the corresponding EnterpriseDB versions.)
1. (source database) Export the global metadata (users, tablespaces)
Superuser privileges are required. (If you lack this privilege, skip this step, but be sure to manually create in the target instance every user referenced by object permissions before performing the next step.)
pg_dumpall -g -h <IP address> -p <port> -U <user> -W -l <database name>
2. (target database) Import the global metadata
Execute the metadata exported above against the target instance (it usually includes creating users, setting user passwords, creating tablespaces, and so on).
The purpose of step 2 is to ensure the target users already exist when the import runs grant, alter ... owner to, and similar statements; otherwise missing users will cause pg_restore errors.
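Concretely, steps 1 and 2 can be wired together like this. A sketch only: the host addresses and the globals.sql file name are placeholders of my choosing, not from the original article.

```shell
# Step 1 (source): dump only global objects (roles, tablespaces) to a file.
# 10.0.0.1 is a placeholder for the source host.
pg_dumpall -g -h 10.0.0.1 -p 5432 -U postgres -W > globals.sql

# Step 2 (target): replay the globals before restoring any database,
# so later GRANT / ALTER ... OWNER TO statements find their users.
# 10.0.0.2 is a placeholder for the target host.
psql -h 10.0.0.2 -p 5432 -U postgres -W -d postgres -f globals.sql
```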
3. (target database) Create the database
postgres=# create database newdb;
CREATE DATABASE
4. (target database) Install extensions
When installing the PostgreSQL software on the target, include the extensions already used by the source database. (Details omitted.)
5. (source database) Export
mkdir /data01/pg/backup
pg_dump -j 32 -f /data01/pg/backup -F d -h <IP address> -p <port> -U <user> -W newdb
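Before moving on, the directory-format dump can be sanity-checked by listing its table of contents with `pg_restore -l`. This only reads the local dump directory; no database connection is needed.

```shell
# Print the TOC of the directory-format dump; each entry is one
# schema object or table-data item that the restore will process.
pg_restore -l /data01/pg/backup | head -n 20
```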
6. (target instance) Disable autovacuum to speed up the import (optional)
As a superuser, execute:
alter system set autovacuum = off;
select pg_reload_conf();
7. (target database) Import
pg_restore -d newdb -F d -j 32 -h <IP address> -p <port> -U <user> /data01/pg/backup
8. (target instance) Re-enable autovacuum (if step 6 was executed)
As a superuser, execute:
alter system set autovacuum = on;
select pg_reload_conf();
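Steps 6 and 8 can also be run non-interactively with `psql -c`. The host and user below are placeholders; note that passing several `-c` options in one invocation requires psql 9.6 or later (on older versions, run them one at a time).

```shell
# Before the import: stop autovacuum workers from competing with the load.
# 10.0.0.2 is a placeholder for the target host.
psql -h 10.0.0.2 -p 5432 -U postgres -d postgres \
     -c "alter system set autovacuum = off;" \
     -c "select pg_reload_conf();"

# After the import: turn autovacuum back on.
psql -h 10.0.0.2 -p 5432 -U postgres -d postgres \
     -c "alter system set autovacuum = on;" \
     -c "select pg_reload_conf();"
```

ALTER SYSTEM cannot run inside a transaction block, which is why each statement goes in its own `-c` rather than one combined string.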
9. (target instance) Collect statistics (if step 6 was executed)
Connect as a superuser and collect statistics:
\c newdb superuser
analyze;
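An alternative for step 9 is `vacuumdb`, which can analyze in parallel (the `-j` option of vacuumdb requires PostgreSQL 9.5 or later; the connection values are placeholders):

```shell
# Analyze-only pass over newdb using 32 parallel jobs.
vacuumdb --analyze-only -j 32 -h 10.0.0.2 -p 5432 -U postgres -d newdb
```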
Using the above method, migrating a 60 GB database takes about 10 minutes.
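The nine steps above can be strung into a single migration script, roughly like this. This is a sketch under assumed connection parameters (both hosts, the dump path, and the job count are placeholders); it assumes passwords come from ~/.pgpass, and error handling beyond `set -e` is omitted.

```shell
#!/bin/bash
set -e

SRC="-h 10.0.0.1 -p 5432 -U postgres"   # source instance (placeholder)
TGT="-h 10.0.0.2 -p 5432 -U postgres"   # target instance (placeholder)
DUMP_DIR=/data01/pg/backup
DB=newdb
JOBS=32

pg_dumpall -g $SRC > globals.sql                  # 1. export globals
psql $TGT -d postgres -f globals.sql              # 2. import globals
psql $TGT -d postgres -c "create database $DB;"   # 3. create database
mkdir -p "$DUMP_DIR"                              # 5. parallel export
pg_dump -j $JOBS -F d -f "$DUMP_DIR" $SRC $DB
psql $TGT -d postgres \
     -c "alter system set autovacuum = off;" \
     -c "select pg_reload_conf();"                # 6. disable autovacuum
pg_restore -d $DB -F d -j $JOBS $TGT "$DUMP_DIR"  # 7. parallel import
psql $TGT -d postgres \
     -c "alter system set autovacuum = on;" \
     -c "select pg_reload_conf();"                # 8. re-enable autovacuum
psql $TGT -d $DB -c "analyze;"                    # 9. collect statistics
```

Step 4 (installing the source database's extensions on the target) still has to be done by hand before running this.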
Multiple hosts

Where this gets interesting is with multiple hosts. You can:

$ # dump a remote database to your local machine
$ pg_dump -h remotedb.mydomain.com -f /home/postgres/dump.sql test

$ # dump a local database and write to a remote machine
$ pg_dump test | ssh postgres@remotedb.mydomain.com 'cat > dump.sql'

$ # dump a remote database and write to the same remote machine
$ pg_dump -h remotedb.mydomain.com test | ssh postgres@remotedb.mydomain.com 'cat > dump.sql'

$ # or a different remote machine
$ pg_dump -h remotedb1.mydomain.com test | ssh postgres@remotedb2.mydomain.com 'cat > dump.sql'

You also have similar restore options. I will use psql below, but pg_restore works the same:

$ # dump a remote database and restore to your local machine
$ pg_dump -h remotedb.mydomain.com test1 | psql test2

$ # dump a local database and restore to a remote machine
$ pg_dump test | ssh postgres@remotedb.mydomain.com 'psql test'

$ # dump a remote database and restore to the same remote machine
$ pg_dump -h remotedb.mydomain.com test1 | ssh postgres@remotedb.mydomain.com 'psql test2'

$ # or a different remote machine
$ pg_dump -h remotedb1.mydomain.com test | ssh postgres@remotedb2.mydomain.com 'psql test'
© 2024 shulou.com SLNews company. All rights reserved.