Migrating Millions of Rows of MySQL Data: An Example

2025-01-28 Update From: SLTechnology News&Howtos


This article walks through a practical example of migrating millions of rows of MySQL data. It is shared here for reference; I hope you find it useful.

Consider this scenario: we developed a Mini Program and, riding the popularity of a Singles' Day promotion, accumulated more than one million users within a month. The Mini Program pages included a formid collection point, used to send template-message notifications to WeChat users.

As the data grew, the original server began to run out of disk space. I had recently written a new framework for Mini Program back-end development, so I wanted to move the existing data into the new system's database. I bought a 4-core, 8 GB machine and set about the migration. Below is a brief record of the process.

mysqldump migration

In day-to-day development, the usual backup-and-migrate approach is to export the data to a .sql file with the mysqldump tool, then import that file into the new database to complete the migration.

In practice, exporting the million-row database with mysqldump took a few minutes and produced a .sql file of roughly 1 GB. Copying that file to the other server with scp took a few more minutes. But importing it on the new server with the source command ran all night without finishing, with the CPU pegged the whole time.
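One reason replaying a dump statement by statement can crawl is per-statement overhead; batching inserts amortizes it. The sketch below is only an illustration of that effect, using Python's built-in sqlite3 as a stand-in for MySQL (an assumption made so the example is self-contained and runnable); the table and column names are hypothetical.

```python
import sqlite3
import time

def timed_insert(rows, batched):
    """Insert rows into an in-memory table, either one statement at a
    time (like replaying a dump) or in a single batched call."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE fans (id INTEGER, openid TEXT)")
    start = time.perf_counter()
    if batched:
        conn.executemany("INSERT INTO fans VALUES (?, ?)", rows)
    else:
        for r in rows:
            conn.execute("INSERT INTO fans VALUES (?, ?)", r)
    conn.commit()
    elapsed = time.perf_counter() - start
    count = conn.execute("SELECT COUNT(*) FROM fans").fetchone()[0]
    conn.close()
    return count, elapsed

rows = [(i, f"openid_{i}") for i in range(100_000)]
n_single, t_single = timed_insert(rows, batched=False)
n_batch, t_batch = timed_insert(rows, batched=True)
print(n_single, n_batch)  # 100000 100000; the batched run is typically much faster
```

Real MySQL imports have additional costs (index maintenance, binary logging, network round trips), so this only gestures at why a naive statement-by-statement replay of a 1 GB dump can take all night.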

Script migration

Exporting and importing from the command line is convenient, but with a large data set it is time-consuming and demands a lot from the server. If the migration deadline is not tight, you can instead write a script to move the data. Although I did not actually try this route, I sketched out two scripting schemes.

The first approach: run a migration script on the target server that connects remotely to the source database, reads the source data in chunks under suitable query conditions, and writes each chunk to the target database as it arrives. This may be slow, because export and import form a synchronous pipeline: a chunk must be read before it can be written. With well-designed query conditions, several migration scripts can run in parallel (multi-threaded) to speed things up.
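A minimal sketch of this chunked read-then-write loop is shown below. It uses Python's stdlib sqlite3 with two in-memory databases standing in for the remote source and target MySQL servers (an assumption for runnability), and a hypothetical fans(id, openid) table. Keyset pagination (WHERE id > last_id) is used rather than OFFSET so each chunk read stays cheap even deep into the table.

```python
import sqlite3

def migrate_in_chunks(src, dst, chunk_size=1000):
    """Copy rows from src.fans to dst.fans in fixed-size chunks,
    committing after each chunk so progress is durable."""
    last_id = 0
    while True:
        rows = src.execute(
            "SELECT id, openid FROM fans WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, chunk_size),
        ).fetchall()
        if not rows:
            break
        dst.executemany("INSERT INTO fans VALUES (?, ?)", rows)
        dst.commit()
        last_id = rows[-1][0]  # resume point for the next chunk

# Demo: two in-memory databases stand in for the two servers.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for db in (src, dst):
    db.execute("CREATE TABLE fans (id INTEGER PRIMARY KEY, openid TEXT)")
src.executemany("INSERT INTO fans VALUES (?, ?)",
                [(i, f"openid_{i}") for i in range(1, 5001)])
src.commit()

migrate_in_chunks(src, dst, chunk_size=500)
print(dst.execute("SELECT COUNT(*) FROM fans").fetchone()[0])  # 5000
```

To parallelize as described above, each worker would take a disjoint id range and run the same loop over its own connection pair.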

The second approach: combine it with Redis to build a "producer + consumer" migration pipeline. The source server acts as the producer, running a multi-threaded script that reads the database in parallel and pushes rows onto a Redis queue. The target server acts as the consumer, running its own multi-threaded script that connects to Redis remotely, pops rows off the queue in parallel, and writes them to the target database. Compared with the first approach this is asynchronous: export and import proceed at the same time, with Redis as the staging area, so efficiency improves considerably.
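The shape of that pipeline can be sketched as below. To keep the example self-contained, a thread-safe queue.Queue stands in for the Redis queue and a plain list stands in for the target database; both substitutions, and all names, are assumptions for illustration, not the article's actual setup.

```python
import queue
import threading

SENTINEL = None  # pushed once per consumer to signal shutdown

def producer(rows, q):
    """Source side: read rows (here an in-memory list standing in for
    the source database) and push them onto the queue."""
    for row in rows:
        q.put(row)

def consumer(q, sink, lock):
    """Target side: pop rows off the queue and write them to the
    target store (a lock-guarded list in this sketch)."""
    while True:
        row = q.get()
        if row is SENTINEL:
            break
        with lock:
            sink.append(row)

q = queue.Queue(maxsize=1000)          # stands in for the Redis queue
sink, lock = [], threading.Lock()
rows = [(i, f"openid_{i}") for i in range(10_000)]

consumers = [threading.Thread(target=consumer, args=(q, sink, lock))
             for _ in range(4)]
for t in consumers:
    t.start()
producer(rows, q)                      # export and import overlap here
for _ in consumers:
    q.put(SENTINEL)
for t in consumers:
    t.join()
print(len(sink))  # 10000
```

The bounded queue also gives back-pressure: if the consumers fall behind, the producer blocks instead of flooding the staging area.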

The migration scripts could also be written in Go, whose native concurrency support makes parallel migration straightforward and efficient.

File migration

The first migration scheme is too slow, and the second carries a high coding cost. After comparing material found online, I finally settled on MySQL's own file export and import statements:

select ... into outfile 'file.txt';
load data infile 'file.txt' into table ...;

and completed the million-row migration by exporting and importing a data file.

Migration process

Export the data file on the source database

select * from dc_mp_fans into outfile '/data/fans.txt';

Copy the data file to the target server

zip fans.zip /data/fans.txt
scp fans.zip root@ip:/data/

Import the file on the target database

unzip /data/fans.zip
load data infile '/data/fans.txt' into table wxa_fans (id,appid,openid,unionid,@dummy,created_at,@dummy,nickname,gender,avatar_url,@dummy,@dummy,@dummy,@dummy,language,country,province,city,@dummy,@dummy,@dummy,@dummy,@dummy,@dummy,@dummy,@dummy,@dummy);

Following these steps, the cross-server migration of a table with millions of rows was completed in a few minutes.

Things to watch out for

MySQL security settings

To run load data infile and select ... into outfile statements, MySQL's secure_file_priv option must permit them. You can check its current value with show global variables like '%secure%';. The default value NULL disallows import and export statements entirely.

Edit the MySQL configuration with vim /etc/my.cnf and set secure_file_priv to an empty string:

[mysqld]
secure_file_priv=''

After restarting MySQL, the import and export statements can be used. (An empty value allows access to any directory; to restrict exports and imports to a single directory, set secure_file_priv to that directory's path instead.)

When source and target table fields do not match

The example above migrates data from the dc_mp_fans table in the source database to the wxa_fans table in the target database, and the two tables' column layouts differ.

When importing, list the column names in the load data statement so each field in the file maps to the correct target column, and discard unwanted fields by assigning them to the @dummy user variable, as in the load data infile statement shown above.

That concludes this example of migrating millions of rows of MySQL data. I hope it proves helpful; if you found it useful, please share it with others.
