How to optimize SQL statements when MySQL inserts data in bulk

2025-03-28 Update | SLTechnology News&Howtos > Database


Shulou (Shulou.com) 05/31 Report

This article explains how to optimize SQL statements when inserting large amounts of data into MySQL. The techniques are simple and easy to apply; MyISAM and InnoDB tables are covered in turn.

First, for MyISAM tables, a large amount of data can be imported quickly as follows:

ALTER TABLE tblname DISABLE KEYS;

(load the data)

ALTER TABLE tblname ENABLE KEYS;

These two commands turn updates to the table's non-unique indexes off and back on. When importing a large amount of data into a non-empty MyISAM table, disabling index updates before the import and re-enabling them afterwards lets MySQL rebuild the indexes in a single pass, which is much faster than updating them row by row. When importing into an empty MyISAM table, MySQL by default loads the data first and builds the indexes afterwards, so no setting is needed.
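A minimal sketch of the pattern above, assuming a hypothetical non-empty MyISAM table `orders` loaded from a CSV file (table name, file path, and column layout are illustrative):

```sql
-- Skip per-row maintenance of non-unique indexes during the load.
ALTER TABLE orders DISABLE KEYS;

-- Bulk-load the data (LOAD DATA is typically much faster than
-- many individual INSERT statements).
LOAD DATA INFILE '/tmp/orders.csv'
INTO TABLE orders
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';

-- Rebuild the non-unique indexes in one pass.
ALTER TABLE orders ENABLE KEYS;
```

Note that unique indexes are still checked during the load; only non-unique index maintenance is deferred.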

Second, for InnoDB tables this approach does not improve import efficiency. For InnoDB tables, the following methods help:

① InnoDB tables are stored in primary-key order, so arranging the imported data in primary-key order can effectively improve import efficiency: pages fill sequentially instead of splitting. If an InnoDB table has no primary key, the system creates an internal hidden column as the row identifier by default; defining an explicit primary key lets you sort the incoming data to take advantage of this ordering.
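One way to apply this, sketched with hypothetical table and file names: export the source data already sorted by the target table's primary key, then load it in that order.

```sql
-- Export rows pre-sorted by the target table's primary key (id).
SELECT id, name, amount
FROM staging_orders
ORDER BY id                      -- matches the PRIMARY KEY of `orders`
INTO OUTFILE '/tmp/sorted_orders.csv'
FIELDS TERMINATED BY ',';

-- Rows now arrive in ascending key order, so InnoDB appends to the
-- clustered index instead of splitting pages on random insertion points.
LOAD DATA INFILE '/tmp/sorted_orders.csv'
INTO TABLE orders
FIELDS TERMINATED BY ',';
```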

② Execute SET UNIQUE_CHECKS=0 before importing to turn off uniqueness checks, and SET UNIQUE_CHECKS=1 after the import to restore them. This can improve import efficiency, but should only be used when the data is known to contain no duplicate keys.

③ If autocommit is enabled, execute SET AUTOCOMMIT=0 before the import to disable it, and SET AUTOCOMMIT=1 after the import to re-enable it. Committing the whole import as one transaction avoids a log flush for every row, which also improves import efficiency.
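Points ② and ③ are session settings and are often combined in one import script; a minimal sketch (the bulk statements themselves are elided):

```sql
-- Only safe when the incoming data is known to be duplicate-free.
SET UNIQUE_CHECKS = 0;
-- Group the whole import into one transaction.
SET AUTOCOMMIT = 0;

-- ... bulk INSERT / LOAD DATA statements here ...

COMMIT;                 -- one commit for the whole import
SET UNIQUE_CHECKS = 1;  -- restore normal checking
SET AUTOCOMMIT = 1;     -- restore per-statement commits
```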

Thank you for reading. That covers how to optimize SQL statements when MySQL inserts data in bulk; after studying this article, you should have a deeper understanding of the topic.
