MySQL Limit performance optimization and paging data performance optimization

2025-02-27 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

MySQL's LIMIT clause queries database data in segments and is mainly used for paging. Even though many sites built today hold only thousands of rows, where small optimizations make little difference, development should still pursue the best possible performance. Below are some LIMIT performance optimization methods.

Limit syntax:

SELECT * FROM table LIMIT [offset,] rows | rows OFFSET offset

The LIMIT clause can be used to force a SELECT statement to return a specified number of records. LIMIT accepts one or two numeric arguments, which must be non-negative integer constants.

Given two arguments, the first specifies the offset of the first row to return, and the second specifies the maximum number of rows to return. The offset of the initial row is 0 (not 1).

The LIMIT row_count OFFSET offset syntax is also supported. Examples:

Mysql> SELECT * FROM table LIMIT 5,10; // retrieve record rows 6-15
// To retrieve all record rows from a certain offset to the end of the record set, use a very large number as the second parameter (some older documentation suggests -1, but MySQL does not accept a negative value here):
Mysql> SELECT * FROM table LIMIT 95,18446744073709551615; // retrieve record rows 96 to the last
// If only one parameter is given, it is the maximum number of record rows returned; LIMIT n is equivalent to LIMIT 0,n:
Mysql> SELECT * FROM table LIMIT 5; // retrieve the first 5 record rows
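The offset semantics above can be checked with a quick sketch. SQLite is used here as a stand-in for MySQL because it also accepts the comma form of LIMIT; the table name t and its contents are invented for illustration.

```python
import sqlite3

# In-memory table with ids 1..100 to illustrate LIMIT offset, count semantics.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO t (id) VALUES (?)", [(i,) for i in range(1, 101)])

# Skip 5 rows, return at most 10: rows 6 through 15 (the offset counts from 0).
rows = conn.execute("SELECT id FROM t ORDER BY id LIMIT 5, 10").fetchall()
print([r[0] for r in rows])  # ids 6..15
```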

LIMIT n,m selects m records starting after the n-th record. Most developers like to use such statements to solve the classic paging problem on the Web. For small-scale data this is not much of a problem, but for applications such as forums that may hold very large data sets, LIMIT n,m is very inefficient, because the skipped rows still have to be read every time. Selecting only the first five records is quick and easy; but to take five records starting from row 800,000 of a million-row table, the server must scan all the way to that position.

For example, LIMIT 10000,20 means scanning 10020 rows that meet the criteria, throwing away the first 10000 and returning the last 20; that is the problem. With LIMIT 100000,100, 100100 rows must be scanned. In a highly concurrent application where every query has to scan more than 100,000 rows, performance is bound to suffer badly.

Comparison of data reading efficiency with different amounts of data:

1. When the offset is relatively small:

Select * from table limit 5,10

Run it multiple times; the time stays between 0.0004 and 0.0005 seconds.

Select * From table Where id >= (Select id From table Order By id limit 10,1) limit 10

Run it several times; the time stays between 0.0005 and 0.0006 seconds. So when the offset is small, using LIMIT directly is more efficient!

2. When the offset is large:

Select * from table limit 10000,10

Run it many times; the time stays at about 0.0187 seconds.

Select * From table Where id >= (Select id From table Order By id limit 10000,1) limit 10

Run it many times; the time stays at about 0.0061 seconds, roughly 1/3 of the former. So when the offset is large, using the latter is more efficient! This is the result of using the index on id.
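The comparison above can be reproduced, at least for correctness of the returned rows, with a small sketch. SQLite stands in for MySQL and the table contents are invented, so only result equality (not the timings) is checked.

```python
import sqlite3

# 20,000 rows with an ordered primary key, so a deep page exists to fetch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(1, 20001)])

# Plain deep-offset page: skip 10,000 rows, take 10.
plain = conn.execute("SELECT id FROM t ORDER BY id LIMIT 10000, 10").fetchall()

# Rewritten form: find the starting id with an index-only subquery, then range-scan.
rewritten = conn.execute(
    "SELECT id FROM t WHERE id >= "
    "(SELECT id FROM t ORDER BY id LIMIT 10000, 1) "
    "ORDER BY id LIMIT 10").fetchall()

print(plain == rewritten)  # the two forms return the same page
```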

If you use id as the primary key of the data table:

Select id from table limit 10000,10

The query takes only about 0.04 seconds, because only the indexed id primary key column is read, so the index covers the query.

Limit performance optimization:

Select * From cyclopedia Where ID >= (Select Max(ID) From (Select ID From cyclopedia Order By ID limit 90001) As tmp) limit 100;
Select * From cyclopedia Where ID >= (Select Max(ID) From (Select ID From cyclopedia Order By ID limit 90000,1) As tmp) limit 100;

Both take the 100 records after the first 90,000, but the second statement is faster. The first statement scans the first 90,001 records, takes the largest ID value as the starting mark, and then uses it to quickly locate the next 100 records; the second takes only the single record after the first 90,000 and uses its ID value as the starting mark to locate the 100 records. The second statement can be simplified as follows:

Select * From cyclopedia Where ID >= (Select ID From (Select ID From cyclopedia Order By ID limit 90000,1) As tmp) limit 100

This omits the MAX operation; in general, ID is monotonically increasing, so the single row at offset 90,000 already holds the starting ID.
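Under that assumption of a monotonically increasing ID, the MAX(ID) form and the offset form pick the same starting row, so the pages match. A small sketch can confirm this; the cyclopedia table is invented and scaled down to 1,000 rows, with SQLite standing in for MySQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cyclopedia (ID INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO cyclopedia VALUES (?, ?)",
                 [(i, f"entry{i}") for i in range(1, 1001)])

# Form 1: MAX(ID) over the first 901 rows gives the starting mark.
via_max = conn.execute(
    "SELECT ID FROM cyclopedia WHERE ID >= "
    "(SELECT MAX(ID) FROM (SELECT ID FROM cyclopedia ORDER BY ID LIMIT 901) AS tmp) "
    "ORDER BY ID LIMIT 100").fetchall()

# Form 2: skip 900 rows and take the single next ID as the starting mark.
via_offset = conn.execute(
    "SELECT ID FROM cyclopedia WHERE ID >= "
    "(SELECT ID FROM cyclopedia ORDER BY ID LIMIT 900, 1) "
    "ORDER BY ID LIMIT 100").fetchall()

print(via_max == via_offset)  # same 100-row page either way
```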

Paging data performance optimization:

1. For tables with a large amount of data, build an index table containing only the primary key and the indexed fields; query the index table for the matching primary keys, and then use those keys to fetch the rows from the full data table.

2. If you have a WHERE condition and also want to use LIMIT, you must design an index that puts the WHERE columns first and the primary key used by LIMIT second, and select only the primary key! This improves the read speed.

3. Using IN: first obtain the matching primary key values through the WHERE condition, then use those primary key values to query the remaining field values.
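Point 3 can be sketched as a two-step query. The posts table, its category column, and the page position are all illustrative, with SQLite again standing in for MySQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, category TEXT, body TEXT)")
# Composite index so step 1 can be satisfied from the index alone.
conn.execute("CREATE INDEX idx_cat ON posts (category, id)")
conn.executemany("INSERT INTO posts VALUES (?, ?, ?)",
                 [(i, "news" if i % 2 else "misc", f"body{i}")
                  for i in range(1, 101)])

# Step 1: fetch only the primary keys for the requested page (index-driven).
ids = [r[0] for r in conn.execute(
    "SELECT id FROM posts WHERE category = ? ORDER BY id LIMIT 10, 5",
    ("news",))]

# Step 2: fetch the full rows for just those keys with IN.
marks = ",".join("?" * len(ids))
page = conn.execute(
    f"SELECT id, body FROM posts WHERE id IN ({marks}) ORDER BY id",
    ids).fetchall()
print(ids)
```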

Paging using cursor:

To get the best query performance out of MySQL, I changed the paging query to a cursor-based query:

Select * from table where id > last_id order by id ASC limit 20

The last_id above is the id of the last record on the current page; in the same way you can implement both the "next page" query and the "previous page" query (the latter with id < first_id and descending order).

Cursor paging is only suitable for sequentially ordered data and does not support jumping to an arbitrary page. We can add an auto-increment ID or another ordered field to the table. For very large data sets, deep paging is of limited use anyway; filter conditions can achieve the goal of finding data instead.
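The cursor technique can be sketched as a helper that takes the last seen id, so that no skipped rows are ever scanned. Table and column names are illustrative, and SQLite stands in for MySQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [(i, f"v{i}") for i in range(1, 51)])

def next_page(last_id, page_size=20):
    """Return the page of rows with id greater than last_id, in ascending order.

    Unlike OFFSET paging, the index seeks straight to last_id; the skipped
    rows are never read.
    """
    return conn.execute(
        "SELECT id, val FROM t WHERE id > ? ORDER BY id ASC LIMIT ?",
        (last_id, page_size)).fetchall()

page1 = next_page(0)               # first page: ids 1..20
page2 = next_page(page1[-1][0])    # next page starts after the last seen id
print(page2[0][0])  # 21
```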

Summary

That is the whole content of this article. I hope it offers some reference and learning value for your study or work. Thank you for your support.

