MySQL LIMIT pagination statement usage
Compared with Oracle and MS SQL Server, MySQL's pagination syntax is refreshingly simple.
Syntax:
SELECT * FROM table LIMIT [offset,] rows | rows OFFSET offset
Examples:
select * from table limit 5;     -- Returns the first 5 rows
select * from table limit 0,5;   -- Same as above: returns the first 5 rows
select * from table limit 5,10;  -- Returns rows 6-15
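The two spellings in the syntax line above are equivalent; the OFFSET keyword form just reads more explicitly. A minimal sketch (your_table is a placeholder name, not from the article):
-- Skip the first 100 rows, then return the next 20 (rows 101-120).
-- Equivalent to: SELECT * FROM your_table LIMIT 100, 20;
SELECT * FROM your_table LIMIT 20 OFFSET 100;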
How to optimize LIMIT
When a query's offset is very large, such as select * from table limit 10000,10, it is best not to use LIMIT directly. Instead, first obtain the id at the offset, then use LIMIT size to fetch the data; this performs much better.
For example:
SELECT * FROM customers WHERE customer_id >= (SELECT customer_id FROM customers ORDER BY customer_id LIMIT 10000, 1) LIMIT 10;
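To see why the anchored form is cheaper, it helps to compare the two execution plans side by side (a sketch; the customers table and its customer_id primary key come from the example above, and the exact plan columns depend on your MySQL version and data):
-- Naive form: the server reads and discards 10,000 rows before returning 10.
EXPLAIN SELECT * FROM customers LIMIT 10000, 10;
-- Anchored form: the subquery first locates the 10,001st customer_id,
-- then the outer query fetches only 10 full rows from that point on.
EXPLAIN SELECT * FROM customers
WHERE customer_id >= (SELECT customer_id FROM customers
                      ORDER BY customer_id LIMIT 10000, 1)
LIMIT 10;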
I. Tests
MySQL pagination commonly uses the limit start, count statement directly:
select * from product limit start, count
When the starting offset is small, the query has no performance problem. Let's look at the execution times when paging from offsets 10, 100, 1000, and 10000 (taking 20 rows per page):
select * from product limit 10, 20       0.016 s
select * from product limit 100, 20      0.016 s
select * from product limit 1000, 20     0.047 s
select * from product limit 10000, 20    0.094 s
As the starting offset grows, the query time grows with it, which shows that the performance of the LIMIT paging statement is closely tied to the starting position.
So let's move the starting record to 400,000 and see:
select * from product limit 400000, 20 3.229 seconds
Now look at the time it takes to fetch the last page:
select * from product limit 866613, 20 37.44 seconds
Paging this deep into the table is clearly intolerable.
From this we can draw two conclusions:
1. The query time of a LIMIT statement is proportional to the position of the starting record.
2. MySQL's LIMIT statement is convenient, but it is not suitable for direct use on tables with many records.
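The underlying reason is that LIMIT offset, count has no shortcut to the offset: the server still reads the skipped rows and throws them away. You can observe this with the session handler counters (a sketch against the product table used in the tests above; exact counter values will vary):
-- Reset this session's status counters.
FLUSH STATUS;
SELECT * FROM product LIMIT 400000, 20;
-- Handler_read_rnd_next counts rows read during a table scan. Expect a
-- value on the order of 400,000+, not 20, because the skipped rows are
-- still read before being discarded.
SHOW SESSION STATUS LIKE 'Handler_read%';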
II. Performance optimization methods for the LIMIT paging problem
2.1 Speed up paging queries with a covering index
As we all know, a query is fast if it touches only indexed columns (a covering index).
Index lookups use optimized search algorithms, and since the needed data lives in the index itself, there is no need to look up the row addresses, which saves a lot of time.
In addition, MySQL caches indexes, which helps even more under high concurrency.
In our example, the id field is the primary key, so it is naturally covered by the default primary key index. Let's see how a query that uses this covering index performs. This time we fetch the last page of data selecting only the id column:
select id from product limit 866613, 20
The query time is 0.2 seconds, more than 100 times faster than the 37.44 seconds it took to query all columns.
If we still want to query all columns, there are two ways to do that:
The id >= form:
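To confirm that the id-only query really is served from the index, check the Extra column of its plan (a sketch; exact output depends on the MySQL version and storage engine):
-- "Using index" in the Extra column means the result is produced from the
-- index alone (a covering index), without reading the full rows.
EXPLAIN SELECT id FROM product LIMIT 866613, 20;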
SELECT * FROM product WHERE id >= (SELECT id FROM product LIMIT 866613, 1) LIMIT 20
The query time is 0.2 seconds, a dramatic improvement.
The JOIN form:
SELECT * FROM product a JOIN (SELECT id FROM product LIMIT 866613, 20) b ON a.id = b.id
The query time is also very short.
Both approaches rely on the same principle (locate the ids first, then fetch the full rows), so their performance is similar.
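One practical caveat: neither query specifies an ORDER BY, so page boundaries are not guaranteed to be stable between executions. A minimal sketch of the JOIN form with an explicit ordering added (the ORDER BY clauses are my addition, not part of the original queries):
-- The derived table b pages through id values only; the join then pulls
-- the 20 matching full rows, in a deterministic order.
SELECT a.*
FROM product a
JOIN (SELECT id FROM product ORDER BY id LIMIT 866613, 20) b
  ON a.id = b.id
ORDER BY a.id;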
That's all for this article; I hope it helps with your study.