In this article I'd like to share some MySQL optimization techniques for LIMIT queries, with example analysis. I hope you get something out of it; let's discuss it together!
Preface
In practice, paging is a common business requirement, and it is usually implemented with a LIMIT query. LIMIT is very efficient when the data set is small or when only the first part of the data is fetched. However, when the table is large or the offset is large, for example LIMIT 100000,20, performance is often unsatisfactory. The common approach is LIMIT combined with ORDER BY; if the ORDER BY column has a usable index, performance is usually reasonably good.
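For reference, the article never shows the DDL of the student table used in the examples. A plausible sketch, inferred from the result sets and from the EXPLAIN output further down (which reports a composite index named time_sorce_name with key_len 69), might look like this; treat the exact column types and lengths as assumptions:

-- Assumed schema sketch; the original article does not show its DDL.
-- Column types are inferred from the result sets in the examples, and
-- the composite index name time_sorce_name is taken from the EXPLAIN
-- output later in the article. Illustrative only, not authoritative.
CREATE TABLE student (
  id         INT UNSIGNED NOT NULL AUTO_INCREMENT,
  first_name VARCHAR(20)  NOT NULL,
  last_name  VARCHAR(20)  NOT NULL,
  created_at DATE         NOT NULL,
  score      INT          NOT NULL,
  updated_at DATETIME     NOT NULL,
  PRIMARY KEY (id),
  KEY time_sorce_name (created_at, score, first_name)
) ENGINE=InnoDB;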
With such an index in place, the simplest optimization is to use a covering index, selecting only the columns the index contains. This works very well.
First, as a point of reference, here is a plain SELECT * with a large offset:
mysql> SELECT * FROM student LIMIT 1000000,1;
+---------+------------+------------+------------+-------+---------------------+
| id      | first_name | last_name  | created_at | score | updated_at          |
+---------+------------+------------+------------+-------+---------------------+
| 1000001 | kF9DxBgnUi | yLXnPSHJpH | 2019-07-11 |    97 | 2019-07-11 14:29:59 |
+---------+------------+------------+------------+-------+---------------------+
1 row in set (0.31 sec)
You can see how long it takes. Now compare the execution plan when only columns covered by the index are selected:
mysql> EXPLAIN SELECT score,first_name FROM student ORDER BY created_at LIMIT 1000000,20 \G
*************************** 1. row ***************************
           id: 1
  select_type: SIMPLE
        table: student
   partitions: NULL
         type: index
possible_keys: NULL
          key: time_sorce_name
      key_len: 69
          ref: NULL
         rows: 1000001
     filtered: 100.00
        Extra: Using index
1 row in set, 1 warning (0.00 sec)
In this case the selected columns are all covered by the index, so the number of scanned rows is reduced a lot, although the result is still not entirely satisfactory. Worse, as soon as the query needs a column that is not in the index, it becomes very slow again.
For example, add the last_name column to the select list, as follows:
mysql> SELECT score,first_name,last_name FROM student ORDER BY created_at LIMIT 1000000,1;
+-------+------------+------------+
| score | first_name | last_name  |
+-------+------------+------------+
|    86 | knKsV2g2fY | WB5qJeLZuk |
+-------+------------+------------+
1 row in set (4.81 sec)
This query takes more than 4 seconds to execute. The execution plan shows that it cannot use the index at all:
mysql> EXPLAIN SELECT score,first_name,last_name FROM student ORDER BY created_at LIMIT 1000000,1\G
*************************** 1. row ***************************
           id: 1
  select_type: SIMPLE
        table: student
   partitions: NULL
         type: ALL
possible_keys: NULL
          key: NULL
      key_len: NULL
          ref: NULL
         rows: 6489221
     filtered: 100.00
        Extra: Using filesort
1 row in set, 1 warning (0.00 sec)
So we now rewrite the query as follows:
mysql> SELECT student.score,student.first_name FROM student INNER JOIN (SELECT id FROM student ORDER BY created_at LIMIT 1000000,1) AS temp USING(id);
+-------+------------+
| score | first_name |
+-------+------------+
|    15 | 2QWZ       |
+-------+------------+
1 row in set (0.18 sec)

mysql> EXPLAIN SELECT student.score,student.first_name,last_name FROM student INNER JOIN (SELECT id FROM student ORDER BY created_at LIMIT 1000000,1) AS temp USING(id);
+----+-------------+------------+------------+--------+---------------+-----------------+---------+---------+---------+----------+-------------+
| id | select_type | table      | partitions | type   | possible_keys | key             | key_len | ref     | rows    | filtered | Extra       |
+----+-------------+------------+------------+--------+---------------+-----------------+---------+---------+---------+----------+-------------+
|  1 | PRIMARY     | <derived2> | NULL       | ALL    | NULL          | NULL            | NULL    | NULL    | 1000001 |   100.00 | NULL        |
|  1 | PRIMARY     | student    | NULL       | eq_ref | PRIMARY       | PRIMARY         | 4       | temp.id |       1 |   100.00 | NULL        |
|  2 | DERIVED     | student    | NULL       | index  | NULL          | time_sorce_name | 69      | NULL    | 1000001 |   100.00 | Using index |
+----+-------------+------------+------------+--------+---------------+-----------------+---------+---------+---------+----------+-------------+
3 rows in set, 1 warning (0.00 sec)
The execution plan shows that only about 1,000,001 rows are scanned this time. Why the change? This technique is called a deferred join (delayed association): the covering index is used first to fetch only the required primary keys, and those keys are then joined back to the original table to fetch the remaining columns, keeping the number of scanned rows as small as possible.
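As a general pattern, the same deferred join can fetch a whole page rather than a single row. A sketch, where the page size of 20 and the column list are purely illustrative:

-- Deferred join for a whole page: the inner query walks the covering
-- index to find the primary keys of the requested page, and the outer
-- join fetches the remaining columns only for those 20 rows.
SELECT s.score, s.first_name, s.last_name
FROM student AS s
INNER JOIN (
    SELECT id
    FROM student
    ORDER BY created_at
    LIMIT 1000000, 20
) AS temp USING (id);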
In some cases there is a further optimization. For example, when fetching the most recently inserted records, we can remember the primary key of the last record returned by the previous query (call it last_id).
The query can then be rewritten as:
SELECT score,first_name,last_name,id FROM student WHERE id>=last_id ORDER BY id ASC LIMIT 1
For example, if last_id = 1000000, the scan starts directly from id 1000000. With this approach the query performs well no matter how large the effective offset is.
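Applied page by page, the pattern might look like the sketch below. Note that it uses a strict > instead of the >= above, so the last row of the previous page is not returned again; the page size of 20 is illustrative:

-- Keyset (seek) pagination sketch: 1000000 stands in for last_id, the
-- primary key of the final row returned by the previous page. MySQL
-- can seek directly to that position in the primary key index, so the
-- cost no longer grows with the offset.
SELECT score, first_name, last_name, id
FROM student
WHERE id > 1000000
ORDER BY id ASC
LIMIT 20;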
After reading this article, I believe you have a better understanding of these MySQL LIMIT query optimization techniques. Thank you for reading!