This article explains why querying a large amount of MySQL data from PHP can run out of memory, and how to avoid it. The specific analysis is as follows:
1. The problem:
When PHP queries a large amount of data from MySQL, the script aborts before it finishes with the following error:
Fatal error: Allowed memory size of 100663296 bytes exhausted (tried to allocate 100663296 bytes)
In other words, the 100 MB of memory allocated to PHP has been exhausted.
2. Solution:
The simplest fix is to add the following to the top of the script being executed:
ini_set('memory_limit', '256M');
This raises the memory available to PHP to 256 MB, or more if needed.
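As a minimal sketch of where the call goes (the 256M value is just the figure used above; adjust it to your environment):

<?php
// Raise the per-script memory limit before any large result set is read.
ini_set('memory_limit', '256M');

// Optionally confirm that the new limit took effect.
echo ini_get('memory_limit') . "\n"; // prints "256M"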
But this does not scale: the next time you need to read even more data, you cannot keep raising the limit again and again until PHP exhausts the server's memory.
A useful function here is memory_get_usage(), which returns the amount of memory PHP is currently using. Watching it while the data is read shows that, as more and more rows come in, the memory used by PHP climbs step by step.
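For illustration, the growth can be observed with a loop like the sketch below. It uses the legacy mysql_* API that the rest of this article discusses; the connection details and the table name big_table are placeholders.

<?php
// Hypothetical example: watch PHP's memory usage while reading a large,
// buffered result set row by row.
$link = mysql_connect('localhost', 'user', 'password'); // placeholder credentials
mysql_select_db('test', $link);                         // placeholder database

$result = mysql_query('SELECT * FROM big_table', $link); // buffered query
$i = 0;
while ($row = mysql_fetch_assoc($result)) {
    if (++$i % 10000 === 0) {
        // The article observes that the reported usage keeps climbing
        // as more and more data is read.
        echo $i . ' rows read, ' . memory_get_usage() . " bytes in use\n";
    }
}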
Does PHP keep the queried MySQL data in memory? After some searching, that turned out to be roughly what happens.
MySQL's C API provides two functions for retrieving results: mysql_store_result() and mysql_use_result(). mysql_store_result() reads the entire result set from the MySQL server into the client, while mysql_use_result() only initiates the retrieval and reads the result set's metadata, leaving the rows to be fetched on demand.
1. PHP's mysql_query() calls mysql_store_result(), which automatically fetches and buffers the entire result set.
2. PHP's mysql_unbuffered_query() calls mysql_use_result() instead. On the one hand this saves considerable memory when dealing with large result sets; on the other hand, you can start working on the result set as soon as the first row is fetched, without waiting for the entire SQL query to complete.
So when reading a large amount of data, mysql_unbuffered_query() can be used instead of mysql_query(). Testing confirms this, and the effect is striking: after fetching all of the data, memory usage stayed under 1 MB and never grew.
mysql_unbuffered_query() sends the SQL query to MySQL without automatically fetching and buffering the result set the way mysql_query() does. When using multiple database connections, the optional link_identifier parameter must be specified.
The benefits of mysql_unbuffered_query() come at a cost: mysql_num_rows() and mysql_data_seek() cannot be used on the result set it returns. In addition, all rows of an unbuffered query must be fetched before a new SQL query can be sent over the same connection.
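Putting this together, the swap might look like the sketch below; again the connection details and the table name big_table are placeholders, and the legacy mysql_* API is used to match the functions discussed above.

<?php
// Hypothetical example: stream a large result set with an unbuffered query.
$link = mysql_connect('localhost', 'user', 'password'); // placeholder credentials
mysql_select_db('test', $link);                         // placeholder database

// mysql_unbuffered_query() maps to mysql_use_result(): rows are streamed
// from the server instead of being buffered on the client, so memory use
// stays flat regardless of the size of the result set.
$result = mysql_unbuffered_query('SELECT * FROM big_table', $link);

while ($row = mysql_fetch_assoc($result)) {
    // Process each row as it arrives. mysql_num_rows() and mysql_data_seek()
    // are unavailable here, and every row must be fetched before another
    // query is sent over $link.
}
mysql_free_result($result);

On PHP versions where the mysql_* extension has been removed, the same behaviour is available through mysqli_query($link, $sql, MYSQLI_USE_RESULT) or by setting PDO::MYSQL_ATTR_USE_BUFFERED_QUERY to false.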
So choose between the two functions according to the needs of your own application.
Thank you for reading! That concludes this look at how querying a large amount of MySQL data from PHP can run out of memory. Hopefully the above is of some help; if you found the article useful, feel free to share it with others.