This article explains how to deal with a Linux server that grinds to a halt because php-cgi is consuming 100% of its memory. The steps below come from a real troubleshooting case and should help you handle similar situations.
Website log directory: /home/hosts_log (each log file is named with the corresponding website ID)
php-cgi log directory: /usr/local/php_fcgi/logs, which contains php-fpm.log and slow.log
Checking these log files usually reveals the cause. In this case the site had been hacked and a PHP DDoS script had been uploaded. If that is not the problem, check the site's own code: for example, calling file_get_contents() on remote URLs can also push the CPU to 100%.
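The slow log is particularly useful because php-fpm writes a stack trace for every request that runs longer than a configured threshold, which tells you exactly which function is hanging. A minimal sketch of the relevant directives, assuming the log path listed above and the ini-style php-fpm.conf of newer releases (the 5s threshold is only an example):

request_slowlog_timeout = 5s
slowlog = /usr/local/php_fcgi/logs/slow.log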
php.ini has a max_execution_time parameter that sets the maximum execution time of a PHP script, but under php-cgi (php-fpm) it does not reliably stop a stuck script. What actually limits how long a PHP script may run is the following parameter in the php-fpm.conf configuration file:
The timeout (in seconds) for serving a single request, after which the worker process will be terminated.
Should be used when the 'max_execution_time' ini option does not stop script execution for some reason.
'0s' means 'off'.
<value name="request_terminate_timeout">0s</value>
The default value is 0s, which means a PHP script is allowed to run forever. So when all php-cgi processes get stuck in the file_get_contents() function, this Nginx+PHP WebServer can no longer handle new PHP requests, and Nginx returns "502 Bad Gateway" to users. Changing this parameter to set a maximum execution time for PHP scripts helps, but it treats the symptom rather than the cause. For example, with <value name="request_terminate_timeout">30s</value>, if file_get_contents() is slow to fetch the remote content, each of the 150 php-cgi processes completes at most one request every 30 seconds, so the pool can serve only about 5 requests per second in total, and the WebServer still has a hard time avoiding "502 Bad Gateway".
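For reference, php-fpm versions bundled with PHP 5.3.3 and later replace the XML configuration above with an ini-style pool configuration; the equivalent setting would look roughly like this (30s is only an example value):

request_terminate_timeout = 30s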
The thorough fix is to break the habit of calling file_get_contents() directly and add a timeout, implementing the HTTP GET request as shown below. If you find this troublesome, you can wrap the code in a function; a sketch of such a wrapper follows the snippet.
"? Php
$ctx = stream_context_create (array (
'http' = "array (
'timeout' = "1 / / sets a timeout in seconds
)
)
)
File_get_contents ("http://www.111cn.net/", 0, $ctx)
? "
Of course, the CPU can also hit 100% for other reasons, for example when DedeCMS is busy generating static HTML pages.