Today I will talk about how to configure caching for a uwsgi service in Nginx. Many people may not know much about this topic, so I have summarized the following to help you understand it better; I hope you get something out of this article.
Why is the cache set at the nginx layer
Caching is necessary; here is why some requests are cached at the nginx layer rather than in the service application layer. The reasons are as follows:
Generally speaking, nginx acts as a proxy server, and setting the cache in nginx saves the time of forwarding the request to the back end whenever there is a cache hit.
Nginx itself is implemented in C and performs better than most application languages, especially dynamic languages such as Python, so nginx reads the cache faster than the back-end application does.
For legacy projects that never introduced caching, adding a cache configuration in nginx is quicker than writing application code.
Of course, interfaces that require the application to check a user's permissions are not suitable for caching at the nginx layer.
Configuration method
Nginx can proxy many service protocols (such as http, uwsgi, and fastcgi). Here we take the uwsgi protocol as an example; for other protocols, you essentially replace the uwsgi prefix of the cache directives with the prefix for that protocol (for example, http corresponds to proxy, fastcgi corresponds to fastcgi). Below is a basic configuration file, followed by a description of each key cache directive, its configuration method, and its function.
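A minimal sketch of such a configuration, assembled from the directives discussed below, might look like this; the listen port, server_name, and the upstream address 127.0.0.1:8000 are placeholders, and the events block and other global settings are omitted:

http {
    # cache files are stored under /tmp/nginx; the key zone is named "myapp" and is 128m;
    # entries that are not hit for 24 hours are removed
    uwsgi_cache_path /tmp/nginx levels=1:2 keys_zone=myapp:128m inactive=24h;

    server {
        listen 80;
        server_name example.com;            # placeholder

        location / {
            include uwsgi_params;
            uwsgi_pass 127.0.0.1:8000;      # placeholder upstream address

            uwsgi_cache myapp;                                  # use the "myapp" cache zone
            uwsgi_cache_key $request_method$request_uri$args;   # key: method + uri + args
            uwsgi_cache_valid 200 36h;                          # cache 200 responses for 36 hours
            uwsgi_cache_use_stale timeout http_500 http_503;    # serve stale content on these errors
            add_header Nginx-Cache "$upstream_cache_status";    # debugging only
        }
    }
}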
In the configuration above, the directives whose names contain "cache" are the cache-related directives. Their functions and configuration methods are as follows:
1. uwsgi_cache_path configures the location, naming, and directory layout of the cached content. /tmp/nginx is where the cache files are stored. The value of keys_zone gives the name and size of the cache key zone; here the name is myapp and the size is 128m. The levels parameter controls the subdirectories the cache files are placed in: 1:2 means the last character of the MD5 of the cache key (uwsgi_cache_key) becomes the first-level directory and the two characters before it become the second-level directory. With the configuration above, a cache entry whose key has the MD5 value 4897858cede04cdd6676d87fd9e9163e will land in the /tmp/nginx/e/63 directory. The inactive parameter controls how long unhit cache content is kept; here, an entry that is not hit within 24 hours is deleted. Besides the parameters mentioned here there are other optional parameters, as described in the official documentation.
2. add_header Nginx-Cache "$upstream_cache_status": it is not recommended to keep this line in a production environment; it adds a response header that reports the cache status, which is useful for debugging.
3. uwsgi_cache_valid 200 36h: this directive configures which response status codes are cached and for how long. Here a response is cached only when its status code is 200, and it is cached for 36 hours.
4. uwsgi_cache_key $request_method_$request_uri$args: sets the cache key, here the request method + request URI + query arguments; other variables provided by nginx can be used as needed.
5. uwsgi_cache_use_stale timeout http_500 http_503: configures when expired (stale) cached content may still be served instead of the back-end response; here, when communication with the back end times out or the back end returns a 500 or 503 error.
6. uwsgi_cache myapp: sets the name of the cache zone to use, corresponding to the keys_zone in uwsgi_cache_path. The contexts supported by this directive are http, server, and location, so different zones can be configured for different interfaces by writing it inside location blocks, as in the sketch below.
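As a sketch of the per-location case (the zone names, paths, and upstream address below are hypothetical, and everything goes inside the http block), two interfaces could use separate cache zones like this:

# inside the http { } block
uwsgi_cache_path /tmp/nginx/news  levels=1:2 keys_zone=newscache:64m inactive=24h;
uwsgi_cache_path /tmp/nginx/pages levels=1:2 keys_zone=pagecache:64m inactive=72h;

server {
    listen 80;

    location /news/ {
        include uwsgi_params;
        uwsgi_pass 127.0.0.1:8000;          # placeholder upstream
        uwsgi_cache newscache;              # frequently changing content: short cache
        uwsgi_cache_key $request_method$request_uri$args;
        uwsgi_cache_valid 200 10m;
    }

    location /pages/ {
        include uwsgi_params;
        uwsgi_pass 127.0.0.1:8000;          # placeholder upstream
        uwsgi_cache pagecache;              # rarely changing content: long cache
        uwsgi_cache_key $request_method$request_uri$args;
        uwsgi_cache_valid 200 36h;
    }
}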
Points to note
Having covered the basic configuration, let's move on to some points worth noting.
The uwsgi_cache_key can be built from nginx variables to handle all sorts of complex situations, but it is best not to make it more complex than the business requires, because the more complex the key is, the lower the hit rate. If, say, a user's cookie is added to the key, the cache becomes per-user: there is a hit only when the same user issues the same request again, so the hit rate is very low.
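For example, if a session cookie were made part of the key (the cookie name sessionid below is hypothetical), each user would get separate cache entries and the hit rate would drop sharply:

uwsgi_cache_key $cookie_sessionid$request_method$request_uri$args;   # per-user cache key: low hit rate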
The uwsgi_cache_valid directive must be configured carefully, otherwise responses such as 400 errors may end up being cached. Besides the 200 status code, responses with other status codes can also be cached as needed, but their caching time should be shorter, for example caching 302 responses for 10s.
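For instance, to keep caching 200 responses for a long time while caching 302 redirects only briefly, the directives could be combined like this:

uwsgi_cache_valid 200 36h;   # successful responses: cache for 36 hours
uwsgi_cache_valid 302 10s;   # redirects: cache for only 10 seconds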
After reading the above, do you have a better understanding of how to configure uwsgi caching in Nginx? If you want to learn more, please follow the industry information channel. Thank you for your support.