This article explains how to configure load balancing in nginx, in the hope that it helps you in practical application. Load balancing covers a lot of ground, and the theory is well documented elsewhere; here we will draw on experience accumulated in the field to walk through a working setup.
1. Load balancing
First, what is load balancing? A load balancer is one of the most basic components of a high-availability architecture: it distributes incoming requests across different backend servers, so even if some machines stop providing service for whatever reason, the system as a whole keeps working. And because requests are spread evenly across the backends, no single server has to carry an excessive load, which gives clients a better experience.
2. Configuration example
The backend Tomcat servers all run the same application but have different IPs; in effect, the three Tomcat servers are treated as one.
They do not even need to be in the same network segment, as long as the front-end nginx proxy server can reach them.
Assuming Tomcat is running normally and can be accessed, let's look at the nginx configuration.
Configuration:
http {
    upstream testproject {
        server 192.168.8.2:8080;
        server 192.168.8.3:8080;
        server 192.168.8.4:8080 backup;
    }
    server {
        listen 80;
        server_name www.test.com 192.168.8.5;
        location / {
            proxy_pass http://testproject;
        }
    }
}
The above is a very simple load balancing configuration; nothing else is required. With just this in place, load balancing is essentially set up.
Explanation:
upstream is configured inside the http block.
upstream streamname: the name (streamname) can be whatever you like; define it yourself.
Inside the upstream block, list the servers to be load balanced; just fill in an address and port that the backend can be reached on.
Each server line in the upstream can also carry state parameters, listed below (see the sketch after this list):
down
Exclude this server from load balancing; it will not receive requests.
backup
Use this server only when all the other servers participating in load balancing are unable to provide service.
max_fails
The number of failed requests allowed before the server is considered unavailable.
fail_timeout
How long the server is taken out of service after max_fails failures.
max_conns
The maximum number of simultaneous connections to this server.
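As a minimal sketch of how these parameters are written, here is an upstream block using them; the IP addresses, ports, and threshold values are illustrative assumptions, not taken from the configuration above.
upstream testproject {
    # Taken out of rotation for 30s after 3 failed requests
    server 192.168.8.2:8080 max_fails=3 fail_timeout=30s;
    # At most 100 simultaneous connections to this member
    server 192.168.8.3:8080 max_conns=100;
    # Used only when every non-backup server above is unavailable
    server 192.168.8.4:8080 backup;
    # Temporarily excluded from load balancing (e.g. during maintenance)
    server 192.168.8.6:8080 down;
}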
Configuration in the server block:
listen 80; listen on port 80.
server_name url/ip; on a private network an IP address is enough; on the public network, configure the domain name.
proxy_pass is configured inside location and is followed by http:// plus the upstream name you defined yourself.
Actually, the configuration is very simple.
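For completeness, here is a sketch of the server block with a couple of commonly used proxy_set_header directives added so the backend sees the original Host header and client IP. These headers are not part of the configuration explained above; they are an optional addition shown only as an assumption about what you might also want.
server {
    listen 80;
    server_name www.test.com;
    location / {
        # Forward requests to the upstream group defined earlier
        proxy_pass http://testproject;
        # Optional: pass the original Host header and client IP to the backend
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}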
That concludes the explanation of how to configure load balancing in nginx; hopefully it is of help in practical application.