How to configure load balancing of nginx on Centos7


This article walks through how to configure nginx load balancing on CentOS 7, with a step-by-step analysis and solution that should help anyone looking for a simple way to set it up.

Configuring nginx load balancing on CentOS 7: preface

Before configuring nginx load balancing, we need to understand a few concepts.

Note: if you forget the tomcat and nginx start/stop commands, refer to the commands listed at the end of this article.

I. Understanding the important concepts

1 What is nginx?

Nginx is a web server and reverse proxy server for HTTP, HTTPS, SMTP, POP3 and IMAP protocols.

2 What is a reverse proxy?

A reverse proxy hides the real servers. When we request www.baidu.com it is like dialing 10086: there may be thousands of servers behind it, but you do not know, and do not need to know, which one answers. You only need to know who the reverse proxy server is. www.baidu.com is the reverse proxy server, and it forwards our request to a real server. Nginx is a reverse proxy server with very good performance and is commonly used for load balancing.
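As a minimal sketch of the idea (not the exact configuration built later in this article), a reverse proxy in nginx is essentially a proxy_pass directive; the server name and the address 127.0.0.1:8080 below are placeholders for whichever real server sits behind the proxy:

server {
    listen 80;
    server_name example.com;              # the address clients actually talk to

    location / {
        # hand every request to the hidden "real" server
        proxy_pass http://127.0.0.1:8080;
    }
}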

3 What is load balancing?

Load balancing is a way of distributing work across multiple back-end server processes. Dispatching an HTTP request to one of several actual web servers is one example; an HTTP request passes through several stages on its way to a web server, and there are many different ways to balance the load.

4 What does a load balancer do?

Request forwarding

Client requests are forwarded to different application servers according to some algorithm (weight, round robin), reducing the pressure on any single server and improving the system's concurrency.

Failure removal

Heartbeat checks determine whether each application server is currently able to work; if a server goes down, requests are automatically sent to the other application servers instead (see the sketch after this list).

Recovery and re-addition

If a previously failed application server is detected to be healthy again, it is automatically added back to the pool that handles user requests.
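For reference, nginx covers the failure-removal and recovery behaviour with passive checks on the upstream servers; max_fails and fail_timeout are standard upstream parameters, though the addresses and values below are only illustrative:

upstream backend {
    # after 3 failed attempts the server is taken out of rotation,
    # and it is tried again after 30 seconds (removal, then re-addition)
    server 192.168.220.111:8081 max_fails=3 fail_timeout=30s;
    server 192.168.220.111:8082 max_fails=3 fail_timeout=30s;
}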

5 What distribution strategies does load balancing support?

Nginx upstream currently supports the following allocation algorithms (a combined sketch follows the list):

1) Round robin: requests are handled in turn, roughly 1:1 (the default)

Each request is assigned to a different application server in order of arrival. If an application server goes down it is automatically removed, and polling continues over the rest.

2) Weight: the higher the weight, the larger the share of requests

Specify the polling probability by configuring weights proportional to the share of traffic each server should receive; this is useful when application server performance is uneven.

3) ip_hash algorithm

Each request is assigned according to the hash of the client IP, so each visitor always reaches the same application server; this can solve the session-sharing problem.
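Here is a combined sketch of the three strategies in upstream form, reusing the addresses from later in this article; weight and ip_hash are standard nginx directives, and in practice you would keep only one of these blocks:

# 1) round robin: an upstream block with no extra directives (the default)
upstream tomcat_roundrobin {
    server 192.168.220.111:8081;
    server 192.168.220.111:8082;
}

# 2) weight: :8081 receives roughly twice as many requests as :8082
upstream tomcat_weighted {
    server 192.168.220.111:8081 weight=2;
    server 192.168.220.111:8082 weight=1;
}

# 3) ip_hash: the same client IP always lands on the same server
upstream tomcat_iphash {
    ip_hash;
    server 192.168.220.111:8081;
    server 192.168.220.111:8082;
}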

II. Configuring load balancing

This walkthrough does not use multiple machines: we install three tomcats on one machine, in different directories, give them different ports, and modify their home pages so we can tell them apart, which is enough to simulate load balancing.

First: Preparation

One nginx server, three tomcat servers

I have written about installing nginx before; refer to that article if needed.

Tomcat is even simpler: upload and extract it, and you can access the server's IP in a browser straight away. If the page does not open, try turning off the firewall.
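A minimal sketch of that step on CentOS 7 (the archive name is only an example, and stopping firewalld is fine on a test machine but not something to do in production):

# extract tomcat into the target directory (archive name is illustrative)
mkdir -p /opt/tomcat
tar -zxvf apache-tomcat-7.0.57.tar.gz -C /opt/tomcat/

# CentOS 7 uses firewalld; stop it if the tomcat page will not open
systemctl stop firewalld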

My installation directory

nginx /opt/nginx/nginx-1.8.0

tomcat /opt/tomcat/apache-tomcat-7.0.57

tomcat (test1) /opt/tomcattest1/apache-tomcat-7.0.57

tomcat (test2) /opt/tomcattest2/apache-tomcat

We will load-balance across tomcat test1 and tomcat test2.

Second: Modify the configuration files and home pages of tomcat No. 1 and No. 2

No. 1 tomcat

Note: before modifying a tomcat configuration file, shut tomcat down first if it is running, then make the change.

vi /opt/tomcattest1/apache-tomcat-7.0.57/conf/server.xml

Change three places: add 1 to each of the three uncommented ports (a server.xml sketch follows the list).

Note: change the uncommented ports; editing the commented-out ones has no effect.

8005->8006

8080->8081

8009->8010
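Roughly what those three lines look like in server.xml for tomcat No. 1 after the change; the three elements sit at different places in the default file, and only the port attributes are touched:

<!-- shutdown port: 8005 -> 8006 -->
<Server port="8006" shutdown="SHUTDOWN">

<!-- HTTP connector: 8080 -> 8081 -->
<Connector port="8081" protocol="HTTP/1.1"
           connectionTimeout="20000"
           redirectPort="8443" />

<!-- AJP connector: 8009 -> 8010 -->
<Connector port="8010" protocol="AJP/1.3" redirectPort="8443" />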

Then modify the home page: cd /opt/tomcattest1/apache-tomcat-7.0.57/webapps/ROOT and vi index.jsp (a sketch follows), then start tomcat.
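The article does not say what to put in index.jsp; anything that identifies the instance will do. An assumed minimal example that replaces the default page:

<%@ page contentType="text/html;charset=UTF-8" %>
<html>
  <body>
    <!-- tells us which tomcat instance answered the request -->
    <h1>Tomcat No. 1 (port 8081)</h1>
  </body>
</html>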

Access it in a browser:

http://192.168.220.111:8081 (use your own IP)

No. 2 tomcat

As above, but add 2 to each of the three ports instead:

Note: change the uncommented ports; editing the commented-out ones has no effect.

8005->8007

8080->8082

8009->8011

Access it in a browser:

http://192.168.220.111:8082 (use your own IP)

Third: Configure load balancing

Note: before modifying the nginx configuration file, shut nginx down first if it is running, then make the change.

Modify the nginx configuration file:

vi /usr/local/nginx/conf/nginx.conf

Press Shift+G to jump to the bottom, and add the following before the final }:

upstream tomcatserver1 {
    server 192.168.220.111:8081;
    server 192.168.220.111:8082;
}

server {
    listen       80;
    server_name  love.com;

    #charset koi8-r;
    #access_log  logs/host.access.log  main;

    location / {
        proxy_pass   http://tomcatserver1;
        index  index.html index.htm;
    }
}

Explanation

One upstream block corresponds to one server block.

When we enter love.com (pick whatever domain name you like) in the browser, nginx first finds http://tomcatserver1 in the proxy_pass directive inside location,

then looks up the upstream block with the same name, tomcatserver1, and picks one of the proxied server addresses from it.

At this point we also need to map the domain to our IP in the local hosts file so it can be resolved in the browser.

Under Windows, go to the C:\Windows\System32\drivers\etc directory, open the hosts file, and add at the bottom:

192.168.220.111 love.com (use your own IP and the domain name you configured)

Then start nginx

Enter http://192.168.220.111 in your browser

Keep refreshing the page and you will see tomcat No. 1 and tomcat No. 2 appear almost alternately, at roughly 1:1. This is the round-robin strategy, the first of the three distribution strategies described at the beginning.

Extension:

In reality our servers may not all have the same performance, and we may want the more powerful No. 1 server to proxy a larger share of the requests. That is what the weight (or ip_hash) configuration is for, as sketched below.
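A sketch of that weighted configuration, reusing the upstream block from above (the 3:1 ratio is only an example):

upstream tomcatserver1 {
    # No. 1 tomcat handles roughly three requests for every one sent to No. 2
    server 192.168.220.111:8081 weight=3;
    server 192.168.220.111:8082 weight=1;
}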

----------------------------

Here are a few commands that might be used

Tomcat Common Commands

Start Tomcat: go to the tomcat directory/bin, then ./startup.sh

Stop Tomcat: go to the tomcat directory/bin, then ./shutdown.sh

Nginx common commands

Note that the nginx configuration file (including the port settings) is in /usr/local/nginx/conf, not in the directory you extracted the source into.

Start nginx: cd /usr/local/nginx/sbin/ then ./nginx

Restart nginx, two ways:

1. Shut it down and start it again: ./nginx -s quit (or ./nginx -s stop), then ./nginx

2. Restart directly: ./nginx -s reload

Check whether nginx is installed and where nginx.conf lives: find / | grep nginx.conf

View the nginx processes: ps -ef | grep nginx; if one needs to be killed, kill -9 <pid>

That is all there is to share on how to configure nginx load balancing on CentOS 7. I hope the content above helps; if you still have unanswered questions, you can follow the industry information channel to learn more.
