How to use the leaky bucket algorithm for rate limiting in ASP.NET Core

This article introduces how to use the leaky bucket algorithm for rate limiting in ASP.NET Core. It should be a useful reference; interested readers are encouraged to follow along, and I hope you gain a lot from it.

The leaky bucket algorithm is one of the four mainstream rate-limiting algorithms. Most write-ups say little about its application scenarios beyond noting that it is generally used for network traffic control. Here are two examples:

1. Home broadband is sold with a fixed bandwidth cap, such as 100 Mbps or 200 Mbps, and one building has many users. How does the operator make sure that a few users do not hog the bandwidth and affect everyone else? A leaky bucket algorithm can be used to cap the bandwidth each user gets, although the real implementation is much more complicated.

2. There is a legacy interface that was written without any protection. Nowadays it falls over as soon as traffic rises a little, yet nobody can change its code. Here, too, you can wrap the interface with a leaky bucket: external requests are smoothed out by the bucket before being forwarded to the interface, so the access rate never exceeds the threshold and the interface no longer crashes.

Algorithm principle

Having said all that, how exactly does the leaky bucket algorithm solve these problems?

After a request is received, it is first put into a leaky bucket. The bucket leaks requests out at a constant rate, and only the leaked requests are processed. If requests arrive so fast that the bucket fills up, new requests are discarded.

As you can see, the essence of the leaky bucket algorithm is constant-rate output, which gives the downstream processing a steady input. It can absorb a certain amount of burst traffic, so the system does not collapse under a sudden spike in requests, but it does this by adding delay, and a little capacity may be wasted. This is where it differs from the token bucket; for that algorithm, see the earlier article on using token buckets for rate limiting in ASP.NET Core.

Another benefit that is rarely mentioned is that constant-rate output can sometimes improve efficiency. For example, if two requests are allowed to leak out at a time, their processing can be merged into one; when each operation involves network IO, merging gives you a chance to reduce that IO overhead.

Algorithm implementation

Two implementations are covered here: an in-process, in-memory leaky bucket and a Redis-based leaky bucket.

In-process memory leaky bucket algorithm

In this implementation the amount leaked is calculated at the time of each request; there is no separate leak loop. That makes the description a little more involved, but with a bit of patience it is easy to follow.

Let's first define a few variables:

The leak rate, expressed as Y per time period X, where X is usually a span of seconds, minutes, or hours.

The start of the current time period is Ts, its end is Te, and the current time is Ti.

The leaky bucket capacity is Z.

The number of requests currently held in the bucket is N.

When a request arrives, it is handled in the following order:

If Ti - Ts > X, a new time period needs to be started:

Number of time periods that have elapsed: Pn = ceiling((Ti - Te) / X)

Reset Ts and Te: Ts = previous Ts + Pn * X, Te = Ts + X

Maximum amount that could have leaked in that time: Yo = Pn * Y

Update N: N = max(N - Yo, 0)

Finally, check whether the bucket can take the request: if N + 1 > Y + Z (this period's leak plus the bucket capacity), the request is rate limited; otherwise N = N + 1 and the request is accepted, although its processing may be delayed to keep the output rate constant. This matches the example further below, where a capacity of 20 and a leak of 10 per second allow at most 30 requests per second.
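To make these steps concrete, here is a minimal, illustrative C# sketch of the lazy-leak calculation. The class and member names are invented for this example and it is not the FireflySoft.RateLimit implementation; the acceptance check uses Y + Z as described above.

```csharp
using System;

// Illustrative only: a minimal in-process leaky bucket that "leaks" lazily when a
// request arrives, following the steps above.
public class LeakyBucketLimiter
{
    private readonly int _capacity;        // Z: bucket capacity
    private readonly int _leakAmount;      // Y: amount leaked per time period
    private readonly TimeSpan _period;     // X: length of one time period
    private DateTimeOffset _periodStart;   // Ts
    private DateTimeOffset _periodEnd;     // Te
    private int _count;                    // N: requests currently in the bucket
    private readonly object _sync = new object();

    public LeakyBucketLimiter(int capacity, int leakAmount, TimeSpan period)
    {
        _capacity = capacity;
        _leakAmount = leakAmount;
        _period = period;
        _periodStart = DateTimeOffset.UtcNow;
        _periodEnd = _periodStart + period;
    }

    // Returns true if the request is accepted, false if it is rate limited.
    public bool TryAcquire()
    {
        lock (_sync)
        {
            var now = DateTimeOffset.UtcNow;                  // Ti

            if (now - _periodStart > _period)                 // Ti - Ts > X
            {
                // Pn = ceiling((Ti - Te) / X): whole periods that have elapsed
                var pn = (int)Math.Ceiling(
                    (now - _periodEnd).TotalMilliseconds / _period.TotalMilliseconds);
                if (pn < 1) pn = 1;

                // Ts = previous Ts + Pn * X, Te = Ts + X
                _periodStart += TimeSpan.FromTicks(_period.Ticks * pn);
                _periodEnd = _periodStart + _period;

                // Yo = Pn * Y, then N = max(N - Yo, 0)
                _count = Math.Max(_count - pn * _leakAmount, 0);
            }

            // Accept while this period's leak plus the bucket capacity is not exceeded
            if (_count + 1 > _leakAmount + _capacity)
            {
                return false;                                 // rate limited
            }

            _count++;
            return true;                                      // accepted (may be delayed)
        }
    }
}
```

With a capacity of 20, a leak of 10, and a 1-second period, this sketch accepts at most 30 requests in any one second, which is the behavior described for the library further below.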

You need to register the service first and then add the middleware; a sketch follows below.

When registering the service, you need to provide a rate-limiting algorithm and the corresponding rules:
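Here is a sketch of the Startup wiring based on the parameter descriptions that follow; the LeakyBucketRule argument order and the exact namespaces are assumptions, so check the project's README for the precise overloads.

```csharp
// Usings omitted: Microsoft.AspNetCore.* plus the namespaces from the
// FireflySoft.RateLimit packages (see the README for the exact names).
public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers();

    services.AddRateLimit(new InProcessLeakyBucketAlgorithm(
        new[]
        {
            // Capacity 20, leak 10 per 1 second (argument order assumed)
            new LeakyBucketRule(20, 10, TimeSpan.FromSeconds(1))
            {
                // Use the request path as the rate-limit target
                ExtractTarget = context =>
                {
                    return (context as HttpContext).Request.Path.Value;
                },
                // Decide whether the current request should be rate limited at all
                CheckRuleMatching = context =>
                {
                    return true;
                },
                Name = "leaky bucket limit rule",
            }
        }
    ));
}

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    // Register the rate-limit middleware before the endpoints it protects
    app.UseRateLimit();

    app.UseRouting();
    app.UseEndpoints(endpoints => endpoints.MapControllers());
}
```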

The in-process algorithm InProcessLeakyBucketAlgorithm is used here; you can also use RedisLeakyBucketAlgorithm, which additionally requires passing in a Redis connection. Both algorithms support synchronous and asynchronous methods.
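For the Redis variant, a rough sketch of the wiring is shown below; leakyBucketRules stands for the same rule array used above, and the assumption that the constructor takes the rules plus a StackExchange.Redis connection should be verified against the README.

```csharp
// Sketch only: the Redis-backed algorithm shares the rules and adds a Redis connection
// so that counting is unified across instances. Constructor signature assumed.
var redis = StackExchange.Redis.ConnectionMultiplexer.Connect("127.0.0.1:6379");
services.AddRateLimit(new RedisLeakyBucketAlgorithm(leakyBucketRules, redis));
```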

The leaky bucket capacity is 20, the amount leaked per unit time is 10, and the unit time is 1 second. In other words, 10 requests leak out each second; if more than 10 requests arrive within a second, the extra ones are delayed, and with the bucket capacity added, anything beyond 30 requests in one second is rejected.

ExtractTarget extracts the rate-limit target, which here is the request Path, so each distinct path is limited separately. You can extract whatever key data you need from the current request to define various targets. If extraction involves IO, the asynchronous counterpart ExtractTargetAsync is also supported.

CheckRuleMatching checks whether the current request should be rate limited at all. It also receives the current request, so the key data for the check is easy to extract. If the check involves IO, the asynchronous counterpart CheckRuleMatchingAsync is also supported.

By default, HTTP status code 429 is returned when a request is rate limited. This can be customized with the optional error parameter of AddRateLimit, including the HTTP headers and body of the response.

That covers the basic usage shown in the example above.

If you are still on the traditional .NET Framework, you instead register a message handler, RateLimitHandler, in Application_Start; the algorithms and rules are shared with the examples above. For details, see the instructions on GitHub: https://github.com/bosima/FireflySoft.RateLimit.
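As a rough sketch only, that registration in a classic ASP.NET Web API project might look like the following; the exact RateLimitHandler constructor and namespace should be taken from the GitHub instructions, and passing the algorithm in directly is an assumption.

```csharp
// Global.asax.cs (sketch). leakyBucketRules is the same rule array as above.
protected void Application_Start()
{
    GlobalConfiguration.Configure(WebApiConfig.Register);

    // Register the rate-limit message handler with the shared algorithm and rules
    GlobalConfiguration.Configuration.MessageHandlers.Add(
        new RateLimitHandler(new InProcessLeakyBucketAlgorithm(leakyBucketRules)));
}
```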

FireflySoft.RateLimit is a rate-limiting class library based on .NET Standard. Its core is simple and lightweight, and it can flexibly handle rate-limiting scenarios with all kinds of requirements.

Its main features include:

A variety of rate-limiting algorithms: fixed window, sliding window, leaky bucket, and token bucket are built in, and custom algorithms can be added.

A variety of count stores: memory and Redis are currently supported.

Distributed friendly: Redis storage gives distributed programs a unified count.

Flexible rate-limit targets: any kind of data can be extracted from the request to define the target.

Rate-limit penalties: a client that triggers the limit can be locked out for a period of time.

Dynamic rule changes: rate-limiting rules can be changed while the program is running.

Custom errors: the error code and error message returned after the limit is triggered can be customized.

Universality: in principle it can cover any scenario that needs rate limiting.

Thank you for reading. I hope this article on using the leaky bucket algorithm for rate limiting in ASP.NET Core has been helpful.
