
What are the eviction policies in the Redis cache


This article focuses on the eviction policies of the Redis cache. The material is simple and practical, so interested readers may wish to take a look and learn what eviction policies Redis offers.

We know that the Redis cache keeps its data in memory, but memory is limited. As the amount of data to be cached grows, the limited cache space will inevitably fill up. At that point, a cache eviction policy is needed to decide which data to delete.

Eviction policies of the Redis cache

Redis's eviction policies can be divided into two categories, according to whether they evict data at all:

noeviction, the only policy that does not evict data.

The seven other policies, all of which evict data.

The seven policies that do evict data can be further divided into two categories according to the scope of the eviction candidate set:

Policies that evict only among keys with an expiration time set: volatile-random, volatile-ttl, volatile-lru, and volatile-lfu (added in Redis 4.0).

Policies that evict across all keys: allkeys-lru, allkeys-random, and allkeys-lfu (added in Redis 4.0).

Before Redis 3.0 the default policy was volatile-lru; from Redis 3.0 onwards the default eviction policy is noeviction.
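
As a quick check, the sketch below reads the currently configured policy and memory limit. It assumes a local Redis instance and the redis-py client, neither of which is mentioned in the article.

    import redis

    # Assumes a local Redis instance and the redis-py client
    # (illustrative assumptions, not part of the original article).
    r = redis.Redis(host="localhost", port=6379, decode_responses=True)

    # Inspect the currently configured eviction policy and memory limit.
    print(r.config_get("maxmemory-policy"))  # e.g. {'maxmemory-policy': 'noeviction'}
    print(r.config_get("maxmemory"))         # '0' means no memory limit is set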

The noeviction policy

noeviction means that no data is evicted. When the cache is full and a new write request arrives, Redis does not evict anything; instead it rejects the write and returns an error (reads are still served).
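
A minimal sketch of this behaviour, assuming a local Redis instance, the redis-py client, and a deliberately tiny maxmemory chosen only to make the error easy to trigger:

    import redis

    r = redis.Redis(decode_responses=True)

    # Deliberately tiny, illustrative memory limit so the error is easy
    # to trigger; do not do this on a production instance.
    r.config_set("maxmemory", "1mb")
    r.config_set("maxmemory-policy", "noeviction")

    try:
        for i in range(100_000):
            r.set(f"key:{i}", "x" * 100)
    except redis.exceptions.ResponseError as e:
        # Under noeviction, Redis rejects writes with an OOM error once
        # used memory exceeds maxmemory; reads continue to work.
        print("write rejected:", e)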

Eviction policies based on expiration time

The four policies volatile-random, volatile-ttl, volatile-lru and volatile-lfu apply only to key-value pairs that have an expiration time set. When Redis memory usage reaches the maxmemory threshold, Redis evicts keys from that set according to the chosen policy (expired keys are also removed independently by the normal expiration mechanism). The selection rules are as follows (see the sketch after this list):

volatile-ttl evicts key-value pairs that have an expiration time set in order of their expiration time: the sooner a key expires, the sooner it is evicted.

volatile-random, as its name suggests, evicts keys at random from among those with an expiration time set.

volatile-lru uses the LRU algorithm to choose, among the keys with an expiration time set, which to evict.

volatile-lfu uses the LFU algorithm to choose, among the keys with an expiration time set, which to evict.
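
A minimal sketch of how the volatile-* policies treat keys with and without a TTL, again assuming a local Redis instance and redis-py; the key names, memory limit and TTLs are illustrative:

    import redis

    r = redis.Redis(decode_responses=True)

    # Illustrative configuration: evict only among keys that carry a TTL,
    # preferring the least recently used ones.
    r.config_set("maxmemory", "100mb")
    r.config_set("maxmemory-policy", "volatile-lru")

    # Keys written with ex= carry an expiration time and are therefore
    # eviction candidates under the volatile-* policies.
    r.set("session:42", "cached page", ex=600)

    # Keys written without a TTL are never candidates under volatile-*.
    r.set("config:site-name", "shulou")

    print(r.ttl("session:42"))        # remaining TTL in seconds
    print(r.ttl("config:site-name"))  # -1 means no expiration is set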

Eviction policies over all data

The three policies allkeys-lru, allkeys-random and allkeys-lfu extend the eviction scope to all key-value pairs, whether or not they have an expiration time. Their selection rules are as follows (a configuration sketch follows the list):

allkeys-random randomly selects and deletes data from all key-value pairs.

allkeys-lru uses the LRU algorithm to filter across all data.

allkeys-lfu uses the LFU algorithm to filter across all data.
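
A corresponding sketch for the allkeys-* family, under the same assumptions (local Redis, redis-py, illustrative memory limit and key names):

    import redis

    r = redis.Redis(decode_responses=True)

    # Illustrative configuration: every key is an eviction candidate, and
    # the least recently used ones are evicted first once maxmemory is
    # reached. The same can be set persistently in redis.conf with:
    #   maxmemory 100mb
    #   maxmemory-policy allkeys-lru
    r.config_set("maxmemory", "100mb")
    r.config_set("maxmemory-policy", "allkeys-lru")

    # Under the allkeys-* policies, even keys without a TTL can be evicted.
    r.set("hot:item", "frequently read value")
    print(r.config_get("maxmemory-policy"))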

About the LRU algorithm

LRU stands for Least Recently Used. A textbook LRU implementation maintains a linked list ordered by access time, and every access moves an element within the list; the more data there is, the more time this list manipulation takes, which would inevitably slow down the Redis main thread. For this reason, Redis uses a simplified, approximate LRU algorithm.

The core idea of the LRU strategy is: if a piece of data has just been accessed, it is likely to be hot data and to be accessed again.

Following this idea, the LRU policy in Redis adds an lru field to the redisObject structure of every value to record the timestamp of its most recent access. During eviction, the LRU policy evicts, from the candidate set, the data with the smallest lru value, that is, the data that has gone unaccessed for the longest time.

Therefore, in business scenarios where data is accessed frequently, the LRU policy is effective at retaining the most recently accessed data, and because that retained data is likely to be accessed again, it improves the application's access speed.

Concretely, Redis records the timestamp of the most recent access whenever a key-value pair is accessed. When Redis needs to evict data, it randomly samples N keys as a candidate set and evicts the one with the smallest timestamp. The next time data must be evicted, only keys whose timestamp is smaller than the smallest timestamp of the previous candidate set may enter the new candidate set; once the candidate set reaches maxmemory-samples entries, the key with the smallest timestamp is evicted.

The number of sampled candidates, N, can be set with the following command: CONFIG SET maxmemory-samples N
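
In redis-py, the same setting can be applied as in the sketch below; the value 10 is only an example, and the trade-off noted in the comment reflects the documented behaviour of the sampled LRU:

    import redis

    r = redis.Redis(decode_responses=True)

    # Equivalent to the redis-cli command quoted above, with an
    # illustrative N of 10. Larger values approximate exact LRU more
    # closely at the cost of more CPU per eviction; the Redis default is 5.
    r.config_set("maxmemory-samples", 10)
    print(r.config_get("maxmemory-samples"))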

Usage suggestions

Based on the characteristics of each policy, we can choose different policies for different scenarios.

When there is no clear distinction between hot and cold data in the cache, that is, when access frequency differs little between keys, it is recommended to use the allkeys-random policy and evict data at random.

When there is a clear distinction between hot and cold data, it is recommended to use allkeys-lru or volatile-lru, so that the most recently accessed data stays in the cache.

When the business has pinned data that must never be dropped, do not set an expiration time on it and use the volatile-lru policy. That way the pinned data is never evicted, while the remaining data, which does have expiration times set, is evicted according to the LRU rule. A sketch of this pattern follows.
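
A sketch of this pinned-data pattern, again assuming a local Redis instance and redis-py; the key names and TTL are made up for illustration:

    import redis

    r = redis.Redis(decode_responses=True)

    # Illustrative "pinned data" pattern: under volatile-lru, only keys
    # with a TTL are eviction candidates, so keys written without a TTL
    # stay in the cache no matter how full it gets.
    r.config_set("maxmemory", "100mb")
    r.config_set("maxmemory-policy", "volatile-lru")

    r.set("homepage:banner", "pinned, never evicted (no TTL)")
    r.set("user:1001:profile", "ordinary cache entry", ex=3600)  # LRU candidate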

At this point, you should have a deeper understanding of the eviction policies in the Redis cache; you might as well try them out in practice. For more related content, keep following us and continue learning!
