

How to write your own LRU cache decorator in Python

2025-03-17 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/02 Report--

Many inexperienced developers are unsure how to write their own LRU cache decorator in Python. This article summarizes the problem and walks through a solution; hopefully it helps you work through it yourself.

The LRU (least recently used) caching algorithm evicts the cache entry that has gone unused for the longest time. Here we use an ordered dictionary to implement our own LRU cache and package it as a decorator.

1. First, create a file named my_cache.py to hold our own LRU cache implementation, with the following code:

```python
import time
from collections import OrderedDict

"""A cache dict based on the LRU (least recently used) algorithm,
to be wrapped as a decorator."""


class LRUCacheDict:
    def __init__(self, max_size=1024, expiration=60):
        self.max_size = max_size
        self.expiration = expiration
        self._cache = {}
        self._access_records = OrderedDict()  # records access time
        self._expire_records = OrderedDict()  # records expiration time

    def __setitem__(self, key, value):  # set a cache entry
        now = int(time.time())
        self.__delete__(key)  # drop any existing entry for this key
        self._cache[key] = value
        self._access_records[key] = now  # set access time
        self._expire_records[key] = now + self.expiration  # set expiration time
        self.cleanup()

    def __getitem__(self, key):  # read a cache entry and refresh its recency
        now = int(time.time())
        del self._access_records[key]    # remove the old access record...
        self._access_records[key] = now  # ...and re-insert it at the back
        self.cleanup()
        return self._cache[key]

    def __contains__(self, key):
        # dictionary protocol method, called by the `in` operator
        self.cleanup()
        return key in self._cache

    def __delete__(self, key):
        if key in self._cache:
            del self._cache[key]            # delete cached value
            del self._access_records[key]   # delete access record
            del self._expire_records[key]   # delete expiration record

    def cleanup(self):
        # remove expired entries and evict entries over max_size
        if self._expire_records is None:
            return None
        pending_delete_keys = []
        now = int(time.time())
        for k, v in self._expire_records.items():
            if v < now:  # entry has expired
                pending_delete_keys.append(k)
        for del_k in pending_delete_keys:
            self.__delete__(del_k)
        while len(self._cache) > self.max_size:
            # LRU happens here: the least recently used key sits at the
            # front of the ordered dict, so delete it first
            for k in self._access_records.keys():
                self.__delete__(k)
                break
```
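For comparison, the same size-based LRU eviction can be written more compactly with OrderedDict.move_to_end and popitem(last=False). This is a simplified sketch of the same idea (no expiration handling), not the author's code:

```python
from collections import OrderedDict


class SimpleLRUDict:
    """Minimal LRU dict: evicts the least recently used key past max_size."""

    def __init__(self, max_size=3):
        self.max_size = max_size
        self._data = OrderedDict()

    def __setitem__(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)  # refresh recency on overwrite
        self._data[key] = value
        while len(self._data) > self.max_size:
            self._data.popitem(last=False)  # pop the least recently used key

    def __getitem__(self, key):
        self._data.move_to_end(key)  # mark key as most recently used
        return self._data[key]


cache = SimpleLRUDict(max_size=2)
cache["a"] = 1
cache["b"] = 2
_ = cache["a"]   # "a" is now the most recently used
cache["c"] = 3   # evicts "b", the least recently used
print("b" in cache._data)  # -> False
print(cache["a"])          # -> 1
```

The trade-off is brevity versus features: the author's version above also tracks per-entry expiration, which this sketch omits.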

The logic of the code is simple, and the comments above are detailed; if something is unclear, read it a few more times. What actually implements the LRU behavior is the ordered dictionary OrderedDict: the first value you put in sits at the front of the dictionary. When a value is used, we delete and re-insert its access record, so frequently used entries migrate toward the back of the dictionary. Once the cache grows past the size limit, the key at the front of the ordered dictionary (that is, the least recently used one) is deleted along with its cached content, achieving the LRU logic.
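The delete-and-re-insert mechanism described above can be seen in isolation with a small, self-contained sketch (the keys and values here are arbitrary examples):

```python
from collections import OrderedDict

access = OrderedDict()
access["a"] = 1  # inserted first, so it starts at the front
access["b"] = 2
access["c"] = 3

# "Using" key "a": delete and re-insert so it moves to the back
del access["a"]
access["a"] = 1

# The front of the OrderedDict is now the least recently used key
lru_key = next(iter(access))
print(lru_key)  # -> "b"
```

This is exactly what `__getitem__` does to `_access_records`, and why `cleanup` can evict from the front of the dict.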

2. Next, turn the algorithm we wrote into a decorator:

```python
from functools import wraps

from my_cache import LRUCacheDict


def lru_cache(max_size=1024, expiration=60, types='LRU'):
    if types == 'lru' or types == 'LRU':
        my_cache = LRUCacheDict(max_size=max_size, expiration=expiration)
    # other values of `types` could select other caching algorithms

    def wrapper(func):
        @wraps(func)
        def inner(*args, **kwargs):
            # build a cache key from the call arguments
            key = repr(args) + repr(kwargs)
            try:
                result = my_cache[key]
            except KeyError:
                result = func(*args, **kwargs)
                my_cache[key] = result
            return result
        return inner
    return wrapper
```

What needs explaining here is the dictionary-style access my_cache[key]: it actually calls the __getitem__ method of LRUCacheDict, the same method dictionaries use to look up a value by key (the in operator likewise calls __contains__). I have also added a types parameter to the decorator, so you can plug in different caching algorithms according to your needs and enrich the decorator's functionality. An LRU cache is, in fact, already part of Python's standard library and can be used via functools.lru_cache.
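Since the article mentions the standard-library version, here is a quick, self-contained illustration of functools.lru_cache; the Fibonacci function and maxsize value are just an example:

```python
from functools import lru_cache


@lru_cache(maxsize=128)
def fib(n):
    """Naive recursive Fibonacci, made fast by caching."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)


print(fib(30))  # -> 832040
info = fib.cache_info()  # hit/miss statistics kept by the decorator
print(info.misses)       # -> 31, one miss per distinct n in 0..30
```

Unlike our hand-written version, functools.lru_cache has no time-based expiration, which is one reason you might still roll your own.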

After reading the above, have you mastered how to write your own LRU cache decorator in Python? If you want to learn more skills or dig deeper, you are welcome to follow the industry information channel. Thank you for reading!
