This article introduces how to understand the Python standard library's lru_cache. Many people run into trouble with this in practice, so I hope you read carefully and take something useful away from it!
A new level of acceleration: speed up your functions with just one line of code, using a simple caching feature.
Not long ago, I built a daily ETL pipeline that enriches input data by fetching additional data from an external service and then loads the results into a database.
As the input data grew, waiting for responses from the external server became the bottleneck and made the ETL process slower and slower. After some investigation, I found that there were only about 500 distinct input values compared to roughly 500,000 records in total. In other words, the external service was being called with each distinct parameter about 1,000 times.
Situations like this are the primary use case for caching. Caching a function means that the first time the function is evaluated for a given input, the input and the result are stored in a dictionary. On every subsequent call, the cache is checked first: if the result is already there, it is returned immediately and nothing needs to be recomputed. If not, the result is computed and the input/result pair is stored so it can be found on the next call.
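As a minimal sketch of that idea, a hand-rolled cache is just a dictionary keyed by the input. The function names and the slow call below are hypothetical stand-ins, not the actual ETL code:

import time

cache = {}

def expensive_external_call(value):
    # Stand-in for the slow external service; sleeps to simulate latency.
    time.sleep(0.1)
    return value * 2

def enrich(value):
    """Return the result for `value`, computing it only on the first call."""
    if value in cache:                       # already evaluated: reuse it
        return cache[value]
    result = expensive_external_call(value)  # first time: do the slow work
    cache[value] = result                    # remember the input/result pair
    return result

Calling enrich(7) a thousand times then costs one slow call plus 999 fast dictionary lookups.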
The Python standard library comes with many little-known but powerful packages. For this example, lru_cache from functools will be used. (LRU stands for "Least Recently Used": when the cache is full, it discards the least recently used input/result pairs and keeps the most recently used ones.)
Import lru_cache from fun(c)tools
Putting the c in parentheses is a bit of a bad joke, because functools then becomes fun tools, and caching is fun!
There is not much to explain here: import lru_cache and use it to decorate a function that generates Fibonacci numbers. Decorating the function means wrapping it with the caching logic, so whenever fib_cache is called, the cached wrapper is invoked instead of the raw function.
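The original code was shown as screenshots, so here is a reconstruction under that description: a plain recursive fib and a fib_cache wrapped by lru_cache. Only the name fib_cache comes from the article; the function bodies are assumed.

from functools import lru_cache

def fib(n):
    """Uncached, naive recursive Fibonacci."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

@lru_cache(maxsize=None)
def fib_cache(n):
    """Same recursion, but every computed value is kept in the cache."""
    if n < 2:
        return n
    return fib_cache(n - 1) + fib_cache(n - 2)

maxsize=None makes the cache unbounded; the default maxsize=128 would also be plenty for n up to 40.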
Let the race begin
In the experiment, I measured how long the cached and uncached versions of the function took to compute all Fibonacci numbers from 0 to 40 and collect the results in their respective lists.
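A simple way to run that comparison (the timing helper and variable names are mine, not from the article) is with time.perf_counter, reusing fib and fib_cache from the sketch above:

import time

def timed(func, n_max):
    """Compute func(0)..func(n_max); return the results and elapsed seconds."""
    start = time.perf_counter()
    results = [func(n) for n in range(n_max + 1)]
    return results, time.perf_counter() - start

uncached_results, uncached_time = timed(fib, 40)
cached_results, cached_time = timed(fib_cache, 40)
print(f"uncached: {uncached_time:.2f}s  cached: {cached_time:.6f}s")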
The winner
For small Fibonacci numbers there is no big difference, but once you reach around n = 30, the efficiency gains of the cached function start to accumulate. I did not have the patience to let the uncached version run beyond n = 40, because its runtime grows exponentially with n. For the cached version, the runtime grows only linearly.
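If you want to see the reuse directly, lru_cache also exposes statistics through cache_info(); the output below is illustrative for a single pass over n = 0..40 with the code above:

print(fib_cache.cache_info())
# e.g. CacheInfo(hits=78, misses=41, maxsize=None, currsize=41)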
"How to understand Python standard library Lru_cache" content is introduced here, thank you for reading. If you want to know more about industry-related knowledge, you can pay attention to the website. Xiaobian will output more high-quality practical articles for everyone!