2025-01-17 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/03 Report
This article explains how to handwrite a Proxy-based cache. The approach introduced here is simple, fast, and practical, so let's work through how to build one.
Project evolution
No project springs up fully formed. Here is how my Proxy-based cache evolved, in the hope that some of the ideas help you.
Add a cache to the Proxy handler
The handler parameter of a Proxy is itself an ordinary object, and since it is an object, we can attach extra data to it. That lets us write a memoize function with a Map cache stored on the handler, improving the performance of recursive algorithms.
```typescript
type TargetFun<V> = (...args: any[]) => V

function memoize<V>(fn: TargetFun<V>) {
  return new Proxy(fn, {
    // for now we can only ts-ignore this, or add a middle layer
    // to integrate the Proxy handler with a plain object
    // @ts-ignore -- attach the cache directly to the handler object
    cache: new Map<string, V>(),
    apply(target, thisArg, argsList) {
      // grab the Map cache from the handler itself
      const currentCache = (this as any).cache
      // build the key directly from the argument list
      let cacheKey = argsList.toString()
      // not cached yet: call through and store the result
      if (!currentCache.has(cacheKey)) {
        currentCache.set(cacheKey, target.apply(thisArg, argsList))
      }
      // return the cached data
      return currentCache.get(cacheKey)
    },
  })
}
```
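As a quick sanity check, here is a minimal, self-contained sketch of the same idea (the cache lives in a closure rather than on the handler object, purely so the snippet runs standalone) that counts how often the wrapped function actually executes:

```typescript
type TargetFun<V> = (...args: any[]) => V

// minimal Proxy-based memoize: cache kept in a closure for brevity
function memoize<V>(fn: TargetFun<V>): TargetFun<V> {
  const cache = new Map<string, V>()
  return new Proxy(fn, {
    apply(target, thisArg, argsList) {
      const cacheKey = argsList.toString()
      if (!cache.has(cacheKey)) {
        cache.set(cacheKey, target.apply(thisArg, argsList))
      }
      return cache.get(cacheKey)
    },
  })
}

let calls = 0
const square = (n: number): number => {
  calls++
  return n * n
}
const memoSquare = memoize(square)

console.log(memoSquare(4)) // 16
console.log(memoSquare(4)) // 16, served from the cache
console.log(calls)         // 1: the underlying function ran only once
```

The second call never reaches `square`; the apply trap answers from the Map instead.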
We can try memoizing the fibonacci function; the proxied version shows a dramatic (visible to the naked eye) performance improvement:
```typescript
const fibonacci = (n: number): number => (n <= 1 ? 1 : fibonacci(n - 1) + fibonacci(n - 2))
const memoizedFibonacci = memoize(fibonacci)
```

Caching also works for functions that return a Promise, but a rejected Promise must not stay in the cache. So in the apply trap we chain a catch onto thenable results:

```typescript
if (!currentCache.has(cacheKey)) {
  let result = target.apply(thisArg, argsList)
  if (result?.then) {
    result = Promise.resolve(result).catch(error => {
      // on error, delete the cached promise, otherwise later calls fail again;
      // because of async, this delete necessarily runs after the set below
      currentCache.delete(cacheKey)
      // propagate the error
      return Promise.reject(error)
    })
  }
  currentCache.set(cacheKey, result)
}
return currentCache.get(cacheKey)
```
At this point we can cache not only plain data but also Promise-based requests.
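A minimal sketch of the Promise-aware behaviour (the `memoizeAsync` name and the closure-held cache are simplifications for this snippet, not the library's API): two calls with the same arguments share one in-flight promise, and a rejected promise is evicted so the next call can retry.

```typescript
type AsyncFun<V> = (...args: any[]) => Promise<V>

function memoizeAsync<V>(fn: AsyncFun<V>): AsyncFun<V> {
  const cache = new Map<string, Promise<V>>()
  return new Proxy(fn, {
    apply(target, thisArg, argsList) {
      const cacheKey = argsList.toString()
      if (!cache.has(cacheKey)) {
        // chain a catch so a rejected promise is removed and can be retried
        const result = target.apply(thisArg, argsList).catch((error: unknown) => {
          cache.delete(cacheKey)
          return Promise.reject(error)
        })
        cache.set(cacheKey, result)
      }
      return cache.get(cacheKey)!
    },
  })
}

let requests = 0
const fetchDouble = (id: number): Promise<number> => {
  requests++
  return Promise.resolve(id * 2)
}
const memoFetch = memoizeAsync(fetchDouble)

const p1 = memoFetch(21)
const p2 = memoFetch(21)
console.log(p1 === p2) // true: both callers share one in-flight promise
console.log(requests)  // 1
```

Because the promise itself is cached, concurrent callers are deduplicated, not just completed results.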
Add expiration-based deletion
We can attach the current timestamp to each cache entry when the data is stored, then compare it against the clock on read.
```typescript
// cache entry
export default class ExpiredCacheItem<V> {
  data: V
  cacheTime: number

  constructor(data: V) {
    this.data = data
    // record the system timestamp when the data is stored
    this.cacheTime = new Date().getTime()
  }
}

// in the Map-cache middle layer: decide whether an entry has timed out
isOverTime(name: string) {
  const data = this.cacheMap.get(name)
  // no data (the stored values are ExpiredCacheItem instances), so treat it as timed out
  if (!data) return true
  // current system timestamp
  const currentTime = new Date().getTime()
  // elapsed time between now and the moment of storage
  const overTime = currentTime - data.cacheTime
  // if the elapsed time exceeds the configured timeout, drop the entry
  // so the caller goes back to the server for fresh data
  if (Math.abs(overTime) > this.timeout) {
    this.cacheMap.delete(name)
    return true
  }
  return false
}
```
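The timeout check can be exercised in isolation. The following is a self-contained sketch under the same scheme (the `ExpiredCache` wrapper class is invented here for illustration); we back-date an entry's timestamp to simulate the passage of time:

```typescript
// cache entry carrying its creation timestamp
class ExpiredCacheItem<V> {
  data: V
  cacheTime: number
  constructor(data: V) {
    this.data = data
    this.cacheTime = new Date().getTime()
  }
}

// minimal middle layer with the isOverTime check
class ExpiredCache<V> {
  cacheMap = new Map<string, ExpiredCacheItem<V>>()
  constructor(readonly timeout: number) {}

  set(name: string, data: V) {
    this.cacheMap.set(name, new ExpiredCacheItem(data))
  }

  // true when the entry is missing or older than `timeout` milliseconds
  isOverTime(name: string): boolean {
    const item = this.cacheMap.get(name)
    if (!item) return true
    const overTime = new Date().getTime() - item.cacheTime
    return Math.abs(overTime) > this.timeout
  }
}

const cache = new ExpiredCache<string>(1000)
cache.set('fresh', 'hello')
const freshBefore = cache.isOverTime('fresh')   // false: just written
const missing = cache.isOverTime('missing')     // true: never written
cache.cacheMap.get('fresh')!.cacheTime -= 5000  // back-date the entry by 5s
const freshAfter = cache.isOverTime('fresh')    // true: older than the timeout
console.log(freshBefore, missing, freshAfter)
```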
At this point we have covered all the features described in the previous blog post. But it would be a shame to stop here, so let's keep borrowing features from other libraries to improve this one.
Add manual management
Generally speaking, caching libraries offer manual management, so I also expose manual cache management for business code. Here we use the Proxy get trap to intercept property reads.
```typescript
return new Proxy(fn, {
  // @ts-ignore
  cache,
  get: (target: TargetFun<V>, property: string) => {
    // if manual management is configured
    if (options?.manual) {
      const manualTarget = getManualActionObjFormCache(cache)
      // if the accessed property exists on the manual-management object, use it
      // directly and never fall back to the original object. Even if the function
      // itself has a property or method of the same name, it is ignored -- after
      // all, you asked for manual management.
      if (property in manualTarget) {
        return manualTarget[property]
      }
    }
    // manual management is not configured; access the original object directly
    return target[property]
  },
})

export default function getManualActionObjFormCache<V>(
  cache: MemoizeCache<V>
): CacheMap<string | object, V> {
  const manualTarget = Object.create(null)
  // ...
}
```
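Here is a runnable sketch of the idea, with a simplified stand-in for `getManualActionObjFormCache` (the `memoizeManual` name and the exposed `has`/`delete`/`clear` methods are illustrative assumptions, not the library's exact surface):

```typescript
type AnyFun = (...args: any[]) => any

// simplified stand-in for the library's getManualActionObjFormCache helper
function getManualActionObjFormCache(cache: Map<string, any>) {
  const manualTarget: Record<string, AnyFun> = Object.create(null)
  manualTarget.has = (key: string) => cache.has(key)
  manualTarget.delete = (key: string) => cache.delete(key)
  manualTarget.clear = () => cache.clear()
  return manualTarget
}

function memoizeManual(fn: AnyFun) {
  const cache = new Map<string, any>()
  const manualTarget = getManualActionObjFormCache(cache)
  return new Proxy(fn, {
    get: (target, property) => {
      // manual-management methods shadow the function's own properties
      if (typeof property === 'string' && property in manualTarget) {
        return manualTarget[property]
      }
      return (target as any)[property]
    },
    apply(target, thisArg, argsList) {
      const key = argsList.toString()
      if (!cache.has(key)) cache.set(key, target.apply(thisArg, argsList))
      return cache.get(key)
    },
  }) as AnyFun & Record<string, AnyFun>
}

const memoAdd = memoizeManual((a: number, b: number) => a + b)
const sum = memoAdd(1, 2)               // 3
const cachedBefore = memoAdd.has('1,2') // true: the entry is now cached
memoAdd.delete('1,2')
const cachedAfter = memoAdd.has('1,2')  // false: removed by hand
console.log(sum, cachedBefore, cachedAfter)
```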
The current case is simple enough to access the target directly; in more complex cases, Reflect is still recommended.
Add WeakMap
When using the cache, we can also offer a WeakMap backend (note that WeakMap has no clear or size methods). Here I extract a BaseCache base class.
```typescript
export default class BaseCache<V> {
  readonly weak: boolean
  cacheMap: MemoizeCache<V>

  constructor(weak: boolean = false) {
    // whether to use a WeakMap
    this.weak = weak
    this.cacheMap = this.getMapOrWeakMapByOption()
  }

  // get a Map or a WeakMap according to the configuration
  getMapOrWeakMapByOption(): Map<string, V> | WeakMap<object, V> {
    return this.weak ? new WeakMap() : new Map()
  }
}
```
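A tiny self-contained check of this selection logic (with the `MemoizeCache` alias inlined as a union type so the snippet runs on its own):

```typescript
// minimal BaseCache: choose the backing store from the `weak` flag
class BaseCache {
  readonly weak: boolean
  cacheMap: Map<any, any> | WeakMap<object, any>

  constructor(weak: boolean = false) {
    this.weak = weak
    this.cacheMap = this.getMapOrWeakMapByOption()
  }

  // WeakMap when `weak` is set, otherwise a regular Map
  getMapOrWeakMapByOption(): Map<any, any> | WeakMap<object, any> {
    return this.weak ? new WeakMap() : new Map()
  }
}

const strong = new BaseCache()
const weak = new BaseCache(true)
console.log(strong.cacheMap instanceof Map)   // true
console.log(weak.cacheMap instanceof WeakMap) // true
```

WeakMap keys must be objects, which is why the key types differ between the two branches.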
After that, I added the various concrete cache classes on top of it.
Add cleanup function
Values may need cleanup when they are deleted from the cache, so the user can provide a dispose function. This class inherits from BaseCache and invokes dispose at the right times.
```typescript
export const defaultDispose: DisposeFun<any> = () => void 0

export default class BaseCacheWithDispose<V> extends BaseCache<V> {
  readonly weak: boolean
  readonly dispose: DisposeFun<V>

  constructor(weak: boolean = false, dispose: DisposeFun<V> = defaultDispose) {
    super(weak)
    this.weak = weak
    this.dispose = dispose
  }

  // clean up a single value (called before delete)
  disposeValue(value: V | undefined): void {
    if (value) {
      this.dispose(value)
    }
  }

  // clean up all values (called before the clear method,
  // provided the current map has an iterator)
  disposeAllValue(cacheMap: MemoizeCache<V>): void {
    for (let mapValue of (cacheMap as any)) {
      this.disposeValue(mapValue?.[1])
    }
  }
}
```
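A minimal sketch of the dispose contract (the `DisposableCache` class here is an illustration, not the library's class): the hook runs on each value before it leaves the cache, whether via delete or clear.

```typescript
type DisposeFun<V> = (value: V) => void

class DisposableCache<V> {
  private cacheMap = new Map<string, V>()
  constructor(private dispose: DisposeFun<V> = () => void 0) {}

  set(key: string, value: V) {
    this.cacheMap.set(key, value)
  }

  delete(key: string): boolean {
    const value = this.cacheMap.get(key)
    // clean up before removal, as BaseCacheWithDispose does
    if (value !== undefined) this.dispose(value)
    return this.cacheMap.delete(key)
  }

  clear() {
    // dispose every remaining value, then drop the map contents
    for (const [, value] of this.cacheMap) this.dispose(value)
    this.cacheMap.clear()
  }
}

const disposed: string[] = []
const cache = new DisposableCache<string>(v => disposed.push(v))
cache.set('a', 'conn-1')
cache.set('b', 'conn-2')
cache.delete('a') // disposes conn-1
cache.clear()     // disposes conn-2
console.log(disposed) // ['conn-1', 'conn-2']
```

This is the pattern used for values holding external resources, such as sockets or file handles.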
If the current cache is a WeakMap, there are no clear method and no iterator. Personally, I would like to add a middle layer to handle all of this (still thinking about it, not done yet). If clear is called on a WeakMap-backed cache, I simply swap in a new WeakMap.
```typescript
clear() {
  if (this.weak) {
    this.cacheMap = this.getMapOrWeakMapByOption()
  } else {
    this.disposeAllValue(this.cacheMap)
    this.cacheMap.clear!()
  }
}
```

Add a reference count
While studying the memoizee library, I came across the following usage:
```typescript
memoized = memoize(fn, { refCounter: true })

memoized("foo", 3)           // refs: 1
memoized("foo", 3)           // cache hit, refs: 2
memoized("foo", 3)           // cache hit, refs: 3
memoized.deleteRef("foo", 3) // refs: 2
memoized.deleteRef("foo", 3) // refs: 1
memoized.deleteRef("foo", 3) // refs: 0, foo's cache is cleared
memoized("foo", 3)           // re-executed, refs: 1
```
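The behaviour above can be sketched with a minimal reference-counting cache (the `RefCountCache` class and its `getOrCreate`/`refs` methods are invented here to isolate the counting logic):

```typescript
class RefCountCache<V> {
  private cacheMap = new Map<string, V>()
  private cacheRef = new Map<string, number>()

  // return the cached value, creating it on first use; bump refs on every call
  getOrCreate(key: string, create: () => V): V {
    if (!this.cacheMap.has(key)) {
      this.cacheMap.set(key, create())
      this.cacheRef.set(key, 1)
    } else {
      this.cacheRef.set(key, (this.cacheRef.get(key) ?? 0) + 1)
    }
    return this.cacheMap.get(key)!
  }

  refs(key: string): number {
    return this.cacheRef.get(key) ?? 0
  }

  deleteRef(key: string): void {
    const refs = this.refs(key) - 1
    if (refs <= 0) {
      // last reference gone: drop the cached value entirely
      this.cacheMap.delete(key)
      this.cacheRef.delete(key)
    } else {
      this.cacheRef.set(key, refs)
    }
  }
}

const cache = new RefCountCache<number>()
const v = cache.getOrCreate('foo,3', () => 42) // refs: 1
cache.getOrCreate('foo,3', () => 42)           // cache hit, refs: 2
cache.getOrCreate('foo,3', () => 42)           // cache hit, refs: 3
const refsAfterThree = cache.refs('foo,3')     // 3
cache.deleteRef('foo,3')                       // refs: 2
cache.deleteRef('foo,3')                       // refs: 1
cache.deleteRef('foo,3')                       // refs: 0, entry removed
const refsAfterDelete = cache.refs('foo,3')    // 0
console.log(v, refsAfterThree, refsAfterDelete)
```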
So I followed suit and added a RefCache.
```typescript
export default class RefCache<V> extends BaseCacheWithDispose<V> implements CacheMap<string | object, V> {
  // reference-count storage
  cacheRef: MemoizeCache<number>

  constructor(weak: boolean = false, dispose: DisposeFun<V> = () => void 0) {
    super(weak, dispose)
    // create a WeakMap or Map according to the configuration
    this.cacheRef = this.getMapOrWeakMapByOption()
  }

  // get, has, clear, etc. are omitted here

  delete(key: string | object): boolean {
    this.disposeValue(this.get(key))
    this.cacheRef.delete(key)
    this.cacheMap.delete(key)
    return true
  }

  set(key: string | object, value: V): this {
    this.cacheMap.set(key, value)
    // every set also bumps the reference count
    this.addRef(key)
    // ...
  }
}
```
At the same time, modify the proxy main function:
```typescript
if (!currentCache.has(cacheKey)) {
  let result = target.apply(thisArg, argsList)
  if (result?.then) {
    result = Promise.resolve(result).catch(error => {
      currentCache.delete(cacheKey)
      return Promise.reject(error)
    })
  }
  currentCache.set(cacheKey, result)
} else if (options?.refCounter) {
  // refCounter is configured and the value is already cached:
  // just bump the reference count
  currentCache.addRef?.(cacheKey)
}
```

Add LRU
LRU stands for Least Recently Used: when the cache is full, the entries used least recently are evicted first. Compared with other caching strategies, LRU is undoubtedly more effective.
Here we add a max option alongside maxAge (I use two Maps to implement the LRU; this costs some extra memory, but performs better).
When the number of stored items reaches max, we simply make the current cacheMap the oldCacheMap and create a fresh cacheMap.
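Before diving into the real implementation, the whole two-Map rotation can be sketched end to end (the `TwoMapLRU` class is an illustration; timestamps and maxAge are omitted so the rotation itself stays visible):

```typescript
class TwoMapLRU<V> {
  private cacheMap = new Map<string, V>()
  private oldCacheMap = new Map<string, V>()
  private size = 0

  constructor(private max: number) {}

  set(key: string, value: V): this {
    if (this.cacheMap.has(key)) {
      this.cacheMap.set(key, value)
    } else {
      this._set(key, value)
    }
    return this
  }

  private _set(key: string, value: V) {
    this.cacheMap.set(key, value)
    this.size++
    if (this.size >= this.max) {
      // rotation: the current map becomes the "old" generation
      this.size = 0
      this.oldCacheMap = this.cacheMap
      this.cacheMap = new Map()
    }
  }

  get(key: string): V | undefined {
    if (this.cacheMap.has(key)) return this.cacheMap.get(key)
    if (this.oldCacheMap.has(key)) {
      const value = this.oldCacheMap.get(key)!
      // promote a recently used entry back into the new generation
      this.oldCacheMap.delete(key)
      this._set(key, value)
      return value
    }
    return undefined
  }
}

const lru = new TwoMapLRU<number>(2)
lru.set('a', 1).set('b', 2) // rotation: a and b move to the old generation
lru.set('c', 3)
const a = lru.get('a') // 1: promoted from the old generation (triggers another rotation)
const b = lru.get('b') // undefined: evicted by the second rotation
console.log(a, b)
```

Entries that keep getting read keep getting promoted; entries left behind in the old generation disappear on the next rotation.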
```typescript
set(key: string | object, value: V) {
  const itemCache = new ExpiredCacheItem<V>(value)
  // if the key already exists, simply overwrite it; otherwise go through _set
  this.cacheMap.has(key)
    ? this.cacheMap.set(key, itemCache)
    : this._set(key, itemCache)
  return this
}

private _set(key: string | object, value: ExpiredCacheItem<V>) {
  this.cacheMap.set(key, value)
  this.size++
  if (this.size >= this.max) {
    this.size = 0
    this.oldCacheMap = this.cacheMap
    this.cacheMap = this.getMapOrWeakMapByOption()
  }
}
```
The key point is reading data. If the current cacheMap holds an unexpired value, return it directly. Otherwise look in oldCacheMap: if found and not expired, delete the old entry and move it into the new map (using the _set method); if not found, return undefined.
```typescript
get(key: string | object): V | undefined {
  // if it exists in cacheMap, return the value directly
  if (this.cacheMap.has(key)) {
    const item = this.cacheMap.get(key)
    return this.getItemValue(key, item!)
  }
  // if it exists in oldCacheMap
  if (this.oldCacheMap.has(key)) {
    const item = this.oldCacheMap.get(key)
    // not expired
    if (!this.deleteIfExpired(key, item!)) {
      // move it into the new map and delete the old entry
      this.moveToRecent(key, item!)
      return item!.data as V
    }
  }
  return undefined
}

private moveToRecent(key: string | object, item: ExpiredCacheItem<V>) {
  // delete the old entry
  this.oldCacheMap.delete(key)
  // write it into the new map (which may trigger a rotation)
  this._set(key, item)
}
```

Organize the memoize function
At this point, we can free ourselves from the previous code details and take a look at the interfaces and main functions based on these functions.
```typescript
// program to the interface, independent of the concrete cache class
export interface BaseCacheMap<K, V> {
  delete(key: K): boolean
  get(key: K): V | undefined
  has(key: K): boolean
  set(key: K, value: V): this
  clear?(): void
  addRef?(key: K): void
  deleteRef?(key: K): boolean
}

// cache configuration
export interface MemoizeOptions<V> {
  /** serialize the arguments into a cache key */
  normalizer?: (args: any[]) => string
  /** whether to use a WeakMap */
  weak?: boolean
  /** maximum age in milliseconds; expired entries are deleted */
  maxAge?: number
  /** maximum number of items; excess entries are deleted */
  max?: number
  // ...
}
```
The final memoize function is actually quite similar to the original one: it only does three things.
1. Check the parameters and throw errors
2. Get the appropriate cache according to the parameters
3. Return the Proxy
```typescript
export default function memoize<V>(fn: TargetFun<V>, options?: MemoizeOptions<V>): ResultFun<V> {
  // check the parameters and throw errors if needed
  checkOptionsThenThrowError(options)
  // resolve the serialization function
  const normalizer = options?.normalizer ?? generateKey
  // get the appropriate cache for these options
  let cache: MemoizeCache<V> = getCacheByOptions(options)
  // return the Proxy
  return new Proxy(fn, {
    // @ts-ignore
    cache,
    get: (target: TargetFun<V>, property: string) => {
      // manual management, if configured
      if (options?.manual) {
        const manualTarget = getManualActionObjFormCache(cache)
        if (property in manualTarget) {
          return manualTarget[property]
        }
      }
      return target[property]
    },
    apply(target, thisArg, argsList: any[]): V {
      // ... cache lookup as shown earlier
    },
  })
}
```
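To make the three steps concrete, here is a hypothetical end-to-end sketch (the `miniMemoize` name is invented, and only the normalizer option is wired up for brevity):

```typescript
interface MiniMemoizeOptions {
  normalizer?: (args: any[]) => string
}

function miniMemoize<V>(fn: (...args: any[]) => V, options?: MiniMemoizeOptions) {
  // 1. check the parameters and throw errors
  if (options?.normalizer !== undefined && typeof options.normalizer !== 'function') {
    throw new TypeError('normalizer must be a function')
  }
  // 2. get the appropriate cache (a plain Map here)
  const cache = new Map<string, V>()
  const normalizer = options?.normalizer ?? ((args: any[]) => args.toString())
  // 3. return the Proxy
  return new Proxy(fn, {
    apply(target, thisArg, argsList) {
      const key = normalizer(argsList)
      if (!cache.has(key)) cache.set(key, target.apply(thisArg, argsList))
      return cache.get(key)
    },
  })
}

let calls = 0
const join = (...parts: string[]) => {
  calls++
  return parts.join('/')
}
// custom normalizer: key on the first argument only
const memoJoin = miniMemoize(join, { normalizer: args => String(args[0]) })
const r1 = memoJoin('a', 'b') // "a/b"
const r2 = memoJoin('a', 'c') // "a/b": same first argument, so it's a cache hit
console.log(r1, r2, calls)    // calls === 1
```

The normalizer controls cache identity: here two different argument lists collide on purpose, which shows why choosing the key function carefully matters.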
The complete code is in memoizee-proxy. Feel free to clone it and experiment on your own.
Next step
Testing
Test coverage isn't everything, but while implementing this library the Jest testing library helped me a lot. It made me rethink the functionality and parameter validation every class and function should have. Previously I only validated input at the project's main entry, without thinking hard about the parameters of each class or function; that isn't robust enough, because you can't control how users will call your library.
Proxy in depth
In fact, the application scenarios of proxies are unlimited. Ruby has already proven this (see "Metaprogramming Ruby").
Developers can use it to create a variety of coding patterns, such as (but not limited to) tracking property access, hiding properties, preventing properties from being modified or deleted, function parameter validation, constructor parameter validation, data binding, and observable objects.
Proxy arrived with ES6, and the API still requires fairly modern browsers; proxy-polyfill exists, but its functionality is limited. Still, it is already 2021, and I believe it is time to get familiar with Proxy.
Caching in depth
Caching is harmful! There is no doubt about that. But it's so fast! So we need a better understanding of the business: which data needs caching, and which data is safe to cache.
The current cache only wraps a single method. Could future work combine cached entries to return data at a finer granularity? Or, looking further up the stack, should a full caching layer be written?
Small-step development
While developing this project, I worked in small steps and reworked constantly. The initial code only went as far as the expiration-based deletion stage.
But every time I finished a new feature, I reorganized the library's logic and flow, trying to make the code elegant each pass, since I don't have the ability to get everything right the first time. I hope to keep improving at this in future work; it also reduces code rework.
Other
Function creation
In fact, when I added manual management to the library, I considered copying the function directly, because a function is itself an object, and then attaching methods such as set to the copy. But there is no way to copy the scope chain.
Although that attempt failed, I learned something. Here are two ways to create functions.
We usually create functions with new Function, but the browser provides no direct constructor for async functions, so we have to grab it manually.
```typescript
const AsyncFunction = (async (x: any) => x).constructor as any
const foo = new AsyncFunction('x', 'y', 'p', 'return x + y + await p')

foo(1, 2, Promise.resolve(3)).then(console.log) // 6
```
For globally defined functions, we can also rebuild a function from fn.toString(); this way, even async functions can be constructed directly.
```typescript
function cloneFunction<T>(fn: (...args: any[]) => T): (...args: any[]) => T {
  return new Function('return ' + fn.toString())()
}
```

At this point, I believe you have a deeper understanding of how to handwrite a Proxy-based cache library. Why not try it out in practice?