
Key Knowledge Points of Java Caching

2025-03-18 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

Today I would like to share some knowledge points about caching in Java. The content is detailed and the logic is clear. I believe most people still know too little about this topic, so I am sharing this article for your reference; I hope you get something out of it after reading.

1. What can a cache be used for?

Most people's understanding of caching is: when a page or an app opens slowly, we want to introduce a cache so that it opens faster.

From a technical point of view, caching improves access speed because it is based on memory, and memory reads and writes are far faster than disk reads and writes. Using memory instead of the hard disk as the read/write medium therefore naturally speeds up data access greatly.

2. Two modes of application: pre-reading and delayed writing

In addition to the scenario above, caching has two other important uses: pre-reading and delayed writing.

2.1 Pre-reading

Literally, it means reading in advance, and that is indeed what it does: pre-reading loads part of the data from the hard disk into memory ahead of time, when the system starts, and only then begins serving requests.

So what's the point of doing this?

Because some systems face tens of thousands of requests pouring in the moment they start. If those requests went straight to the database, the pressure on the database would spike, the database might go down, and it would no longer be able to respond normally.

Pre-reading exists to solve exactly this kind of problem.
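
As a rough illustration, here is a minimal warm-up sketch in Java. It assumes a hypothetical `ProductDao` data-access layer and `Product` entity; the point is simply that hot data is loaded into memory once at startup, before traffic arrives.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal pre-reading (cache warm-up) sketch. Product and ProductDao are
// hypothetical stand-ins for your real entity and data-access layer.
record Product(long id, String name) {}

interface ProductDao {
    List<Product> findHotProducts(); // e.g. the products accessed most often
    Product findById(long id);
}

class ProductCacheWarmer {
    private final Map<Long, Product> cache = new ConcurrentHashMap<>();
    private final ProductDao dao;

    ProductCacheWarmer(ProductDao dao) {
        this.dao = dao;
    }

    // Run once at startup, before traffic is let in, so the first wave of
    // requests is served from memory instead of hitting the database.
    void warmUp() {
        for (Product p : dao.findHotProducts()) {
            cache.put(p.id(), p);
        }
    }

    // Normal reads fall back to the database only on a cache miss.
    Product get(long id) {
        return cache.computeIfAbsent(id, dao::findById);
    }
}
```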

2.2 Delayed writing

If pre-reading adds a buffer at the data exit, then delayed writing adds a buffer at the data entrance.

The database writes more slowly than it reads, because a series of mechanisms is needed to guarantee data correctness while writing. So if you want to improve write speed, you either shard the database and tables, or add a buffer through a cache and then write to disk in batches. Sharding is far more complex to introduce than a cache, so the caching scheme is generally preferred.

This caching scheme is delayed writing: data that is destined for disk or the database is first written temporarily to memory, success is returned to the caller, and the data in memory is later written to disk in batches.
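
A minimal write-behind sketch of this idea follows, assuming a hypothetical `OrderDao` with a batch insert; the flush interval and batch size are illustrative values.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;

// Minimal delayed-write (write-behind) sketch: writes are acknowledged after
// landing in an in-memory queue, and a background task flushes them to the
// database in batches. Order and OrderDao are hypothetical placeholders.
record Order(long id, String payload) {}

interface OrderDao {
    void batchInsert(List<Order> orders);
}

class WriteBehindBuffer {
    private final BlockingQueue<Order> queue = new LinkedBlockingQueue<>();
    private final OrderDao dao;
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    WriteBehindBuffer(OrderDao dao) {
        this.dao = dao;
        // Flush the buffer every second; tune the interval and batch size
        // to your own write volume and durability requirements.
        scheduler.scheduleWithFixedDelay(this::flush, 1, 1, TimeUnit.SECONDS);
    }

    // Returns as soon as the order is buffered in memory.
    void save(Order order) {
        queue.offer(order);
    }

    private void flush() {
        List<Order> batch = new ArrayList<>();
        queue.drainTo(batch, 500);      // at most 500 rows per batch
        if (!batch.isEmpty()) {
            dao.batchInsert(batch);     // one bulk write instead of many small ones
        }
    }
}
```

The trade-off is visible in the code: anything still sitting in the queue is lost if the process crashes before a flush, which is exactly the price paid for the faster acknowledgement.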

3. What should be cached?

Before adding a cache, we need to ask: what should be cached? What characteristics must data have to be worth caching? Caching comes at an extra cost, so it should only be added where it can show its value.

Two criteria are commonly used to judge whether data is worth caching:

Hot data: accessed at high frequency, for example dozens of times per second.

Static data: rarely changed; reads far outnumber writes.

Each point where a cache is set up blocks some of the traffic, producing a funnel effect that protects the systems behind it and, ultimately, the database.

These points act like traffic lights: without them, accidents or gridlock are likely. A cache point exists to prevent a flood of requests from pouring in and making normal access impossible.

4. Cache categories

Caches can sit at several different layers along the request path; we will introduce them one by one.

4.1 Browser caching

The browser is closest to the user, and using it as a cache borrows the user's own resources, so it has the best cost-performance ratio of the options here: it lets users share part of the load.

Open the browser's developer tools and labels such as from cache, from memory cache, and from disk cache indicate that the data has been cached on the user's own device; this is also why some content can still be viewed when there is no network.

The browser handles this process for us, and it is generally suited to caching static resources such as images, JS, and CSS.
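
On the server side, the usual way to influence this is through HTTP cache headers. A small sketch follows, assuming the javax.servlet API (with newer containers the package is jakarta.servlet); the resource and max-age value are illustrative.

```java
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;

// Sketch of how a server hints the browser to cache a static resource.
public class StaticResourceServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // Let the browser keep this response for one day (from disk cache / memory cache).
        resp.setHeader("Cache-Control", "public, max-age=86400");
        resp.setContentType("text/css");
        resp.getWriter().write("/* ...static css content... */");
    }
}
```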

The drawback of the browser cache is that we have little control over it: unless the user initiates a new request, we cannot actively update the data.

4.2 CDN caching

CDN service providers deploy large numbers of nodes across the country and even around the world. We can distribute data to these servers as a cache, and when users access it they read the cached copy from the nearest node, which spreads the load and improves the acceleration effect.

Note that because there are so many nodes, updating cached data is slow, generally at the minute level at best, so CDN caching is only suitable for static data that rarely changes.

4.3 Gateway (proxy) caching

We often add a gateway layer in front of the origin server, to provide security mechanisms or to act as an entry point for routing and traffic-splitting policies.

Setting up a cache here to intercept requests also benefits the origin server behind it greatly, saving a lot of CPU work. Commonly used gateway caches are Varnish, Squid, and Nginx.

4.4 In-process caching

A request that makes it this far is "business-related" and needs business logic to compute a result. From here on, the cost of introducing a cache is much higher than in the previous three layers, because the requirement for data consistency between the cache and the database is stricter.
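
A simple in-process cache might look like the sketch below. It uses the Caffeine library (an assumed dependency, com.github.ben-manes.caffeine:caffeine); a plain ConcurrentHashMap would also work, but offers no eviction or expiry. The loader method stands in for a real database query.

```java
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;
import java.time.Duration;

// In-process cache sketch: data lives in the same JVM as the business logic.
public class UserNameCache {
    private final Cache<Long, String> cache = Caffeine.newBuilder()
            .maximumSize(10_000)                       // bound memory usage
            .expireAfterWrite(Duration.ofMinutes(5))   // stale entries drop out on their own
            .build();

    // Hypothetical loader standing in for a real database query.
    private String loadFromDatabase(long userId) {
        return "user-" + userId;
    }

    public String getUserName(long userId) {
        // Compute-on-miss: the value is loaded once and then served from memory.
        return cache.get(userId, this::loadFromDatabase);
    }
}
```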

4.5 Out-of-process caching

This is the layer most programmers are familiar with, for example Redis and Memcached; you can also write a separate program of your own to hold cached data and serve it to other programs via remote calls.
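
For illustration, here is a minimal Redis example using the Jedis client (an assumed dependency, redis.clients:jedis); the host, port, and key names are placeholders.

```java
import redis.clients.jedis.Jedis;

// Out-of-process cache sketch: the data lives in a Redis server that any
// process on the network can read and write.
public class RedisCacheExample {
    public static void main(String[] args) {
        try (Jedis jedis = new Jedis("localhost", 6379)) {
            // Write a value with a 60-second time-to-live.
            jedis.setex("user:42:name", 60, "Alice");

            // Any process that can reach the Redis server sees the same data.
            String name = jedis.get("user:42:name");
            System.out.println(name);
        }
    }
}
```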

4.6 Database caching

Database caching is the database's own internal mechanism; it usually just exposes configuration for the cache size and lets you intervene that way.

Finally, the disk itself also has a cache, so data goes through quite a few twists and turns before it is finally written safely to the disk.

5. Problems that may arise in caching

Since caching is so powerful, is more caching always better? Can every slowdown be fixed by adding a cache? In fact, caching has a negative side as well as a positive one.

5.1 Cache avalanche

Problem: when a large number of requests hit the cache concurrently and, for some reason, the cache fails to do its job (even for a very short time), all of those requests fall through to the database, putting it under excessive pressure.

Solution: this can be addressed with a "lock plus queue" approach, or by "adding a random value to the cache expiry time".
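
The random-expiry idea is easy to show in a short sketch: instead of giving every key the same TTL (so they all expire at the same moment), each key gets a slightly different one. The base TTL and jitter range below are illustrative.

```java
import java.util.concurrent.ThreadLocalRandom;

// Spread cache expiry times out so keys do not all expire at once.
public class TtlJitter {

    // Base TTL plus up to 300 extra seconds of random jitter.
    static int ttlWithJitter(int baseSeconds) {
        return baseSeconds + ThreadLocalRandom.current().nextInt(0, 301);
    }

    public static void main(String[] args) {
        for (int i = 0; i < 5; i++) {
            // Each key would get a slightly different expiry,
            // e.g. passed to jedis.setex(key, ttl, value).
            System.out.println("ttl = " + ttlWithJitter(600));
        }
    }
}
```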

5.2 Cache penetration

It is similar to a cache avalanche, except that penetration lasts longer, because each cache miss still fails to load any data from the data source into the cache, so misses keep being generated.

This kind of problem can be solved with a "Bloom filter" or by "caching empty objects".
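
A sketch of the Bloom-filter approach is below, using Guava (an assumed dependency, com.google.guava:guava). Keys that certainly do not exist are rejected before they reach the cache or the database; the sizing numbers and the lookup method are illustrative.

```java
import com.google.common.hash.BloomFilter;
import com.google.common.hash.Funnels;
import java.nio.charset.StandardCharsets;

// Reject keys that definitely do not exist before they cause cache misses.
public class BloomFilterGuard {
    // Sized for roughly one million known keys with a 1% false-positive rate.
    private final BloomFilter<String> knownKeys = BloomFilter.create(
            Funnels.stringFunnel(StandardCharsets.UTF_8), 1_000_000, 0.01);

    // Call this for every key that really exists (e.g. when loading or inserting data).
    public void register(String key) {
        knownKeys.put(key);
    }

    public String get(String key) {
        if (!knownKeys.mightContain(key)) {
            // Definitely absent: answer immediately, no cache miss, no database hit.
            return null;
        }
        // Possibly present (false positives happen): fall through to the normal
        // cache-then-database lookup, here a hypothetical placeholder.
        return lookupFromCacheOrDb(key);
    }

    private String lookupFromCacheOrDb(String key) {
        return "value-for-" + key; // placeholder for the real lookup
    }
}
```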

5.3 Cache concurrency

How do we guarantee business correctness when the data under one cached key is being set concurrently? And what if in-process caches, out-of-process caches, and the database are all used together?

Use the "DB before caching" approach, and the cache operates as delete instead of set.

5.4 Cache bottomless pit

Although a distributed cache can be scaled out horizontally without limit, more nodes in the cluster is not always better: caching, too, obeys the law of diminishing marginal utility.

5.5 Cache eviction

Memory capacity is limited, so if the amount of data is large, an eviction policy suited to the situation is needed, for example: LRU, LFU, FIFO, and so on.
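
As a small illustration of LRU, the JDK's LinkedHashMap can be turned into a bounded least-recently-used cache with only a few lines; the capacity is whatever you choose.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal LRU eviction sketch built on LinkedHashMap's access-order mode:
// once the map grows past the capacity, the least recently used entry is dropped.
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LruCache(int capacity) {
        super(16, 0.75f, true);   // accessOrder = true keeps recently used entries last
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict once we exceed the capacity
    }
}
```

For example, an LruCache<Long, String> of capacity 100 silently drops the entry that has gone unused the longest as soon as the 101st distinct key is inserted.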

That is all of the content of "Key Knowledge Points of Java Caching". Thank you for reading! I believe you will have gained something from this article.
