
Garbage Collection Criteria and Memory-Related Parameters in the JVM

2025-01-19 Update From: SLTechnology News&Howtos


This article explains the criteria the JVM uses to decide what to garbage collect and introduces the main memory-related JVM parameters. The content is kept simple and clear and is easy to learn; follow the editor's train of thought and study it step by step.

Criteria for garbage collection in JVM

The ultimate goal is to reclaim useless objects in memory. There are two ways to judge whether an object is garbage:

Reference counting (not used by the JVM): count the number of references to an object; a count of 0 means the object is garbage.

The reachability algorithm (GC Roots tracing): traverse downward from the GC Roots through all referenced objects (each GC Root forms a tree). Every object reached this way must survive; all other objects can be reclaimed. GC Roots include objects referenced from the virtual machine stack (local variables in stack frames), objects referenced by non-primitive static variables (an address) in the method area, and objects referenced by JNI (commonly known as native methods) in the native method stack.
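As a hedged illustration (the class and variable names are invented for this example), the Java snippet below shows two of the GC Root kinds mentioned above, a static field in the method area and a local variable in a stack frame; JNI roots are only noted in a comment because they require native code:

public class GcRootsExample {
    // Static field in the method area: STATIC_REF acts as a GC Root,
    // so the Object it points to stays reachable as long as the class is loaded.
    private static Object STATIC_REF = new Object();

    public static void main(String[] args) {
        // Local variable in a stack frame: localRef is a GC Root while main() is on the stack.
        Object localRef = new Object();

        Object temp = new Object();
        temp = null; // no GC Root reaches the first object any more, so it can be reclaimed

        System.out.println(localRef);
        System.out.println(STATIC_REF);
        // The third kind of root, objects referenced from JNI/native methods, is not shown here.
    }
}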

Memory-related parameters in JVM

-Xms: initial Java heap size

-Xmx: maximum Java heap size

-Xmn: size of the young generation within the Java heap; the old generation is whatever remains after deducting it

-XX:PermSize (from JDK 1.8 on: -XX:MetaspaceSize): initial size of the permanent generation

-XX:MaxPermSize (from JDK 1.8 on: -XX:MaxMetaspaceSize): maximum size of the permanent generation

-Xss: stack size per thread

Note: normally Xms and Xmx, and -XX:PermSize and -XX:MaxPermSize, are set to the same values.

-XX:MaxTenuringThreshold: the age at which an object is promoted to the old generation; the default is 15

-XX:PretenureSizeThreshold: objects larger than this many bytes go directly into the old generation

-XX:HandlePromotionFailure: during a MinorGC, if the old generation's remaining space is smaller than the total size of the young-generation objects but larger than the average size previously promoted to the old generation, whether to still attempt the MinorGC (enabled by default)

-XX:SurvivorRatio=8: proportion of the Eden region (Eden : Survivor : Survivor = 8 : 1 : 1)

Do not delve too deeply into any parameter above that you do not yet understand; come back to it later. The parameters are simply listed here in one place to make them easy to look up.
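As a minimal sketch, assuming a hypothetical main class named HeapSettings launched with flags such as -Xms3g -Xmx3g -Xmn1536m -Xss1m -XX:SurvivorRatio=8, the program below simply prints what the JVM actually reserved, which is a convenient way to check the parameters listed above:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;

public class HeapSettings {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // -Xmx shows up as maxMemory(); -Xms is the heap committed at startup (totalMemory()).
        System.out.println("max heap (~ -Xmx): " + rt.maxMemory() / (1024 * 1024) + " MB");
        System.out.println("committed heap (~ -Xms at startup): " + rt.totalMemory() / (1024 * 1024) + " MB");

        // Each memory pool (Eden, Survivor, Old Gen, Metaspace) is exposed as an MXBean;
        // pool names vary by collector, e.g. "PS Eden Space" or "G1 Eden Space".
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            System.out.println(pool.getName() + ": " + pool.getUsage());
        }
    }
}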

Memory Generation Model in JVM

In the JVM, objects in memory are divided into three generations:

Young generation: objects that are reclaimed quickly. It lives in the heap and is divided into 1 Eden region and 2 Survivor regions.

Old generation: long-lived objects. It also lives in the heap.

Permanent generation: refers to the method area (which stores class metadata). Its collection conditions are harsh; all of the following must be met: all instance objects of the class have been reclaimed from heap memory, the class's ClassLoader has been reclaimed, and the class's Class object is not referenced anywhere.

Why divide memory into generations? Because each generation has its own garbage collection algorithm and memory allocation mechanism. If all objects were kept together, first, deciding what to reclaim would require frequent, expensive traversals; second, there would be the cost of copying and moving objects. Copying and moving are needed because reclaiming memory inevitably produces memory fragments, and fragmentation wastes space, so memory must be compacted by copying and moving objects to keep the free memory contiguous.

Specific memory allocation model in JVM

The young generation is laid out this way (one Eden region plus two Survivor regions) because of the specific collection algorithm it uses, described below.

How objects flow between the generations: the young generation

Most objects are allocated in the young generation's Eden region when they are created. Whenever the young generation runs out of space, a MinorGC is triggered (it reclaims only young-generation memory). MinorGC uses the copying algorithm: when the JVM triggers the first MinorGC, the surviving objects in the Eden region are copied into one of the Survivor regions, and the Eden region's objects are then deleted. When the next MinorGC is triggered, the surviving objects in that Survivor region and in Eden are copied into the other Survivor region, and everything in the first two regions is deleted, and so on. The copying algorithm (like the old generation's mark-compact algorithm) is used to avoid memory fragmentation: if object memory is not contiguous, a lot of space is wasted.
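A rough Java-style sketch of that copying step, purely illustrative (the real collector works on raw heap memory and reachability, not on Java lists, and the class and method names here are made up):

import java.util.ArrayList;
import java.util.List;

class CopyingGcSketch {
    List<Object> eden = new ArrayList<>();
    List<Object> fromSurvivor = new ArrayList<>();
    List<Object> toSurvivor = new ArrayList<>();

    // Simplified MinorGC: copy the live objects from Eden and the "from" Survivor
    // into the "to" Survivor, then clear the source regions and swap the Survivors.
    void minorGc(List<Object> liveObjects) {
        for (Object o : eden) {
            if (liveObjects.contains(o)) {
                toSurvivor.add(o);           // survivors are copied, so they end up contiguous
            }
        }
        for (Object o : fromSurvivor) {
            if (liveObjects.contains(o)) {
                toSurvivor.add(o);
            }
        }
        eden.clear();                        // everything left behind is garbage
        fromSurvivor.clear();

        List<Object> tmp = fromSurvivor;     // the roles of the two Survivor regions swap
        fromSurvivor = toSurvivor;
        toSurvivor = tmp;
    }
}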

Old generation

The objects in the old generation are all transferred there from the young generation according to certain rules. There are several specific ways this happens (a sketch of the promotion decision follows this list):

Exceeding the specified age (configured with -XX:MaxTenuringThreshold, default 15). Here "age" means the number of garbage collections the object has survived; each collection survived adds one year. Such objects are moved to the old generation.

Large objects enter directly: objects larger than the number of bytes set by -XX:PretenureSizeThreshold go straight into the old generation, because the larger the object, the greater the copying overhead.

Entry by the dynamic age rule: an object does not necessarily have to reach the threshold age of 15 to be promoted. If the objects at or above a certain age together reach a certain size, they enter the old generation early. When the total size of objects that have survived GC exceeds 50% of the Survivor region, that is, age 1 + age 2 + ... is summed until some age n is found where the running total exceeds 50% of the Survivor space, then all objects of age n and above go directly into the old generation.

When a MinorGC occurs and the Survivor region cannot hold the surviving objects, they are all moved to the old generation. This involves the old-generation allocation guarantee rule: each time a MinorGC is about to occur, the JVM checks whether the old generation's available memory is larger than the total size of the young generation's objects. If it is larger, the MinorGC is performed directly. If it is smaller, it depends on whether -XX:HandlePromotionFailure is enabled (it is by default): if enabled, the JVM estimates the memory of the objects that will be promoted this time (using the average size promoted by previous MinorGCs); if the old generation's free space is larger than that estimate, the MinorGC is still performed. If the promoted objects exceed the old generation's available space, a FullGC is performed, and if even the FullGC cannot free enough space, an OOM error is thrown. FullGC uses the mark-compact algorithm: it moves the live objects so that their memory is contiguous and then deletes the objects that need to be reclaimed. Mark-compact is used because the survival rate of old-generation objects is high, so the copying algorithm would not be cost-effective.
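A minimal sketch of that allocation-guarantee decision, with all names invented and the sizes taken from hypothetical runtime bookkeeping:

class PromotionGuaranteeSketch {
    // Decide what happens before a MinorGC, following the rules described above.
    // All sizes are in bytes.
    static String beforeMinorGc(long oldGenFree,
                                long youngGenUsed,
                                long avgPromotedByPreviousMinorGcs,
                                boolean handlePromotionFailure) {
        if (oldGenFree >= youngGenUsed) {
            return "MinorGC";                 // worst case (everything survives) still fits
        }
        if (handlePromotionFailure && oldGenFree >= avgPromotedByPreviousMinorGcs) {
            return "MinorGC (risky)";         // bet that this MinorGC promotes about the average amount
        }
        return "FullGC first";                // otherwise clean the old generation before proceeding
    }
}

Called with, say, 300 MB of old-generation free space, 1200 MB of young-generation objects and a 200 MB promotion average, it returns "MinorGC (risky)", which is roughly the situation in the production example later in this article.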

Permanent generation

The permanent generation stores metadata. When a class is loaded, its metadata is written into the permanent generation, and permanent-generation data is reclaimed during a FullGC. The collection conditions are: all instance objects of the class have been reclaimed from heap memory, the class's ClassLoader has been reclaimed, and the class's Class object is not referenced anywhere.


An example of JVM optimization

Consider a computing system that processes more than 100 million records per day, continuously pulling data from MySQL and other data middleware for calculation. It runs 500 data-extraction and computing tasks per minute; each task takes about 10 seconds and processes 10,000 records (20 fields per record). The system is deployed as a cluster of 5 machines, so one machine handles 100 tasks per minute. Each machine has 4 cores and 8 GB of RAM, of which 4 GB goes to the JVM: 3 GB of heap memory, split into a 1.5 GB young generation and a 1.5 GB old generation.

Let's first estimate the memory footprint:

Each record has 20 fields, so one record can be estimated at roughly 1 KB.

Each task computes 10,000 records, so a task occupies about 1 KB x 10,000 = 10 MB of data. One machine processes 100 tasks per minute, so the temporary objects amount to roughly 1 GB per minute; the Eden region is basically full after a little more than a minute.

What does this look like in the actual production environment?

The first GC happens roughly a minute in. The memory situation at that point:

Since each task runs for about 10 seconds, roughly 1/6 of the data should still be alive, which is about 200 MB. But 200 MB cannot fit into a Survivor region, so the JVM tries to move it into the old generation; the old generation's free space is still larger than the roughly 1.2 GB of young-generation objects, so the 200 MB is promoted directly.

Each MinorGC therefore sends about 200 MB into the old generation. By the third MinorGC:

the old generation's available capacity is already smaller than the young generation's total memory. By default the JVM then checks whether the old generation's remaining space is larger than the average amount promoted to the old generation by previous MinorGCs; here it still is, so the MinorGCs continue.

By about the eighth MinorGC:

the old generation can no longer take the promoted objects, so a FullGC is triggered to clean up the old generation, and all the objects in the old generation are cleaned up.

Then the cycle starts over from the first MinorGC: a FullGC occurs every eight MinorGCs, that is, roughly every eight minutes.
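The rough arithmetic behind this cycle (the 150 MB Survivor size assumes the default -XX:SurvivorRatio=8):

young generation: 1.5 GB, so Eden is about 1.2 GB and each Survivor region about 150 MB
allocation rate: 1 KB x 10,000 records x 100 tasks, about 1 GB per minute, so Eden fills in a little over a minute
live data at each MinorGC: about 1/6 of that, roughly 200 MB, which does not fit in a 150 MB Survivor and is promoted
old generation: 1.5 GB / 200 MB per MinorGC gives about 7-8 MinorGCs before it fills, hence a FullGC roughly every eight minutes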

Optimization strategy

Re-adjust the ratio of the young generation to the old generation: expand the young generation to 2 GB and shrink the old generation to 1 GB.

Now each Survivor region is about 200 MB, so the objects surviving each MinorGC can be held there and no longer need to be moved to the old generation (some promotion still happens, of course; the change simply avoids the forced promotions caused by Survivor regions that are too small). The core of this JVM optimization is to reduce the number of FullGCs: a FullGC has to scan the old generation and the permanent generation, the permanent generation's marking is somewhat complex, and the old generation contains many objects, so a FullGC is far less efficient than a MinorGC, typically taking more than 10 times as long.
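As a hedged sketch of what the adjusted launch command might look like (the jar name is invented, and the exact set of flags depends on the JDK version and collector; this is not the article's verbatim configuration):

java -Xms3g -Xmx3g -Xmn2g -XX:SurvivorRatio=8 -XX:MaxTenuringThreshold=15 -jar compute-task.jar

With -Xmn2g and a 3 GB heap, the old generation is 1 GB and each Survivor region is about 200 MB, matching the sizes discussed above.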

Thank you for reading. That concludes "Garbage Collection Criteria and Memory-Related Parameters in the JVM". After studying this article you should have a deeper understanding of how the JVM decides what to collect and of its memory-related parameters; putting this to use still requires your own practice and verification. The editor will keep publishing more articles on related topics; you are welcome to follow.
