This article introduces Java just-in-time (JIT) compilation: what it is, how the HotSpot compilers are organized, what triggers compilation, and the main optimizations the JIT applies.
Brief introduction
Once JVM initialization is complete and classes are being called and executed, the execution engine converts bytecode into machine code so that it can run on the operating system. During this conversion from bytecode to machine code, another compilation step takes place inside the virtual machine: just-in-time compilation.
Initially, bytecode in the JVM is executed by the interpreter (Interpreter). When the virtual machine finds that a method or block of code runs very frequently, it marks that code as hot code.
To improve the execution efficiency of hot code, the just-in-time compiler (JIT, Just-In-Time) compiles it at run time into machine code for the local platform, applies optimizations at various levels, and caches the result in memory.
Classification
The HotSpot virtual machine ships with two JIT compilers, the C1 compiler and the C2 compiler, whose compilation processes differ.
C1 compiler
The C1 compiler is a simple, fast compiler that focuses mainly on local optimizations. It suits programs with short execution times or strict startup-performance requirements, and is also known as the Client Compiler. GUI applications, for example, have certain requirements on how quickly the interface starts up.
C2 compiler
The C2 compiler optimizes for the performance of long-running server-side applications. It suits programs that run for a long time or that need peak performance, and is also known as the Server Compiler. Long-running Java applications on the server, for example, have certain requirements on stable long-term performance.
Tiered compilation
Before Java 7, you had to choose the appropriate JIT compiler based on the characteristics of the program, and by default the virtual machine used the interpreter together with one of the two compilers.
Java 7 introduced tiered compilation, which combines C1's startup-performance advantage with C2's peak-performance advantage. We can also force the JIT mode of the virtual machine with the -client or -server parameter.
Tiered compilation divides the execution state of the JVM into five levels:
> Layer 0: the program is interpreted. Profiling is enabled by default; if it is not enabled, layer 2 compilation can be triggered.
> Layer 1: simple C1 compilation. Bytecode is compiled into native code with simple, reliable optimizations; profiling is not enabled.
> Layer 2: limited C1 compilation. Profiling is enabled, but only method call counts and loop back-edge counts are collected.
> Layer 3: full C1 compilation. C1 compilation with all profiling enabled.
> Layer 4: C2 compilation. Bytecode is also compiled into native code, but with optimizations that take longer to compile, and even some unreliable, aggressive optimizations based on the profiling information.
Among the three C1 states, execution efficiency from high to low is: layer 1, layer 2, layer 3.
In general, C2 is more than 30% more efficient than C1.
In Java 8, tiered compilation is enabled by default, and the -client and -server settings no longer take effect. If you only want to use C2, you can turn off tiered compilation (-XX:-TieredCompilation); if you only want to use C1, you can keep tiered compilation on and add the parameter -XX:TieredStopAtLevel=1.
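If you want to check these settings from inside a running program rather than from the command line, HotSpot exposes its VM flags through com.sun.management.HotSpotDiagnosticMXBean. A minimal sketch, assuming an Oracle/OpenJDK HotSpot VM (the class name TieredCheck is just illustrative):

import com.sun.management.HotSpotDiagnosticMXBean;
import java.lang.management.ManagementFactory;

public class TieredCheck {
    public static void main(String[] args) {
        // Obtain the HotSpot-specific diagnostic bean (HotSpot VMs only).
        HotSpotDiagnosticMXBean bean =
                ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);

        // Print the current values of the flags discussed above.
        System.out.println(bean.getVMOption("TieredCompilation"));
        System.out.println(bean.getVMOption("TieredStopAtLevel"));
    }
}

Running this with -XX:-TieredCompilation should report the TieredCompilation flag as false, matching the behaviour described above.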
You can check the compilation mode used by the current system directly with the java -version command:
C:\Users\Administrator>java -version
java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)
Mixed mode is the default mixed compilation mode. Besides this mode, we can use the -Xint parameter to force the virtual machine to run in interpreter-only mode, where the JIT is not involved at all; we can also use the -Xcomp parameter to force the virtual machine to run in JIT-only (compiled) mode. For example:
C:\Users\Administrator>java -Xint -version
java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, interpreted mode)

C:\Users\Administrator>java -Xcomp -version
java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, compiled mode)

Trigger criteria
In the HotSpot virtual machine, hot spot detection is what triggers JIT compilation.
> HotSpot uses counter-based hot spot detection: the virtual machine sets up counters for each method to record how many times it is executed. If the count exceeds a certain threshold, the method is considered a "hot method".
The virtual machine prepares two kinds of counters for each method: the method call counter (Invocation Counter) and the back-edge counter (Back Edge Counter). For a given set of virtual machine run-time parameters, both counters have fixed thresholds, and JIT compilation is triggered when a counter exceeds its threshold.
Method call counter
The method call counter counts the number of times a method is invoked. The default threshold is 1500 in C1 mode and 10000 in C2 mode, and it can be set with -XX:CompileThreshold. When tiered compilation is enabled, the threshold specified by -XX:CompileThreshold is ignored and is instead adjusted dynamically based on the number of methods waiting to be compiled and the number of compiler threads. JIT compilation is triggered when the sum of the method call counter and the back-edge counter exceeds the method call counter's threshold.
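As a rough illustration of the method call counter, the hypothetical hotMethod below is invoked far more often than the default thresholds mentioned above, so running the program with -XX:+PrintCompilation should show it being compiled; exact counts and log output depend on the JVM version and settings:

public class InvocationCounterDemo {
    // A tiny method that becomes "hot" after enough calls.
    static int hotMethod(int x) {
        return x * 2 + 1;
    }

    public static void main(String[] args) {
        long sum = 0;
        // Call the method well past the default thresholds (1500 for C1, 10000 for C2).
        for (int i = 0; i < 200_000; i++) {
            sum += hotMethod(i);
        }
        System.out.println(sum);
    }
}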
Back-edge counter
The back-edge counter counts how many times the loop code in a method is executed. A bytecode instruction that jumps backwards in the control flow is called a "back edge". This value is used to decide whether compilation is triggered for the loop. When tiered compilation is not enabled, the threshold defaults to 13995 for C1 and 10700 for C2, and it can be adjusted with -XX:OnStackReplacePercentage=N. When tiered compilation is enabled, the threshold specified by -XX:OnStackReplacePercentage is likewise ignored and is adjusted dynamically based on the number of methods waiting to be compiled and the number of compiler threads.
The main purpose of the back-edge counter is to trigger OSR (On-Stack Replacement) compilation. In code with long-running loops, when the loop count reaches the back-edge counter's threshold, the JVM considers the code hot, and the JIT compiler compiles it into machine code and caches it. While the loop is still running, the executing code is replaced directly, so subsequent iterations run the cached machine code.
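A minimal sketch of the OSR scenario just described: main is entered only once, but its loop body runs millions of times, so it is the back-edge counter rather than the method call counter that triggers compilation. Run with -XX:+PrintCompilation and the OSR compilation should appear in the log (typically marked with a % in HotSpot's output); the class name and iteration count are illustrative:

public class OsrDemo {
    public static void main(String[] args) {
        long sum = 0;
        // The loop body executes millions of times within a single invocation of main,
        // so the back-edge counter (not the invocation counter) triggers compilation.
        for (int i = 0; i < 5_000_000; i++) {
            sum += i % 7;
        }
        System.out.println(sum);
    }
}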
Optimization techniques
JIT compilation applies a number of classic compiler optimizations to the code; through routine analysis and checks it can produce the best-performing code at run time. The two main techniques are method inlining and escape analysis.
Method inlining
Calling a method normally involves pushing and popping a stack frame: the call transfers control to the memory address where the method's code is stored, and after the method body has executed, control returns to the point just before the call.
This kind of execution requires saving the call site before the call and restoring it afterwards so that execution can continue at the saved address. Method calls therefore incur some time and space overhead (which can be thought of as a stripped-down version of a context switch).
For methods whose bodies are small but which are called frequently, this time and space overhead becomes significant.
The optimization behavior of method inlining is to copy the code of the target method into the method that initiates the call to avoid the actual method call.
The JVM automatically identifies hot methods and optimizes them with method inlining. We can set the hot-method threshold with -XX:CompileThreshold. It is important to note, however, that a hot method will not necessarily be inlined: if the method body is too large, the JVM will not inline it. The size threshold for the method body can also be adjusted with parameters:
Frequently executed methods are inlined by default if the method body is smaller than 325 bytes; this size can be set with -XX:FreqInlineSize=N.
Methods that are not frequently executed are inlined by default only if the method body is smaller than 35 bytes; this size can be adjusted with -XX:MaxInlineSize=N.
We can then configure JVM parameters to observe how methods are inlined:
-XX:+PrintCompilation            // print compilation information to the console
-XX:+UnlockDiagnosticVMOptions   // unlock diagnostic JVM options (off by default); required by some diagnostic flags
-XX:+PrintInlining               // print inlined methods
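Putting those flags together, here is a minimal sketch you could run with -XX:+PrintCompilation -XX:+UnlockDiagnosticVMOptions -XX:+PrintInlining to watch a small, frequently called method get inlined into its caller (class and method names are illustrative):

public class InlineDemo {
    // Small, frequently called method: a good inlining candidate (well under the 35-byte default).
    private static int add(int a, int b) {
        return a + b;
    }

    public static void main(String[] args) {
        int sum = 0;
        for (int i = 0; i < 1_000_000; i++) {
            sum = add(sum, i) % 1_000_003;   // keep the value bounded to avoid overflow
        }
        System.out.println(sum);
    }
}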
Optimizing hot methods can effectively improve system performance. Generally speaking, we can improve method inlining in the following ways:
Lower the hot-method threshold or raise the method-body size threshold via JVM parameters so that more methods can be inlined; this, however, means higher memory consumption.
In programming, avoid writing a lot of code in one method and get used to using small methods.
Prefer to declare methods final, private, or static where possible; virtual methods that can be overridden require additional type checks because of inheritance.
> This ties back to the earlier point: the less a method contains, the easier it is to inline when the method is executed frequently, which improves performance.
Escape analysis
Escape analysis (Escape Analysis) is a technique for determining whether an object is referenced by an external method or accessed by an external thread. The compiler optimizes the code based on the result of the analysis.
You can set it through the JVM parameter:
-XX:+DoEscapeAnalysis    // enable escape analysis (on by default in JDK 1.8)
-XX:-DoEscapeAnalysis    // disable escape analysis
There are three main optimization methods: stack allocation, lock elimination and scalar replacement.
Stack allocation
By default, creating an object in Java allocates memory on the heap, and when a heap object is no longer used it has to be reclaimed by the garbage collector, which costs more time and performance than creating and destroying an object allocated on the stack.
If escape analysis finds that an object is only used inside one method, the object can be allocated on the stack instead.
However, because of the complexity this would add to HotSpot's current implementation, this optimization has effectively not been implemented in HotSpot, so you may not be able to observe it. (The material I have read indicates that it is still not implemented as of Java 8; if you discover otherwise, please leave a comment.)
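For experimentation, here is a minimal hypothetical sketch: the Point objects created in allocate never escape the method, so they are candidates for escape-analysis optimizations. Comparing a run with default settings against one with -XX:-DoEscapeAnalysis may show a noticeable difference in run time and GC activity, although, per the caveat above, any benefit in HotSpot comes from scalar replacement and lock elimination rather than true stack allocation:

public class EscapeDemo {
    // Hypothetical value class; its instances never leave allocate().
    static class Point {
        int x;
        int y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    // The Point created here is only read inside this method, so it does not escape.
    private static int allocate(int i) {
        Point p = new Point(i, i + 1);
        return p.x + p.y;
    }

    public static void main(String[] args) {
        long sum = 0;
        for (int i = 0; i < 50_000_000; i++) {
            sum += allocate(i);
        }
        System.out.println(sum);
    }
}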
Lock elimination
In a single-threaded environment there is no need to use thread-safe containers, but even if you do, since there can be no thread contention, JIT compilation will eliminate the locks on the object's synchronized methods. For example:
public static String getString(String s1, String s2) {
    StringBuffer sb = new StringBuffer();
    sb.append(s1);
    sb.append(s2);
    return sb.toString();
}
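The same reasoning applies to an explicit synchronized block on a lock object that never escapes the method; a minimal hypothetical sketch:

public static String concat(String s1, String s2) {
    Object lock = new Object();   // the lock object is local and never escapes this method
    synchronized (lock) {         // with escape analysis, the JIT can prove no contention is possible and elide the lock
        return s1 + s2;
    }
}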
You can set it through the JVM parameter:
-XX:+EliminateLocks    // enable lock elimination (on by default in JDK 1.8)
-XX:-EliminateLocks    // disable lock elimination

Scalar replacement
If escape analysis proves that an object will not be accessed outside a method, and the object can be decomposed, then when the program actually executes, the object itself may never be created; only its member variables are created instead. Once the object is decomposed, its member variables can be allocated on the stack or in registers, so no memory needs to be allocated for the original object. This compiler optimization is called scalar replacement.
For example:
public void foo() {
    TestInfo info = new TestInfo();
    info.id = 1;
    info.count = 99;
    // to do something
}
After escape analysis, the code is optimized to:
public void foo() {
    id = 1;
    count = 99;
    // to do something
}
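The TestInfo class itself is not shown in the article; for the example to work it only needs two plain int fields, along the lines of this hypothetical sketch:

class TestInfo {
    int id;      // after scalar replacement these fields become local scalars
    int count;
}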
You can set it through the JVM parameter:
-XX:+EliminateAllocations    // enable scalar replacement (on by default in JDK 1.8)
-XX:-EliminateAllocations    // disable scalar replacement

At this point, the study of "how to understand Java just-in-time compilation" is over. I hope it has resolved your doubts; combining theory with practice is the best way to learn, so go and try it out.