
Methods and steps for using the Java thread pool


This article introduces the methods and steps for using the Java thread pool: why thread pools exist, how ThreadPoolExecutor is configured, how tasks flow through a pool, the ready-made pools provided by Executors, and why a custom pool is generally preferred.

1. Why use thread pools

1. Creating and destroying individual threads frequently wastes resources and triggers frequent GC.

2. Without unified management, threads compete with each other for resources.

2.ThreadPoolExecutor

ThreadPoolExecutor has four overloaded constructors. Let's look at the one with the most parameters, since it covers the meaning of the parameters used by all the others:

public ThreadPoolExecutor(int corePoolSize,
                          int maximumPoolSize,
                          long keepAliveTime,
                          TimeUnit unit,
                          BlockingQueue<Runnable> workQueue,
                          ThreadFactory threadFactory,
                          RejectedExecutionHandler handler)

The seven parameters are described in detail below (in everyday development the five-parameter constructor is used more often). A short sketch that puts them all together follows the list.

corePoolSize: the number of core threads in the thread pool.

maximumPoolSize: the maximum number of threads allowed in the thread pool.

keepAliveTime: the idle timeout for non-core threads. A non-core thread whose idle time exceeds keepAliveTime is reclaimed. If the pool's allowCoreThreadTimeOut property is set to true, this timeout also applies to core threads.

unit: the time unit of keepAliveTime, such as nanoseconds, microseconds, milliseconds, seconds, minutes, hours, or days.

workQueue: the task queue of the thread pool, used mainly to hold tasks that have been submitted through ThreadPoolExecutor's execute method but not yet executed.

threadFactory: the factory used to create new threads for the pool; the default factory is generally sufficient.

handler: the rejection policy, invoked when the pool cannot accept a new task (typically because the maximum number of threads has been reached and the queue is full, or the pool has been shut down). By default a RejectedExecutionException is thrown.
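As a rough illustration (the pool sizes, timeout, and queue capacity below are arbitrary example values, not taken from this article), the seven parameters might be wired together like this:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class PoolConfigSketch {
    public static void main(String[] args) {
        // 2 core threads, at most 4 threads, idle non-core threads reclaimed
        // after 30 seconds, and a bounded queue holding up to 100 waiting tasks
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                                     // corePoolSize
                4,                                     // maximumPoolSize
                30L, TimeUnit.SECONDS,                 // keepAliveTime + unit
                new ArrayBlockingQueue<>(100),         // workQueue
                Executors.defaultThreadFactory(),      // threadFactory
                new ThreadPoolExecutor.AbortPolicy()); // handler (the default)

        pool.execute(() ->
                System.out.println("running on " + Thread.currentThread().getName()));
        pool.shutdown();
    }
}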

WorkQueue introduction

1.ArrayBlockingQueue: a bounded BlockingQueue. Its constructor takes an int that sets the queue's capacity, and elements are stored and retrieved in FIFO (first-in, first-out) order.

2.LinkedBlockingQueue: a BlockingQueue whose capacity is optional. You may pass an int capacity to its constructor; if you don't, the capacity defaults to Integer.MAX_VALUE, which makes the queue effectively unbounded.

3.PriorityBlockingQueue: similar to LinkedBlockingQueue, except that elements are not taken in FIFO order; the order is determined by the elements' priority, so the elements must implement Comparable or the queue must be given a Comparator.

4.SynchronousQueue: a synchronous, thread-safe BlockingQueue with no internal storage. A producer's insert operation must wait for a consumer's remove operation, so you cannot read or traverse its contents: an element exists only at the moment someone tries to take it. Think of producers and consumers waiting for each other and leaving together.
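A small sketch, assuming arbitrary capacities, of how each queue type can be created:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.PriorityBlockingQueue;
import java.util.concurrent.SynchronousQueue;

public class QueueSketch {
    public static void main(String[] args) {
        // bounded, strictly FIFO
        BlockingQueue<Runnable> array = new ArrayBlockingQueue<>(10);

        // capacity is optional; with no argument it defaults to Integer.MAX_VALUE
        BlockingQueue<Runnable> bounded = new LinkedBlockingQueue<>(10);
        BlockingQueue<Runnable> unbounded = new LinkedBlockingQueue<>();

        // elements come out in priority order (natural order or a supplied Comparator)
        BlockingQueue<String> priority = new PriorityBlockingQueue<>();

        // zero capacity: every put() waits for a matching take()
        BlockingQueue<Runnable> handoff = new SynchronousQueue<>();

        System.out.println(array.remainingCapacity());     // 10
        System.out.println(unbounded.remainingCapacity()); // Integer.MAX_VALUE
        System.out.println(handoff.remainingCapacity());   // 0
    }
}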

Rejection strategy

AbortPolicy: rejects the task and throws a RejectedExecutionException; this is the default policy.

CallerRunsPolicy: runs the task directly on the thread that called the execute method.

DiscardOldestPolicy: discards the oldest unhandled task in the queue, then retries submitting the new task.

DiscardPolicy: silently discards the new task without throwing an exception.
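As a hypothetical sketch of choosing a policy, the tiny pool below (one thread, a one-slot queue) overflows quickly, and CallerRunsPolicy makes the submitting thread run the overflow tasks itself:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class RejectionSketch {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 1, 0L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<>(1),
                new ThreadPoolExecutor.CallerRunsPolicy());

        for (int i = 0; i < 5; i++) {
            final int id = i;
            pool.execute(() ->
                    System.out.println("task " + id + " on " + Thread.currentThread().getName()));
        }
        pool.shutdown();
    }
}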

3. Execution process

When the number of threads is below corePoolSize, a new core thread is created to execute the task.

When all core threads are busy, the task is placed in the blocking queue.

When the queue is full and the maximum number of threads has not been reached, a new non-core thread is created to execute the task (important).

When the queue is full and the maximum number of threads has been reached, the rejection policy is applied.
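To see this flow in action, here is a small sketch with arbitrary sizes (2 core threads, a queue of 2, a maximum of 4):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class FlowSketch {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 4, 60L, TimeUnit.SECONDS, new ArrayBlockingQueue<>(2));

        // tasks 1-2 start core threads, 3-4 wait in the queue,
        // 5-6 start non-core threads, and a 7th submission would be rejected
        for (int i = 1; i <= 6; i++) {
            pool.execute(() -> {
                try { Thread.sleep(1000); } catch (InterruptedException ignored) { }
            });
            System.out.println("threads = " + pool.getPoolSize()
                    + ", queued = " + pool.getQueue().size());
        }
        pool.shutdown();
    }
}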

4. Other thread pools

1.FixedThreadPool

A fixed-size thread pool. You specify the size, corePoolSize equals maximumPoolSize, and the blocking queue is a LinkedBlockingQueue with its default capacity of Integer.MAX_VALUE.

The number of threads in this pool stays constant. When a new task is submitted, an idle thread executes it immediately; if no thread is idle, the task is held in the blocking queue. The thread count never changes.

Because the unbounded LinkedBlockingQueue stores the pending tasks, the queue grows rapidly when tasks are submitted frequently, which can exhaust system resources.

Even when the pool is idle, that is, when there are no runnable tasks, the worker threads are not released; they keep occupying system resources until the pool is shut down. A sketch of the equivalent construction follows.
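This mirrors the JDK's documented behavior of newFixedThreadPool; the pool size of 4 is an arbitrary example:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class FixedPoolSketch {
    public static void main(String[] args) {
        ExecutorService viaFactory = Executors.newFixedThreadPool(4);

        // roughly what the factory method creates: equal core and maximum
        // sizes over an unbounded LinkedBlockingQueue
        ExecutorService equivalent = new ThreadPoolExecutor(
                4, 4, 0L, TimeUnit.MILLISECONDS,
                new LinkedBlockingQueue<Runnable>());

        viaFactory.shutdown();
        equivalent.shutdown();
    }
}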

2.SingleThreadExecutor

Its blocking queue is also a LinkedBlockingQueue with the default capacity of Integer.MAX_VALUE, so if a large number of requests arrive they all pile up in the task queue, which can lead to OOM. The equivalent construction is sketched below.
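Roughly what newSingleThreadExecutor builds (the non-reconfigurable wrapper the JDK adds around it is omitted here):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class SingleThreadSketch {
    public static void main(String[] args) {
        ExecutorService viaFactory = Executors.newSingleThreadExecutor();

        // roughly the underlying pool: one core thread, one maximum thread,
        // and an unbounded LinkedBlockingQueue
        ExecutorService equivalent = new ThreadPoolExecutor(
                1, 1, 0L, TimeUnit.MILLISECONDS,
                new LinkedBlockingQueue<Runnable>());

        viaFactory.shutdown();
        equivalent.shutdown();
    }
}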

3.Executors.newCachedThreadPool()

A cacheable thread pool. When a task arrives, the pool first checks whether a previously created thread is available; if so, it is reused, otherwise a new thread is created and added to the pool.

It is usually used for many short-lived asynchronous tasks. The pool is effectively unbounded, and when a new task arrives after a previous one has finished, the thread that ran the previous task is reused rather than a new thread being created every time.

Cached threads live for 60 seconds by default: corePoolSize is 0, maximumPoolSize is Integer.MAX_VALUE, and the blocking queue is a SynchronousQueue.

SynchronousQueue is a direct-handoff queue, so whenever no idle thread is available the pool adds a new thread to execute the new task.

When there is nothing to execute and a thread's idle time exceeds keepAliveTime (60 seconds), the worker thread is terminated and reclaimed; when a new task is then submitted and no idle thread exists, creating a new thread incurs some system overhead.

If a large number of tasks is submitted at once and the tasks are not particularly fast, the pool creates a matching number of new threads, which can quickly exhaust system resources. A sketch of the equivalent construction follows.
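Roughly mirroring the JDK's documented behavior of newCachedThreadPool:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class CachedPoolSketch {
    public static void main(String[] args) {
        ExecutorService viaFactory = Executors.newCachedThreadPool();

        // roughly what the factory method creates: no core threads, an
        // effectively unlimited maximum, a 60-second keep-alive, and a
        // SynchronousQueue hand-off
        ExecutorService equivalent = new ThreadPoolExecutor(
                0, Integer.MAX_VALUE, 60L, TimeUnit.SECONDS,
                new SynchronousQueue<Runnable>());

        viaFactory.shutdown();
        equivalent.shutdown();
    }
}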

4.ScheduledThreadPool

Creates a fixed-size thread pool that supports scheduled and periodic task execution.

A timed thread pool, typically used to run tasks periodically, such as periodic data synchronization.

scheduleAtFixedRate: runs a task at a fixed rate; the period is measured from the start of one execution to the start of the next.

scheduleWithFixedDelay: runs a task with a fixed delay; the delay is measured from the end of one execution to the start of the next. A short usage sketch follows.
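The 2-thread pool size and 5-second intervals below are arbitrary example values:

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ScheduleSketch {
    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(2);

        Runnable sync = () -> System.out.println("sync at " + System.currentTimeMillis());

        // fixed rate: the 5-second period is measured start-to-start
        scheduler.scheduleAtFixedRate(sync, 0, 5, TimeUnit.SECONDS);

        // fixed delay: 5 seconds are counted from the end of one run
        // to the start of the next
        scheduler.scheduleWithFixedDelay(sync, 0, 5, TimeUnit.SECONDS);
    }
}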

5. Why Ali recommends a custom thread pool

From the analysis above, both newFixedThreadPool and newSingleThreadExecutor use a LinkedBlockingQueue as their task queue, and its default capacity is Integer.MAX_VALUE; the maximum pool size used by newCachedThreadPool is also Integer.MAX_VALUE.

This is why Ali's guidelines forbid creating thread pools through Executors: the request queues of FixedThreadPool and SingleThreadPool have a capacity of Integer.MAX_VALUE, so a large number of requests can pile up and cause OOM.

CachedThreadPool allows up to Integer.MAX_VALUE threads to be created, so it may spawn a huge number of threads and likewise cause OOM.

6. Other

1. shutdown() closes the thread pool without affecting tasks that have already been submitted.

2. shutdownNow() closes the thread pool and attempts to interrupt the threads that are currently executing.

3. allowCoreThreadTimeOut(boolean value) allows core threads to be reclaimed when they time out while idle.

4. Creating the thread pool as a singleton

import com.google.common.util.concurrent.ThreadFactoryBuilder;

import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

/**
 * Asynchronous task processor (eager singleton).
 */
public class AsyncTaskExecutor {

    /** number of threads kept alive in the pool */
    public static final int CORE_POOL_SIZE = 10;

    /** maximum number of threads in the pool */
    public static final int MAX_POOL_SIZE = 40;

    /** idle-thread keep-alive time */
    public static final int KEEP_ALIVE_TIME = 1000;

    /** capacity of the pool's waiting queue */
    public static final int BLOCKING_QUEUE_SIZE = 1000;

    /** eagerly created singleton instance (the "hungry" style discussed below) */
    private static final AsyncTaskExecutor INSTANCE = new AsyncTaskExecutor();

    /** asynchronous processing thread pool for business requests */
    private static final ThreadPoolExecutor processExecutor = new ThreadPoolExecutor(
            CORE_POOL_SIZE, MAX_POOL_SIZE, KEEP_ALIVE_TIME, TimeUnit.MICROSECONDS,
            new LinkedBlockingQueue<>(BLOCKING_QUEUE_SIZE),
            new ThreadFactoryBuilder().setNameFormat("boomoom-thread-pool-%d").build(),
            new ThreadPoolExecutor.DiscardPolicy());

    private AsyncTaskExecutor() {}

    /** global access point of the eager singleton */
    public static AsyncTaskExecutor getInstance() {
        return INSTANCE;
    }

    /**
     * Submit a task for asynchronous processing.
     *
     * @param task the task to run
     */
    public void execute(Runnable task) {
        processExecutor.submit(task);
    }
}

The difference between the lazy and eager ("hungry-style") singleton

1. The eager (hungry-style) singleton is thread-safe: a static instance is created when the class is loaded, is ready for the system to use immediately, and never changes afterwards.

2. For the lazy singleton to be thread-safe, you must use double-checked locking, and the instance field must be declared volatile to prevent instruction reordering during construction. A sketch of the lazy version follows.
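For contrast with the eager AsyncTaskExecutor above, a minimal lazy, double-checked-locking sketch might look like this (the class name is illustrative):

public class LazyTaskExecutor {

    // volatile prevents instruction reordering while the instance is being constructed
    private static volatile LazyTaskExecutor instance;

    private LazyTaskExecutor() { }

    public static LazyTaskExecutor getInstance() {
        if (instance == null) {                      // first check, without locking
            synchronized (LazyTaskExecutor.class) {
                if (instance == null) {              // second check, under the lock
                    instance = new LazyTaskExecutor();
                }
            }
        }
        return instance;
    }
}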

This concludes the study of the methods and steps for using the Java thread pool. Pairing the theory above with hands-on practice is the best way to make it stick, so go and try it.
