
Concurrency containers commonly used in Java


This article explains the concurrency containers commonly used in Java. The explanations are kept simple and clear, so they should be easy to follow and understand.

Introduction to concurrent containers

ConcurrentHashMap: concurrent version of HashMap

CopyOnWriteArrayList: concurrent version of ArrayList

CopyOnWriteArraySet: concurrent Set

ConcurrentLinkedQueue: concurrent queue (based on a linked list)

ConcurrentLinkedDeque: concurrent deque (based on a doubly linked list)

ConcurrentSkipListMap: concurrent Map based on a skip list

ConcurrentSkipListSet: concurrent Set based on a skip list

ArrayBlockingQueue: blocking queue (array-based)

LinkedBlockingQueue: blocking queue (based on a linked list)

LinkedBlockingDeque: blocking deque (based on a doubly linked list)

PriorityBlockingQueue: thread-safe priority queue

SynchronousQueue: synchronous hand-off queue (each insert waits for a matching take)

LinkedTransferQueue: data exchange queue based on linked list

DelayQueue: delay queue

1.ConcurrentHashMap concurrent version of HashMap

One of the most commonly used concurrent containers; it can serve as a cache in concurrent scenarios. The underlying structure is still a hash table, but the implementation changed significantly in Java 8. Since Java 7 and Java 8 are both widely used versions, the two implementations are often compared (for example, in interviews).

A big difference is that Java 7 uses segmented locking to reduce lock contention, while Java 8 abandons the segmented lock and adopts CAS (an optimistic locking technique). In addition, to prevent a severe hash collision from degrading into a long linked list (objects with the same hash value chained together), a bucket is converted into a red-black tree once the chain length reaches a threshold (8); compared with a linked list, a tree offers more stable query performance.
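For a feel of how it is typically used, here is a minimal sketch (my own example, with made-up names, not from the original article) of ConcurrentHashMap as a cache; computeIfAbsent computes and stores the value for a missing key atomically, so concurrent callers do not duplicate the work:

import java.util.concurrent.ConcurrentHashMap;

public class UserCacheDemo {
    // hypothetical cache mapping a user id to a display name
    private static final ConcurrentHashMap<Long, String> CACHE = new ConcurrentHashMap<>();

    static String displayName(long userId) {
        // the mapping function runs at most once per missing key,
        // even when many threads request the same key concurrently
        return CACHE.computeIfAbsent(userId, id -> "user-" + id); // stand-in for a real lookup
    }

    public static void main(String[] args) {
        System.out.println(displayName(42L)); // computed and cached
        System.out.println(displayName(42L)); // served from the cache
    }
}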

2.CopyOnWriteArrayList concurrent version of ArrayList

The underlying structure is also an array. It differs from ArrayList in that whenever an element is added or removed, a new array is created, the element is added to (or excluded from) that new array, and the new array then replaces the original one.

Applicable scenarios: read operations take no lock while write operations (add, delete, modify) are locked, so it suits scenarios with many reads and few writes.

Limitation: because reads take no lock (reading is as efficient as on a plain ArrayList), a read simply works on whatever array copy is current at that moment, so it may observe stale data. If that matters to you, this container is not recommended.

Take a look at the source code to get a feel for it:

public class CopyOnWriteArrayList<E> implements List<E>, RandomAccess, Cloneable, java.io.Serializable {

    final transient ReentrantLock lock = new ReentrantLock();

    private transient volatile Object[] array;

    // add an element: lock so that concurrent modifications stay safe
    public boolean add(E e) {
        final ReentrantLock lock = this.lock;
        lock.lock();
        try {
            Object[] elements = getArray();                          // current array
            int len = elements.length;
            Object[] newElements = Arrays.copyOf(elements, len + 1); // new array, one slot larger than the old one
            newElements[len] = e;                                    // put the element to be added into the new array
            setArray(newElements);                                   // replace the original array with the new one
            return true;
        } finally {
            lock.unlock();                                           // unlock
        }
    }

    // read an element: no lock, so it is possible to read old data
    public E get(int index) {
        return get(getArray(), index);
    }
}

3.CopyOnWriteArraySet concurrent Set

Implemented on top of CopyOnWriteArrayList (it holds a CopyOnWriteArrayList member variable), so the underlying structure is again an array. This means every add has to traverse the whole collection to find out whether the element already exists, and only inserts it (under the lock) when it does not.

Applicable scenarios: the same as for CopyOnWriteArrayList, with one extra requirement: the collection must not be too large (you cannot afford a full traversal on every add).
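As a quick sketch (my own example, not from the original article), the Set semantics on top of the copy-on-write array look like this:

import java.util.concurrent.CopyOnWriteArraySet;

public class ListenerSetDemo {
    public static void main(String[] args) {
        CopyOnWriteArraySet<String> listeners = new CopyOnWriteArraySet<>();
        System.out.println(listeners.add("onClick")); // true: not present yet, array is copied and replaced
        System.out.println(listeners.add("onClick")); // false: a full scan finds it, nothing is copied
        // iteration works on a snapshot, so concurrent adds never cause ConcurrentModificationException
        for (String l : listeners) {
            System.out.println("notify " + l);
        }
    }
}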

4.ConcurrentLinkedQueue concurrent queue (based on a linked list)

A concurrent queue implemented on a linked list, using optimistic locking (CAS) to guarantee thread safety. Because the data structure is a linked list, there is in theory no limit on the queue size, which means adding data always succeeds.
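A minimal sketch (an assumed example) of the non-blocking style: offer always succeeds, and poll returns null instead of blocking when the queue is empty:

import java.util.concurrent.ConcurrentLinkedQueue;

public class LinkedQueueDemo {
    public static void main(String[] args) {
        ConcurrentLinkedQueue<String> tasks = new ConcurrentLinkedQueue<>();
        tasks.offer("a");                 // CAS-based insert, always returns true
        tasks.offer("b");
        System.out.println(tasks.poll()); // "a": removes the head, or null if the queue is empty
        System.out.println(tasks.peek()); // "b": looks at the head without removing it
    }
}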

5.ConcurrentLinkedDeque concurrent deque (based on a doubly linked list)

A concurrent deque based on a doubly linked list; it can operate on the head and the tail independently, so besides first-in, first-out (FIFO) it also supports first-in, last-out (FILO). Of course, when used first-in, last-out it is really acting as a stack.
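A brief sketch (my own example) of both usage styles: offering at the tail and polling at the head gives queue (FIFO) behavior, while pushing and popping at the head gives stack (FILO) behavior:

import java.util.concurrent.ConcurrentLinkedDeque;

public class LinkedDequeDemo {
    public static void main(String[] args) {
        ConcurrentLinkedDeque<Integer> deque = new ConcurrentLinkedDeque<>();
        // FIFO: add at the tail, take from the head
        deque.offerLast(1);
        deque.offerLast(2);
        System.out.println(deque.pollFirst()); // 1
        // FILO (stack): push and pop at the head
        deque.push(10);
        deque.push(20);
        System.out.println(deque.pop());       // 20
    }
}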

6.ConcurrentSkipListMap concurrent Map based on a skip list

SkipList means skip list, a data structure that trades space for time: by keeping redundant data, it builds index levels on top of a linked list layer by layer, achieving an effect similar to binary search.
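A small sketch (an assumed example) of what the skip list buys you compared with ConcurrentHashMap: keys stay sorted, so ordered navigation queries remain available and thread-safe:

import java.util.concurrent.ConcurrentSkipListMap;

public class SkipListMapDemo {
    public static void main(String[] args) {
        ConcurrentSkipListMap<Integer, String> scores = new ConcurrentSkipListMap<>();
        scores.put(90, "Alice");
        scores.put(70, "Bob");
        scores.put(80, "Carol");
        System.out.println(scores.firstKey());           // 70, the smallest key
        System.out.println(scores.ceilingKey(75));       // 80, the smallest key >= 75
        System.out.println(scores.headMap(85).values()); // [Bob, Carol], keys below 85 in sorted order
    }
}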

7.ConcurrentSkipListSet concurrent Set based on a skip list

Similar to the relationship between HashSet and HashMap: ConcurrentSkipListSet holds a ConcurrentSkipListMap inside, so there is no need to go into further detail.

8.ArrayBlockingQueue blocking queue (array-based)

A blocking queue implemented on an array; its capacity must be set at construction time. If the array is full, putting an element will block until a slot becomes free (variants that return immediately or wait with a timeout are also supported). Thread safety is guaranteed by a single ReentrantLock.

Take the take and offer operations as an example:

public class ArrayBlockingQueue<E> extends AbstractQueue<E> implements BlockingQueue<E>, java.io.Serializable {

    /*
     * Reads and writes share this lock; threads communicate through the two
     * Condition objects below, which are closely tied to the lock (created from it)
     * and play the role of wait/notify signals.
     */
    final ReentrantLock lock;

    /* signalled when the queue is no longer empty; threads taking data wait on it */
    private final Condition notEmpty;

    /* signalled when the queue is no longer full; threads writing data wait on it */
    private final Condition notFull;

    // block until there is something to take out
    public E take() throws InterruptedException {
        final ReentrantLock lock = this.lock;
        lock.lockInterruptibly();
        try {
            while (count == 0)
                notEmpty.await();
            return dequeue();
        } finally {
            lock.unlock();
        }
    }

    // insert an element at the tail; when the queue is full, wait for the given time and then give up
    public boolean offer(E e, long timeout, TimeUnit unit) throws InterruptedException {
        checkNotNull(e);
        long nanos = unit.toNanos(timeout);
        final ReentrantLock lock = this.lock;
        lock.lockInterruptibly();                // lock
        try {
            // wait in a loop until the queue has free space
            while (count == items.length) {
                if (nanos <= 0)
                    return false;                // timed out, give up
                nanos = notFull.awaitNanos(nanos);
            }
            enqueue(e);
            return true;
        } finally {
            lock.unlock();
        }
    }
}

12.SynchronousQueue synchronous hand-off queue

SynchronousQueue holds no elements at all: every put must wait for a corresponding take (and vice versa), so data is handed directly from the writing thread to the reading thread. A small demo:

public class SynchronousQueueDemo {
    public static void main(String[] args) {
        SynchronousQueue<Integer> queue = new SynchronousQueue<>();

        new Thread(() -> {
            try {
                // no rest at all: frantically write
                for (int i = 0; ; i++) {
                    System.out.println("put: " + i);
                    queue.put(i);
                }
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }).start();

        new Thread(() -> {
            try {
                // lazily fetch data, resting a little after each take
                while (true) {
                    System.out.println("take out: " + queue.take());
                    Thread.sleep((long) (Math.random() * 2000));
                }
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }).start();
    }
}

/* output:
put: 0
take out: 0
put: 1
take out: 1
put: 2
take out: 2
put: 3
take out: 3
... */

As you can see, the writing thread never sleeps; it tries as hard as it can to stuff elements into the queue, while the reading thread is far less eager, taking one element and then resting for a while. Even so, the output shows that reads and writes always occur in pairs.

One usage scenario in the JDK is Executors.newCachedThreadPool(), which creates a cached thread pool.

public static ExecutorService newCachedThreadPool() {
    return new ThreadPoolExecutor(
            0,                                 // core pool size is 0: idle threads are discarded without mercy
            Integer.MAX_VALUE,                 // the maximum number of threads is effectively unlimited; machine resources run out first
            60L, TimeUnit.SECONDS,             // idle threads are destroyed after 60 seconds
            new SynchronousQueue<Runnable>()); // if offer finds no idle thread waiting to take the task,
                                               // it fails and the pool creates a new thread
}

13.LinkedTransferQueue data exchange queue based on a linked list

It implements the TransferQueue interface. When an element is put in through the transfer method, if some thread is already blocked waiting to take an element, the element is handed directly to that waiting thread. If nobody is waiting to consume, the element is placed at the tail of the queue and the method blocks until someone takes it. It is similar to SynchronousQueue, but more powerful.
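A minimal sketch (my own example, assuming a single producer and consumer) of the transfer hand-off:

import java.util.concurrent.LinkedTransferQueue;

public class TransferDemo {
    public static void main(String[] args) throws InterruptedException {
        LinkedTransferQueue<String> queue = new LinkedTransferQueue<>();

        Thread consumer = new Thread(() -> {
            try {
                Thread.sleep(1000);                        // the producer stays blocked during this second
                System.out.println("got " + queue.take());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        queue.transfer("hello");  // blocks until the consumer has received the element
        System.out.println("transfer returned");
        consumer.join();
    }
}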

14.DelayQueue delay queue

Elements put into the queue can only be taken out by consumers after a specified delay has elapsed, and the elements must implement the Delayed interface.
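A compact sketch (an assumed example) with a task that becomes available one second after it is queued:

import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

public class DelayQueueDemo {
    // a task that may only be taken 'delayMillis' after creation
    static class DelayedTask implements Delayed {
        final String name;
        final long fireTime; // absolute time in millis at which the task becomes available

        DelayedTask(String name, long delayMillis) {
            this.name = name;
            this.fireTime = System.currentTimeMillis() + delayMillis;
        }

        @Override
        public long getDelay(TimeUnit unit) {
            return unit.convert(fireTime - System.currentTimeMillis(), TimeUnit.MILLISECONDS);
        }

        @Override
        public int compareTo(Delayed other) {
            return Long.compare(getDelay(TimeUnit.MILLISECONDS), other.getDelay(TimeUnit.MILLISECONDS));
        }
    }

    public static void main(String[] args) throws InterruptedException {
        DelayQueue<DelayedTask> queue = new DelayQueue<>();
        queue.put(new DelayedTask("expire-session", 1000)); // ready after about one second
        System.out.println("waiting...");
        DelayedTask task = queue.take();                    // blocks until the delay has expired
        System.out.println("fired: " + task.name);
    }
}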

Thank you for reading. That covers the concurrency containers commonly used in Java. Hopefully this article has deepened your understanding of them; which container fits a specific case still needs to be verified in practice.
