This article introduces how Java's HashMap collection is implemented and how to use it, including the questions about it that come up most often in practice. I hope you can read carefully and learn something!
Overview
① HashMap is implemented with an array plus linked lists plus red-black trees. It is mainly used to store data with key-value semantics.
② When the length of a linked list in one bucket exceeds the treeify threshold (the default boundary for converting to a red-black tree is 8) and the length of the array is at least 64, all the data at that index is stored in a red-black tree instead.
③ Supplement: before a linked list is converted into a red-black tree, the array length is checked. Even if the list length exceeds the threshold of 8, if the array length is still less than 64 the linked list is not converted into a red-black tree; the array is expanded (resized) instead.
Each Node stores the hash value used to locate the bucket index, the key K, the value V, and a next reference pointing to the next node in the list.
Node is an inner class of HashMap; it implements the Map.Entry interface and is essentially a key-value pair.
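As a rough illustration of that structure (a sketch only, written as a standalone class rather than the real JDK nested class), a Node-like type could look like this:

import java.util.Map;

// Simplified sketch of HashMap's internal Node; field names follow the JDK 8 source.
class Node<K, V> implements Map.Entry<K, V> {
    final int hash;      // cached hash used to locate the bucket index
    final K key;
    V value;
    Node<K, V> next;     // next node in this bucket's linked list

    Node(int hash, K key, V value, Node<K, V> next) {
        this.hash = hash;
        this.key = key;
        this.value = value;
        this.next = next;
    }

    public K getKey() { return key; }
    public V getValue() { return value; }
    public V setValue(V newValue) { V old = value; value = newValue; return old; }
}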
The reason for this rule is that while the array is still small, the red-black tree structure should be avoided: a red-black tree has to perform left rotations, right rotations, and recoloring to stay balanced, and for a small table that overhead actually reduces efficiency. When the array length is less than 64, lookups are already fast, so the map prefers to expand the array. Only when the list length exceeds the threshold of 8 and the array length is at least 64 is the linked list converted into a red-black tree, which improves performance and shortens search time for long chains.
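The treeify-or-resize decision described above can be sketched as follows (modeled loosely on the JDK 8 treeifyBin logic; resize() and convertListToTree() are hypothetical helpers standing in for the real code):

// TREEIFY_THRESHOLD (8) and MIN_TREEIFY_CAPACITY (64) match the JDK 8 defaults.
static final int TREEIFY_THRESHOLD = 8;
static final int MIN_TREEIFY_CAPACITY = 64;

void treeifyBin(Node<?, ?>[] table, int index) {
    if (table == null || table.length < MIN_TREEIFY_CAPACITY) {
        resize();                          // array still small: expand instead of treeifying
    } else {
        convertListToTree(table, index);   // array large enough: turn this bucket's list into a red-black tree
    }
}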
Important parameters
Capacity and Load Factor
Initial capacity: the capacity is the number of buckets in the hash table, and the initial capacity is the capacity at the time the table is created (16 by default).
Load factor: a measure of how full the hash table is allowed to get before its capacity is automatically increased. The default is 0.75.
threshold: the number of key-value pairs the table can hold before resizing. The formula is array length * load factor.
size: the actual number of key-value pairs currently in the HashMap.
modCount: records the number of times the internal structure of the HashMap has changed (used to detect concurrent modification during iteration).
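For example, with the default initial capacity of 16 and the default load factor of 0.75, the threshold works out to 12, so adding a 13th entry triggers a resize. A minimal check of that arithmetic:

public class ThresholdDemo {
    public static void main(String[] args) {
        int capacity = 16;                    // default initial capacity
        float loadFactor = 0.75f;             // default load factor
        int threshold = (int) (capacity * loadFactor);
        System.out.println(threshold);        // 12
    }
}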
Implementation of the put function
General idea:
Hash the key's hashCode(), then calculate the bucket index;
If there is no collision, put it directly into the bucket;
If there is a collision, the colliding entries hang off the bucket as a linked list;
If collisions make the linked list too long (its length reaches TREEIFY_THRESHOLD, 8 by default), convert the linked list into a red-black tree (subject to the array-length check described above);
Replace the old value if the key already exists (this keeps keys unique);
If the map is too full (size exceeds load factor * current capacity), resize it. A simplified sketch of the whole flow follows the list.
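A heavily simplified sketch of that flow in Java (illustrative only: tree bins, null keys, and resize ordering are glossed over, and helpers such as treeifyBin and resize are assumed):

// Simplified put: locate the bucket, then insert, replace, or append.
V put(K key, V value) {
    int hash = hash(key);
    int index = (table.length - 1) & hash;                  // bucket index
    Node<K, V> head = table[index];
    if (head == null) {
        table[index] = new Node<>(hash, key, value, null);   // no collision: place directly
    } else {
        int binCount = 1;
        for (Node<K, V> node = head; ; node = node.next, binCount++) {
            if (node.hash == hash && java.util.Objects.equals(node.key, key)) {
                V old = node.value;
                node.value = value;                          // key already present: replace value
                return old;
            }
            if (node.next == null) {
                node.next = new Node<>(hash, key, value, null); // append to the list
                if (binCount + 1 >= TREEIFY_THRESHOLD)
                    treeifyBin(table, index);                // list too long: try converting to a tree
                break;
            }
        }
    }
    if (++size > threshold)
        resize();                                            // too full: double the table
    return null;
}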
Implementation of the get function
General idea:
If the first node in the bucket matches, it is a direct hit;
If there is a conflict, search for the corresponding entry with key.equals(k): if the bucket holds a tree, search the tree, O(log n); if it holds a linked list, walk the list, O(n). A simplified sketch follows.
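A correspondingly simplified sketch of get (illustrative; the tree-bin lookup is omitted):

// Simplified get: locate the bucket, then walk the linked list.
V get(Object key) {
    int hash = hash(key);
    Node<K, V> node = table[(table.length - 1) & hash];      // first node in the bucket
    while (node != null) {
        if (node.hash == hash && java.util.Objects.equals(node.key, key))
            return node.value;                               // hit
        node = node.next;                                    // collision: keep walking the list
    }
    return null;                                             // not found
}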
Hash function implementation:

// The high 16 bits stay unchanged; the low 16 bits are XORed with the high 16 bits
static final int hash(Object key) {
    int h;
    return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
}
There are basically two steps to retrieving an element from a HashMap:
1. First hash the key's hashCode() to determine the bucket index;
2. If the first node in the bucket does not hold the key we need, walk the linked list (or red-black tree) using key.equals().
Implementation of resize
During put, a resize occurs if the current occupancy exceeds the proportion allowed by the load factor (size > threshold).
During a resize, put simply, the table is expanded to twice its size and each node's new index is calculated before it is placed into a new bucket. Because the capacity doubles, an element either stays at its original index or moves to its original index plus the old capacity (a power of two). This avoids recomputing hash values and scatters previously colliding nodes across the new buckets.
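The "original index or original index plus old capacity" rule comes from the single extra hash bit that the doubled mask exposes. A small, self-contained demonstration (the hash value 0b11_0101 is arbitrary):

public class ResizeIndexDemo {
    public static void main(String[] args) {
        int oldCap = 16;                          // capacity before the resize
        int hash = 0b11_0101;                     // sample (already spread) hash value: 53
        int oldIndex = (oldCap - 1) & hash;       // index in the old table: 5
        // The bit selected by oldCap decides whether the node stays or moves.
        int newIndex = (hash & oldCap) == 0
                ? oldIndex                        // bit is 0: stays at the same index
                : oldIndex + oldCap;              // bit is 1: moves to old index + old capacity
        System.out.println(oldIndex + " -> " + newIndex);   // prints "5 -> 21"
    }
}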
When is HashMap used? What are its characteristics?
HashMap is an implementation based on the Map interface. It accepts null keys and null values, and it is not synchronized (not thread-safe). HashMap stores Entry(hash, key, value, next) objects.
Do you know how HashMap works?
Objects are stored and retrieved through the put and get methods, which are based on hashing. When storing an object, we pass the key and value to put; it calls hashCode() on the key and computes the hash to get the bucket location, where the entry is stored. HashMap automatically adjusts its capacity according to the current occupancy (once size exceeds load factor * capacity it resizes to twice the original size). When retrieving an object, we pass the key to get, which calls hashCode() to compute the hash, finds the bucket location, and then uses equals() to identify the right key-value pair. If collisions occur, HashMap chains the colliding elements in a linked list. In Java 8, if the number of colliding elements in a bucket exceeds a limit (8 by default), a red-black tree replaces the linked list, which improves lookup speed.
Do you know how get and put work? What roles do equals() and hashCode() play?
The key's hashCode() is hashed (spread), and the bucket index is computed as (n - 1) & hash. If there is a collision, key.equals() is used to find the corresponding node in the linked list or tree.
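For instance, with the default table length n = 16, (n - 1) & hash keeps only the low 4 bits of the spread hash, giving an index in [0, 15]. A small demonstration (the key "example" is arbitrary):

public class BucketIndexDemo {
    public static void main(String[] args) {
        Object key = "example";
        int h = key.hashCode();
        int hash = h ^ (h >>> 16);        // the same spreading step HashMap applies
        int n = 16;                       // table length (always a power of two)
        int index = (n - 1) & hash;       // bucket index: the low 4 bits of the spread hash
        System.out.println(index);        // some value between 0 and 15
    }
}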
How is hash implemented, and why is it implemented this way?
In the Java 8 implementation, the high 16 bits of hashCode() are XORed with the low 16 bits: (h = key.hashCode()) ^ (h >>> 16). The design balances speed, efficiency, and hash quality: even when the number of buckets n is small, the high bits of the hash still take part in the index calculation, without adding much overhead.
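To see the effect, one can print a hash before and after the spreading step (purely illustrative):

public class SpreadDemo {
    public static void main(String[] args) {
        int h = "hello world".hashCode();
        int spread = h ^ (h >>> 16);      // fold the high 16 bits into the low 16 bits
        System.out.println(Integer.toBinaryString(h));
        System.out.println(Integer.toBinaryString(spread));
        // With a small table only the low bits choose the bucket; the XOR lets the
        // high bits of hashCode() influence those low bits at almost no extra cost.
    }
}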
What happens if the size of the HashMap exceeds the threshold defined by the load factor?
If the threshold is exceeded (load factor 0.75 by default), the table is resized to twice its previous length and the existing entries are redistributed into the new buckets.
"Java collection function HashMap how to use" the content is introduced here, thank you for reading. If you want to know more about industry-related knowledge, you can pay attention to the website. Xiaobian will output more high-quality practical articles for everyone!
Welcome to subscribe "Shulou Technology Information " to get latest news, interesting things and hot topics in the IT industry, and controls the hottest and latest Internet news, technology news and IT industry trends.
Views: 0
*The comments in the above article only represent the author's personal views and do not represent the views and positions of this website. If you have more insights, please feel free to contribute and share.
Continue with the installation of the previous hadoop.First, install zookooper1. Decompress zookoope
"Every 5-10 years, there's a rare product, a really special, very unusual product that's the most un
© 2024 shulou.com SLNews company. All rights reserved.