# HashMap source code analysis

```java
public HashMap(int initialCapacity, float loadFactor) {
    // ... argument checks omitted in the original excerpt ...
    this.loadFactor = loadFactor;
    // tableSizeFor() returns the smallest power of 2 >= initialCapacity
    this.threshold = tableSizeFor(initialCapacity);
}
```




For the constructor above, note the line `this.threshold = tableSizeFor(initialCapacity);`. Here `threshold` is set to a power of 2 rather than to `capacity * loadFactor`. This is not a mistake: at this point the `table` array has not been created yet; its initialization is deferred to `putVal()`, where `threshold` is recalculated.
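The rounding to a power of 2 happens in `tableSizeFor()`. For reference, the JDK 8 version looks roughly like this (a sketch for illustration; `MAXIMUM_CAPACITY` is `1 << 30`):

```java
// Returns the smallest power of 2 greater than or equal to cap
// (roughly the JDK 8 implementation)
static final int tableSizeFor(int cap) {
    int n = cap - 1;
    n |= n >>> 1;
    n |= n >>> 2;
    n |= n >>> 4;
    n |= n >>> 8;
    n |= n >>> 16;
    return (n < 0) ? 1 : (n >= MAXIMUM_CAPACITY) ? MAXIMUM_CAPACITY : n + 1;
}
// e.g. tableSizeFor(13) == 16, tableSizeFor(16) == 16, tableSizeFor(17) == 32
```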



```java
/**
 * Constructs an empty HashMap with the specified initial capacity and the
 * default load factor (0.75).
 * Throws IllegalArgumentException if initialCapacity is negative.
 */
public HashMap(int initialCapacity) {
    this(initialCapacity, DEFAULT_LOAD_FACTOR);
}

/**
 * Constructs an empty HashMap with the default initial capacity and the
 * default load factor (0.75).
 */
public HashMap() {
    this.loadFactor = DEFAULT_LOAD_FACTOR; // all other fields defaulted
}

/**
 * Constructs a new HashMap with the same mappings as the specified Map.
 * The HashMap is created with the default load factor (0.75) and an initial
 * capacity sufficient to hold the mappings in the specified Map.
 *
 * @param m the map whose mappings are to be placed in this map
 * @throws NullPointerException if the specified map is null
 */
public HashMap(Map<? extends K, ? extends V> m) {
    this.loadFactor = DEFAULT_LOAD_FACTOR;
    putMapEntries(m, false);
}
```
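A quick, hypothetical usage example of the three constructors (demo code, not part of the JDK; `Map.of` requires Java 9+):

```java
import java.util.HashMap;
import java.util.Map;

public class ConstructorDemo {
    public static void main(String[] args) {
        // Capacity hint 13 is rounded up to 16 internally; the backing table
        // is only allocated on the first put().
        Map<String, Integer> withCapacity = new HashMap<>(13);

        // Default load factor 0.75; the default capacity is used on the first put().
        Map<String, Integer> withDefaults = new HashMap<>();

        // Copy constructor: sized to hold the entries of the source map.
        Map<String, Integer> copy = new HashMap<>(Map.of("a", 1, "b", 2));

        withCapacity.put("x", 1);
        withDefaults.put("y", 2);
        System.out.println(copy.size()); // 2
    }
}
```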




### Query (get)



```java
/**
 * Returns the value to which the specified key is mapped, or null if the key is absent.
 * A return value of null does not necessarily mean there is no mapping for the key:
 * the key may be mapped to null. Use containsKey to distinguish the two cases.
 */
public V get(Object key) {
    Node<K,V> e;
    return (e = getNode(hash(key), key)) == null ? null : e.value;
}

/**
 * 1. Locate the hash bucket for the key from its hash value.
 * 2. If the bucket holds a single element, check whether it is the node for the key.
 * 3. If the bucket holds more than one node, there are two cases:
 *    - if the first node is a TreeNode, the bucket is a red-black tree; search the tree;
 *    - otherwise the bucket is a linked list (separate chaining); walk the list.
 * 4. If no matching key is found, return null.
 */
final Node<K,V> getNode(int hash, Object key) {
    Node<K,V>[] tab; Node<K,V> first, e; int n; K k;
    if ((tab = table) != null && (n = tab.length) > 0 &&
        (first = tab[(n - 1) & hash]) != null) {
        if (first.hash == hash && // always check first node
            ((k = first.key) == key || (key != null && key.equals(k))))
            return first;
        if ((e = first.next) != null) {
            if (first instanceof TreeNode)
                return ((TreeNode<K,V>)first).getTreeNode(hash, key);
            do {
                if (e.hash == hash &&
                    ((k = e.key) == key || (key != null && key.equals(k))))
                    return e;
            } while ((e = e.next) != null);
        }
    }
    return null;
}
```
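To illustrate the point made in the comment above, here is a small hypothetical demo showing why `containsKey()` is needed to tell a missing key apart from a key mapped to `null`:

```java
import java.util.HashMap;
import java.util.Map;

public class GetDemo {
    public static void main(String[] args) {
        Map<String, String> map = new HashMap<>();
        map.put("present", null);                       // key exists, value is null

        System.out.println(map.get("present"));         // null
        System.out.println(map.get("missing"));         // also null

        // Only containsKey() distinguishes the two cases
        System.out.println(map.containsKey("present")); // true
        System.out.println(map.containsKey("missing")); // false
    }
}
```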




### Storage (put)



```java
/**
 * Associates the specified value with the specified key in this map.
 * If the map previously contained a mapping for the key, the old value is replaced.
 */
public V put(K key, V value) {
    return putVal(hash(key), key, value, false, true);
}

/**
 * @param onlyIfAbsent if true, don't change an existing value
 *
 * 1. If the hash table is empty, resize (this also performs the deferred initialization).
 * 2. Compute the bucket index from the hash; if table[i] is empty, create a new node directly.
 * 3. If the first node of table[i] matches the key, update the old value.
 * 4. If the first node of table[i] is a TreeNode, the bucket is a red-black tree;
 *    insert into the tree.
 * 5. Otherwise traverse the linked list at table[i]; if the key is found, update the
 *    old value, otherwise append a new node. If the list length then exceeds the
 *    treeify threshold (8), convert the list into a red-black tree.
 */
final V putVal(int hash, K key, V value, boolean onlyIfAbsent,
               boolean evict) {
    Node<K,V>[] tab; Node<K,V> p; int n, i;
    // The hash table is empty: resize (and initialize) it
    if ((tab = table) == null || (n = tab.length) == 0)
        n = (tab = resize()).length;
    // tab[i] is empty: create a new node directly
    if ((p = tab[i = (n - 1) & hash]) == null)
        tab[i] = newNode(hash, key, value, null);
    else {
        Node<K,V> e; K k;
        // The first node of tab[i] matches the key: the old value is updated below
        if (p.hash == hash &&
            ((k = p.key) == key || (key != null && key.equals(k))))
            e = p;
        // The first node is a TreeNode: insert into the red-black tree
        else if (p instanceof TreeNode)
            e = ((TreeNode<K,V>)p).putTreeVal(this, tab, hash, key, value);
        else {
            // Traverse tab[i]: if the key exists, update the old value below;
            // otherwise append a node, and treeify if the list grows beyond the threshold
            for (int binCount = 0; ; ++binCount) {
                if ((e = p.next) == null) {
                    p.next = newNode(hash, key, value, null);
                    // The list length has reached the treeify threshold
                    if (binCount >= TREEIFY_THRESHOLD - 1) // -1 for 1st
                        treeifyBin(tab, hash);
                    break;
                }
                // Key already exists
                if (e.hash == hash &&
                    ((k = e.key) == key || (key != null && key.equals(k))))
                    break;
                p = e;
            }
        }
        // Key already exists: update the old value
        if (e != null) { // existing mapping for key
            V oldValue = e.value;
            if (!onlyIfAbsent || oldValue == null)
                e.value = value;
            afterNodeAccess(e);
            return oldValue;
        }
    }
    // Inserting an element is a structural modification
    ++modCount;
    // If the number of key-value pairs exceeds the threshold, resize
    if (++size > threshold)
        resize();
    afterNodeInsertion(evict);
    return null;
}
```
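Note that `put()` and `get()` both pass the key through HashMap's static `hash()` method rather than using `hashCode()` directly. In JDK 8 it looks roughly like this: the high 16 bits are XORed into the low 16 bits, so the bucket index `(n - 1) & hash` is influenced by the whole hash code even when the table is small.

```java
// Hash spreading (roughly the JDK 8 implementation):
// XOR the high 16 bits into the low 16 bits so that small tables,
// which only use the low bits for indexing, still see the high bits.
static final int hash(Object key) {
    int h;
    return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
}
```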




### Capacity expansion (resize)



```java
/**
 * Initializes or doubles the size of the hash table. If the table is null, it is
 * allocated according to the initial capacity held in the threshold field.
 * Otherwise, because the capacity always grows by a factor of 2, each element in a
 * bucket either stays at its original index or moves by a power-of-2 offset (oldCap)
 * in the new table.
 */
final Node<K,V>[] resize() {
    Node<K,V>[] oldTab = table;
    int oldCap = (oldTab == null) ? 0 : oldTab.length;
    int oldThr = threshold;
    int newCap, newThr = 0;
    if (oldCap > 0) {
        // The maximum capacity has been reached: do not expand any further
        if (oldCap >= MAXIMUM_CAPACITY) {
            threshold = Integer.MAX_VALUE;
            return oldTab;
        }
        // The capacity has not exceeded the maximum: double it
        else if ((newCap = oldCap << 1) < MAXIMUM_CAPACITY &&
                 oldCap >= DEFAULT_INITIAL_CAPACITY)
            // The threshold is doubled as well
            newThr = oldThr << 1; // double threshold
    }
    else if (oldThr > 0) // initial capacity was placed in threshold
        // A capacity-taking constructor was used, so the initial capacity was stored in
        // threshold. There, threshold is the return value of tableSizeFor(), i.e. a
        // power of 2, not capacity * load factor
        newCap = oldThr;
    else {               // zero initial threshold signifies using defaults
        // First initialization: capacity and threshold use the default values
        newCap = DEFAULT_INITIAL_CAPACITY;
        newThr = (int)(DEFAULT_LOAD_FACTOR * DEFAULT_INITIAL_CAPACITY);
    }
    if (newThr == 0) {
        // Compute the new threshold
        float ft = (float)newCap * loadFactor;
        newThr = (newCap < MAXIMUM_CAPACITY && ft < (float)MAXIMUM_CAPACITY ?
                  (int)ft : Integer.MAX_VALUE);
    }
    threshold = newThr;
    // The following is the key part of the resize process
    @SuppressWarnings({"rawtypes","unchecked"})
    Node<K,V>[] newTab = (Node<K,V>[])new Node[newCap];
    table = newTab;
    if (oldTab != null) {
        for (int j = 0; j < oldCap; ++j) {
            Node<K,V> e;
            if ((e = oldTab[j]) != null) {
                // Clear the old bucket so it can be garbage collected
                oldTab[j] = null;
                // A single node with no chain: compute its new position directly
                if (e.next == null)
                    newTab[e.hash & (newCap - 1)] = e;
                // The bucket is a red-black tree
                else if (e instanceof TreeNode)
                    ((TreeNode<K,V>)e).split(this, newTab, j, oldCap);
                else { // preserve order
                    // The bucket is a linked list
                    Node<K,V> loHead = null, loTail = null;
                    Node<K,V> hiHead = null, hiTail = null;
                    Node<K,V> next;
                    do {
                        next = e.next;
                        // Stays at the original index
                        if ((e.hash & oldCap) == 0) {
                            if (loTail == null)
                                loHead = e;
                            else
                                loTail.next = e;
                            loTail = e;
                        }
                        // Moves to original index + oldCap
                        else {
                            if (hiTail == null)
                                hiHead = e;
                            else
                                hiTail.next = e;
                            hiTail = e;
                        }
                    } while ((e = next) != null);
                    if (loTail != null) {
                        loTail.next = null;
                        newTab[j] = loHead;
                    }
                    if (hiTail != null) {
                        hiTail.next = null;
                        newTab[j + oldCap] = hiHead;
                    }
                }
            }
        }
    }
    return newTab;
}
```
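The `(e.hash & oldCap) == 0` test above decides whether a node stays at its old index `j` or moves to `j + oldCap`. A small hypothetical check (demo code, not JDK code) makes the rule concrete, assuming oldCap = 16 and newCap = 32:

```java
public class ResizeIndexDemo {
    public static void main(String[] args) {
        int oldCap = 16, newCap = 32;
        int[] hashes = {5, 21, 37, 53};          // all map to bucket 5 when cap = 16
        for (int h : hashes) {
            int oldIndex = h & (oldCap - 1);     // index before the resize
            int newIndex = h & (newCap - 1);     // index after the resize
            boolean stays = (h & oldCap) == 0;   // the test used by resize()
            System.out.printf("hash=%d old=%d new=%d stays=%b%n",
                    h, oldIndex, newIndex, stays);
        }
        // hashes 5 and 37 stay at index 5; 21 and 53 move to 5 + 16 = 21
    }
}
```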






# Summary

This article walked through the core of the HashMap source code: the constructors and the deferred initialization of the table, lookup via `get()`/`getNode()`, insertion via `put()`/`putVal()` including the treeification of long buckets, and capacity expansion via `resize()`.
