It's annual raise season again. The senior folks are trading opportunities with each other, and us rookies are hopping from one job pit to another. Looking back on my four-plus years in this career, through all the talk of industry trends, 996, involution, programmer growth, life goals and so on, this (suddenly balding) programmer just wants to work hard, earn more money, and eventually head home to switch tracks. So I collected some interview outlines online, paired them with a few blogs, and put together a review plan, writing it up as a blog post to make later review easier.
The whole plan is divided into ten parts (collections, multithreading, networking, Spring and MyBatis, MySQL, JVM, Kafka, Redis, Zookeeper, and distributed systems). If there are any omissions or mistakes, please point them out.
Let's start with a diagram of the collection interfaces and their main implementations:
HashMap
Overview:
Key-value pairs, unordered, not thread-safe
The underlying implementation is array + linked list + red-black tree
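As a quick, minimal sketch of the points above (my own example, not JDK code): HashMap stores key-value pairs, its iteration order does not follow insertion order, and it must not be shared across threads without external synchronization.

import java.util.HashMap;
import java.util.Map;

public class HashMapDemo {
    public static void main(String[] args) {
        Map<String, Integer> map = new HashMap<>();
        map.put("c", 3);
        map.put("a", 1);
        map.put("b", 2);
        map.put("a", 100);                  // same key: the old value is replaced

        System.out.println(map.get("a"));   // 100
        // Iteration order depends on the keys' hashes, not on insertion order.
        for (Map.Entry<String, Integer> e : map.entrySet()) {
            System.out.println(e.getKey() + " = " + e.getValue());
        }
        // Not thread-safe: concurrent writes from multiple threads can lose
        // updates or corrupt the table; use ConcurrentHashMap for that (see below).
    }
}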
put process:
public V put(K key, V value) {
    return putVal(hash(key), key, value, false, true);
}

static final int hash(Object key) {
    int h;
    return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
}

final V putVal(int hash, K key, V value, boolean onlyIfAbsent, boolean evict) {
    Node<K,V>[] tab; Node<K,V> p; int n, i;
    if ((tab = table) == null || (n = tab.length) == 0)
        n = (tab = resize()).length;
    if ((p = tab[i = (n - 1) & hash]) == null)
        tab[i] = newNode(hash, key, value, null);
    else {
        Node<K,V> e; K k;
        if (p.hash == hash &&
            ((k = p.key) == key || (key != null && key.equals(k))))
            e = p;
        else if (p instanceof TreeNode)
            e = ((TreeNode<K,V>)p).putTreeVal(this, tab, hash, key, value);
        else {
            for (int binCount = 0; ; ++binCount) {
                if ((e = p.next) == null) {
                    p.next = newNode(hash, key, value, null);
                    if (binCount >= TREEIFY_THRESHOLD - 1) // -1 for 1st
                        treeifyBin(tab, hash);
                    break;
                }
                if (e.hash == hash &&
                    ((k = e.key) == key || (key != null && key.equals(k))))
                    break;
                p = e;
            }
        }
        if (e != null) { // existing mapping for key
            V oldValue = e.value;
            if (!onlyIfAbsent || oldValue == null)
                e.value = value;
            afterNodeAccess(e);
            return oldValue;
        }
    }
    ++modCount;
    if (++size > threshold)
        resize();
    afterNodeInsertion(evict);
    return null;
}
put flow chart:
resize expansion process:
final Node<K,V>[] resize() {
    Node<K,V>[] oldTab = table;
    int oldCap = (oldTab == null) ? 0 : oldTab.length;
    int oldThr = threshold;
    int newCap, newThr = 0;
    if (oldCap > 0) {
        if (oldCap >= MAXIMUM_CAPACITY) {
            threshold = Integer.MAX_VALUE;
            return oldTab;
        }
        else if ((newCap = oldCap << 1) < MAXIMUM_CAPACITY &&
                 oldCap >= DEFAULT_INITIAL_CAPACITY)
            newThr = oldThr << 1; // double threshold
    }
    else if (oldThr > 0) // initial capacity was placed in threshold
        newCap = oldThr;
    else {               // zero initial threshold signifies using defaults
        newCap = DEFAULT_INITIAL_CAPACITY;
        newThr = (int)(DEFAULT_LOAD_FACTOR * DEFAULT_INITIAL_CAPACITY);
    }
    if (newThr == 0) {
        float ft = (float)newCap * loadFactor;
        newThr = (newCap < MAXIMUM_CAPACITY && ft < (float)MAXIMUM_CAPACITY ?
                  (int)ft : Integer.MAX_VALUE);
    }
    threshold = newThr;
    @SuppressWarnings({"rawtypes","unchecked"})
    Node<K,V>[] newTab = (Node<K,V>[])new Node[newCap];
    table = newTab;
    if (oldTab != null) {
        for (int j = 0; j < oldCap; ++j) {
            Node<K,V> e;
            if ((e = oldTab[j]) != null) {
                oldTab[j] = null;
                if (e.next == null)
                    newTab[e.hash & (newCap - 1)] = e;
                else if (e instanceof TreeNode)
                    ((TreeNode<K,V>)e).split(this, newTab, j, oldCap);
                else { // preserve order
                    Node<K,V> loHead = null, loTail = null;
                    Node<K,V> hiHead = null, hiTail = null;
                    Node<K,V> next;
                    do {
                        next = e.next;
                        if ((e.hash & oldCap) == 0) {
                            if (loTail == null)
                                loHead = e;
                            else
                                loTail.next = e;
                            loTail = e;
                        }
                        else {
                            if (hiTail == null)
                                hiHead = e;
                            else
                                hiTail.next = e;
                            hiTail = e;
                        }
                    } while ((e = next) != null);
                    if (loTail != null) {
                        loTail.next = null;
                        newTab[j] = loHead;
                    }
                    if (hiTail != null) {
                        hiTail.next = null;
                        newTab[j + oldCap] = hiHead;
                    }
                }
            }
        }
    }
    return newTab;
}
Capacity expansion flow chart:
Briefly describe two points:
- In the hash method, (h = key.hashCode()) ^ (h >>> 16) makes the high 16 bits take part in the calculation. First the key's hashCode() produces a 32-bit value, which is assigned to h; then that 32-bit hash is unsigned-right-shifted by 16 bits; XORing the two results gives a hash in which both the high and low bits contribute.
- Index position: (n - 1) & hash. Before explaining this, let's first look at how HashMap computes the table capacity when the container is initialized.
/**
 * Returns a power of two size for the given target capacity.
 */
static final int tableSizeFor(int cap) {
    int n = cap - 1;
    n |= n >>> 1;
    n |= n >>> 2;
    n |= n >>> 4;
    n |= n >>> 8;
    n |= n >>> 16;
    return (n < 0) ? 1 : (n >= MAXIMUM_CAPACITY) ? MAXIMUM_CAPACITY : n + 1;
}
After cap is passed in, 1 is subtracted first (so that a cap that is already a power of two is not doubled), then the value is ORed with itself shifted right by 1 bit, then by 2 bits, and so on: five operations in total, enough to cover all 32 bits. Finally 1 is added, giving the final table size, which is always a power of two.
Back to the index calculation: because n is a power of two, n - 1 has all of its low bits set to 1 (the only difference between sizes is how many 1s there are). For example 64 - 1 = 63 = 0b111111. ANDing the hash with 0b111111 spreads the result evenly over the indexes 0 to n - 1, which is equivalent to taking the remainder.
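To make these two points concrete, here is a small sketch of my own (not JDK code; the class name and simplified constants are made up for illustration) that reproduces the tableSizeFor rounding, checks that masking with n - 1 matches the remainder for a power-of-two n, and shows the (hash & oldCap) trick that resize uses to decide whether an entry stays at index j or moves to j + oldCap:

public class IndexDemo {
    // Same logic as HashMap.tableSizeFor: round up to the next power of two.
    static int tableSizeFor(int cap) {
        int n = cap - 1;
        n |= n >>> 1;
        n |= n >>> 2;
        n |= n >>> 4;
        n |= n >>> 8;
        n |= n >>> 16;
        return (n < 0) ? 1 : n + 1;   // MAXIMUM_CAPACITY check omitted for brevity
    }

    public static void main(String[] args) {
        System.out.println(tableSizeFor(10));   // 16
        System.out.println(tableSizeFor(16));   // 16 (already a power of two)
        System.out.println(tableSizeFor(17));   // 32

        int n = 64;
        int hash = 1234567;
        // For a power-of-two n, masking with n - 1 equals taking the remainder.
        System.out.println(((n - 1) & hash) == (hash % n));   // true

        // During resize, (hash & oldCap) decides the new bucket:
        int oldCap = 64;
        int j = (oldCap - 1) & hash;                          // old index
        int newIndex = ((hash & oldCap) == 0) ? j : j + oldCap;
        System.out.println(j + " -> " + newIndex);
    }
}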
If this is a little fuzzy, take a look at the hash simulation below.
hash simulation:
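The original simulation was a diagram; as a stand-in, here is a small sketch of my own (class and method names are made up for illustration) that walks the full path from hashCode() to bucket index, printing the intermediate 32-bit values so you can see the high 16 bits being folded into the low 16:

public class HashSimulation {
    // Same spreading step as HashMap.hash(Object key).
    static int hash(Object key) {
        int h;
        return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
    }

    public static void main(String[] args) {
        String key = "interview";
        int h = key.hashCode();
        int spread = hash(key);
        int n = 16;                         // default table size
        int index = (n - 1) & spread;

        System.out.println("hashCode : " + toBits(h));
        System.out.println("h >>> 16 : " + toBits(h >>> 16));
        System.out.println("spread   : " + toBits(spread));
        System.out.println("index    : " + index);
    }

    // Left-pads the binary string to a full 32 bits for easier comparison.
    static String toBits(int v) {
        StringBuilder sb = new StringBuilder(Integer.toBinaryString(v));
        while (sb.length() < 32) sb.insert(0, '0');
        return sb.toString();
    }
}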
ConcurrentHashMap
Overview:
In Java 1.5, Doug Lea (also one of the authors of HashMap) contributed the java.util.concurrent package, which contains thread pools and a number of concurrency utility classes. Among them, java.util.concurrent.ConcurrentHashMap provides thread safety and is a HashMap that can be used safely under concurrent, multithreaded access.
Key-value pairs, unordered, thread-safe
It is the thread-safe map most commonly used in multithreaded code.
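A minimal sketch of my own showing the thread safety in practice: several threads bump a counter in the same map using the atomic merge method, which avoids the check-then-put race you would hit with a plain HashMap:

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ConcurrentHashMapDemo {
    public static void main(String[] args) throws InterruptedException {
        ConcurrentHashMap<String, Integer> counts = new ConcurrentHashMap<>();
        ExecutorService pool = Executors.newFixedThreadPool(8);

        for (int i = 0; i < 8; i++) {
            pool.submit(() -> {
                for (int j = 0; j < 10_000; j++) {
                    // merge is atomic per key, so no updates are lost.
                    counts.merge("hits", 1, Integer::sum);
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);

        System.out.println(counts.get("hits")); // always 80000
    }
}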
put process:
public V put(K key, V value) { return putVal(key, value, false); } final V putVal(K key, V value, boolean onlyIfAbsent) { if (key == null || value == null) throw new NullPointerException(); int hash = spread(key.hashCode()); int binCount = 0; for (Node<K,V>[] tab = table;;) { Node<K,V> f; int n, i, fh; if (tab == null || (n = tab.length) == 0) tab = initTable(); else if ((f = tabAt(tab, i = (n - 1) & hash)) == null) { if (casTabAt(tab, i, null, new Node<K,V>(hash, key, value, null))) break; // no lock when adding to empty bin } else if ((fh = f.hash) == MOVED) tab = helpTransfer(tab, f); else { V oldVal = null; synchronized (f) { if (tabAt(tab, i) == f) { if (fh >= 0) { binCount = 1; for (Node<K,V> e = f;; ++binCount) { K ek; if (e.hash == hash && ((ek = e.key) == key || (ek != null && key.equals(ek)))) { oldVal = e.val; if (!onlyIfAbsent) e.val = value; break; } Node<K,V> pred = e; if ((e = e.next) == null) { pred.next = new Node<K,V>(hash, key, value, null); break; } } } else if (f instanceof TreeBin) { Node<K,V> p; binCount = 2; if ((p = ((TreeBin<K,V>)f).putTreeVal(hash, key, value)) != null) { oldVal = p.val; if (!onlyIfAbsent) p.val = value; } } } } if (binCount != 0) { if (binCount >= TREEIFY_THRESHOLD) treeifyBin(tab, i); if (oldVal != null) return oldVal; break; } } } addCount(1L, binCount); return null; }
put flow chart:
Capacity expansion process:
private final void transfer(Node<K,V>[] tab, Node<K,V>[] nextTab) {
    int n = tab.length, stride;
    if ((stride = (NCPU > 1) ? (n >>> 3) / NCPU : n) < MIN_TRANSFER_STRIDE)
        stride = MIN_TRANSFER_STRIDE; // subdivide range
    if (nextTab == null) {            // initiating
        try {
            @SuppressWarnings("unchecked")
            Node<K,V>[] nt = (Node<K,V>[])new Node<?,?>[n << 1];
            nextTab = nt;
        } catch (Throwable ex) {      // try to cope with OOME
            sizeCtl = Integer.MAX_VALUE;
            return;
        }
        nextTable = nextTab;
        transferIndex = n;
    }
    int nextn = nextTab.length;
    ForwardingNode<K,V> fwd = new ForwardingNode<K,V>(nextTab);
    boolean advance = true;
    boolean finishing = false; // to ensure sweep before committing nextTab
    for (int i = 0, bound = 0;;) {
        Node<K,V> f; int fh;
        while (advance) {
            int nextIndex, nextBound;
            if (--i >= bound || finishing)
                advance = false;
            else if ((nextIndex = transferIndex) <= 0) {
                i = -1;
                advance = false;
            }
            else if (U.compareAndSwapInt
                     (this, TRANSFERINDEX, nextIndex,
                      nextBound = (nextIndex > stride ?
                                   nextIndex - stride : 0))) {
                bound = nextBound;
                i = nextIndex - 1;
                advance = false;
            }
        }
        if (i < 0 || i >= n || i + n >= nextn) {
            int sc;
            if (finishing) {
                nextTable = null;
                table = nextTab;
                sizeCtl = (n << 1) - (n >>> 1);
                return;
            }
            if (U.compareAndSwapInt(this, SIZECTL, sc = sizeCtl, sc - 1)) {
                if ((sc - 2) != resizeStamp(n) << RESIZE_STAMP_SHIFT)
                    return;
                finishing = advance = true;
                i = n; // recheck before commit
            }
        }
        else if ((f = tabAt(tab, i)) == null)
            advance = casTabAt(tab, i, null, fwd);
        else if ((fh = f.hash) == MOVED)
            advance = true; // already processed
        else {
            synchronized (f) {
                if (tabAt(tab, i) == f) {
                    Node<K,V> ln, hn;
                    if (fh >= 0) {
                        int runBit = fh & n;
                        Node<K,V> lastRun = f;
                        for (Node<K,V> p = f.next; p != null; p = p.next) {
                            int b = p.hash & n;
                            if (b != runBit) {
                                runBit = b;
                                lastRun = p;
                            }
                        }
                        if (runBit == 0) {
                            ln = lastRun;
                            hn = null;
                        }
                        else {
                            hn = lastRun;
                            ln = null;
                        }
                        for (Node<K,V> p = f; p != lastRun; p = p.next) {
                            int ph = p.hash; K pk = p.key; V pv = p.val;
                            if ((ph & n) == 0)
                                ln = new Node<K,V>(ph, pk, pv, ln);
                            else
                                hn = new Node<K,V>(ph, pk, pv, hn);
                        }
                        setTabAt(nextTab, i, ln);
                        setTabAt(nextTab, i + n, hn);
                        setTabAt(tab, i, fwd);
                        advance = true;
                    }
                    else if (f instanceof TreeBin) {
                        TreeBin<K,V> t = (TreeBin<K,V>)f;
                        TreeNode<K,V> lo = null, loTail = null;
                        TreeNode<K,V> hi = null, hiTail = null;
                        int lc = 0, hc = 0;
                        for (Node<K,V> e = t.first; e != null; e = e.next) {
                            int h = e.hash;
                            TreeNode<K,V> p = new TreeNode<K,V>
                                (h, e.key, e.val, null, null);
                            if ((h & n) == 0) {
                                if ((p.prev = loTail) == null)
                                    lo = p;
                                else
                                    loTail.next = p;
                                loTail = p;
                                ++lc;
                            }
                            else {
                                if ((p.prev = hiTail) == null)
                                    hi = p;
                                else
                                    hiTail.next = p;
                                hiTail = p;
                                ++hc;
                            }
                        }
                        ln = (lc <= UNTREEIFY_THRESHOLD) ? untreeify(lo) :
                            (hc != 0) ? new TreeBin<K,V>(lo) : t;
                        hn = (hc <= UNTREEIFY_THRESHOLD) ? untreeify(hi) :
                            (lc != 0) ? new TreeBin<K,V>(hi) : t;
                        setTabAt(nextTab, i, ln);
                        setTabAt(nextTab, i + n, hn);
                        setTabAt(tab, i, fwd);
                        advance = true;
                    }
                }
            }
        }
    }
}
. . . . .
The expansion process is quite long, and there are blog posts about it all over the internet, many of them clear and simple, so I'm going to be lazy here.
TreeMap
LinkedHashMap
Hashtable
List
ArrayList
LinkedList
Vector
CopyOnWriteArrayList
Set
HashSet
TreeSet
LinkedHashSet
Summary:
There is not much to say about the other collections. I recommend reading the HashMap source code first and then the ConcurrentHashMap source code. The process: skim it roughly first, then read it carefully; whatever you don't understand, find an analysis online and compare; after reading, draw a flow chart or write it up to consolidate. Once you have worked through the source of these two, you will get up to speed on the other collections' source very quickly.
The collections fall into three big categories: Map, List and Set. Each of the big three is further subdivided into Hash, Linked and Tree variants. In essence they are combinations of a few underlying structures, so it pays to understand arrays, linked lists and binary trees first, or to learn them through these collections.
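To see the Hash / Linked / Tree split in practice, here is a small sketch of my own comparing the iteration order of the three Map variants; the List and Set families follow the same naming pattern:

import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.TreeMap;

public class MapOrderDemo {
    public static void main(String[] args) {
        String[] keys = {"banana", "apple", "cherry"};

        Map<String, Integer> hash = new HashMap<>();          // hash table: no guaranteed order
        Map<String, Integer> linked = new LinkedHashMap<>();  // hash table + linked list: insertion order
        Map<String, Integer> tree = new TreeMap<>();          // red-black tree: sorted by key

        for (int i = 0; i < keys.length; i++) {
            hash.put(keys[i], i);
            linked.put(keys[i], i);
            tree.put(keys[i], i);
        }

        System.out.println("HashMap:       " + hash.keySet());
        System.out.println("LinkedHashMap: " + linked.keySet()); // [banana, apple, cherry]
        System.out.println("TreeMap:       " + tree.keySet());   // [apple, banana, cherry]
    }
}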
As I said at the start, this was pieced together from material around the internet ~ the articles below are the better ones I've read, recommended to fellow rookies; the big shots can skip them.
HashMap
Source code analysis: https://blog.csdn.net/ywlmsm1224811/article/details/91388815 Clear and detailed; worth a look for anything you haven't seen yet.
Interview questions: https://blog.csdn.net/v123411739/article/details/106324537 Very well written and quite funny; the author has other articles worth reading too.
ConcurrentHashMap
Source code analysis: https://blog.csdn.net/xingxiupaioxue/article/details/88062163 Good overall.
Expansion part: https://blog.csdn.net/ZOKEKAI/article/details/90051567 The diagrams of the expansion process are excellent; with those available, I didn't embarrass myself by drawing my own.
PS: rough version first; I'll fill in the details when I have time~
Study quietly on the side, then amaze every interviewer!