## 1. Bubble sort

Bubble sort compares each pair of adjacent elements and swaps them if they are out of order. After each pass, the largest remaining element "bubbles up" to its final position at the right end.

```java
public void bubbleSort(int[] arr) {
    for (int i = 0; i < arr.length; i++) {
        // After each pass the largest element of the unsorted part is already
        // in place, so the inner loop can stop i elements earlier
        for (int j = 0; j < arr.length - i - 1; j++) {
            if (arr[j] > arr[j + 1])
                swap(arr, j + 1, j);
        }
    }
}
```
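Every listing in this article calls a `swap` helper that is never shown; a minimal version (the class name `SortUtil` here is just for illustration) would be:

```java
public class SortUtil {
    // Exchange the elements at indices i and j
    static void swap(int[] arr, int i, int j) {
        int t = arr[i];
        arr[i] = arr[j];
        arr[j] = t;
    }

    public static void main(String[] args) {
        int[] a = {1, 2};
        swap(a, 0, 1);
        System.out.println(a[0] + "," + a[1]);  // prints "2,1"
    }
}
```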

There are two nested loops: the outer loop runs n times and the inner loop runs about n times on average, so the time complexity is O(n^2).

Because equal elements are never swapped past each other, their relative order is preserved, so bubble sort is stable.

### 1.2 Improving bubble sort

Optimization 1: by the last iteration of the outer loop only one element remains, so the outer loop can run one time less (`i < arr.length - 1`).

Optimization 2: keep a flag. If the array is already sorted to begin with, or becomes sorted after a few outer passes, a whole pass will complete without any swap, and we can return immediately instead of continuing the comparisons.

```java
public void bubbleSortImprove(int[] arr) {
    boolean flag;
    for (int i = 0; i < arr.length - 1; i++) {
        flag = true;
        for (int j = 0; j < arr.length - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {
                swap(arr, j + 1, j);
                // A swap happened, so the array may still be unordered
                flag = false;
            }
        }
        // If the flag never changed, no swap occurred and the array is sorted
        if (flag)
            return;
    }
}
```

## 2. Selection sort

Selection sort repeatedly scans the unsorted part of the array, selects the smallest element, and places it at the leftmost unsorted position.

```java
public void selectSort(int[] arr) {
    for (int i = 0; i < arr.length - 1; i++) {
        int min = i;
        for (int j = i + 1; j < arr.length; j++) {
            if (arr[j] < arr[min])
                min = j;
        }
        swap(arr, i, min);
    }
}
```

The time complexity of this algorithm is also O(n^2).

Selection sort is unstable. For example, sorting A80, B80, C70: after the first pass the array becomes C70, B80, A80, and the two 80s have changed their relative order, which the remaining passes never restore. Maintaining stability would require shifting elements instead of swapping, and the extra moves would further hurt performance.
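The A80/B80/C70 example can be reproduced with a small sketch: each record is a hypothetical `{key, label}` pair (A=0, B=1, C=2), sorted by key with the same selection-sort swapping as above.

```java
import java.util.Arrays;

public class StabilityDemo {
    // Selection sort on {key, label} records, swapping exactly as selectSort does
    static void selectSortByKey(int[][] recs) {
        for (int i = 0; i < recs.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < recs.length; j++) {
                if (recs[j][0] < recs[min][0])
                    min = j;
            }
            int[] t = recs[i]; recs[i] = recs[min]; recs[min] = t;
        }
    }

    public static void main(String[] args) {
        int[][] recs = { {80, 0}, {80, 1}, {70, 2} };  // A80, B80, C70
        selectSortByKey(recs);
        // C70 now comes first, and B80 precedes A80: the equal keys flipped
        System.out.println(Arrays.deepToString(recs));
    }
}
```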

### 2.2 Improving selection sort

Each pass can select both the minimum (placed at the far left) and the maximum (placed at the far right) at the same time. The two boundaries close in toward the middle, which halves the number of passes.

```java
public void selectSortImprove(int[] arr) {
    int left = 0;
    int right = arr.length - 1;
    while (left < right) {
        int min = left;
        int max = left;
        for (int i = left + 1; i <= right; i++) {
            if (arr[i] < arr[min]) min = i;
            if (arr[i] > arr[max]) max = i;
        }
        swap(arr, min, left);
        // If the maximum was at 'left', it has just been moved to index 'min'
        if (max == left)
            max = min;
        swap(arr, right, max);
        left++;
        right--;
    }
}
```

## 3. Insertion sort

Insertion sort takes elements from the unsorted interval one at a time and inserts each into its place in the sorted interval. For an array of length n, the initial sorted interval is [0, 1) (just the first element) and the unsorted interval is [1, n).

Insertion sort is very efficient on data that is already nearly sorted!

```java
public void insertSort(int[] arr) {
    // Sorted interval is [0, i), unsorted interval is [i, arr.length)
    for (int i = 1; i < arr.length; i++) {
        // Move the first unsorted element left while it is smaller
        // than its left neighbor
        for (int j = i; j > 0 && arr[j] < arr[j - 1]; j--) {
            swap(arr, j, j - 1);
        }
    }
}
```

When the input is already sorted, the inner loop exits immediately and the running time is O(n); this best case is rare, though, so in practice the running time lies between O(n) and O(n^2).
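The gap between the two cases can be made concrete by counting swaps. This is a hypothetical driver (class and method names are illustrative) that runs the insertion sort above on a sorted and a reversed 5-element array:

```java
public class InsertSortCost {
    static long swaps;

    // Same insertion sort as in the text, with the swap inlined and counted
    static void insertSort(int[] arr) {
        for (int i = 1; i < arr.length; i++) {
            for (int j = i; j > 0 && arr[j] < arr[j - 1]; j--) {
                int t = arr[j]; arr[j] = arr[j - 1]; arr[j - 1] = t;
                swaps++;
            }
        }
    }

    static long countSwaps(int[] arr) {
        swaps = 0;
        insertSort(arr);
        return swaps;
    }

    public static void main(String[] args) {
        long sorted = countSwaps(new int[]{1, 2, 3, 4, 5});   // best case: 0 swaps
        long reversed = countSwaps(new int[]{5, 4, 3, 2, 1}); // worst case: n*(n-1)/2 = 10
        System.out.println(sorted + " vs " + reversed);       // prints "0 vs 10"
    }
}
```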

Because elements shift left one step at a time and are never swapped past an equal element, the relative order of equal elements is preserved, so insertion sort is stable.

### 3.2 Improving insertion sort

The shifting loop above is really just searching for the position where the element belongs. The number of element moves cannot be reduced, but we can use binary search to find the insertion point faster.

```java
public void insertSortImprove(int[] arr) {
    for (int i = 1; i < arr.length; i++) {
        int left = 0;
        int right = i;
        int val = arr[i];
        // Binary search for the first position holding an element greater than val
        while (left < right) {
            int mid = left + ((right - left) >> 1);
            if (val >= arr[mid]) {
                left = mid + 1;
            } else {
                // arr[mid] may be the answer, so keep it: right = mid, not mid - 1
                right = mid;
            }
        }
        // Shift [left, i) one step to the right and drop val into place
        for (int j = i; j > left; j--) {
            arr[j] = arr[j - 1];
        }
        arr[left] = val;
    }
}
```

## 4. Shell sort

Shell sort is essentially a grouped insertion sort. First pick an increment d1 smaller than n and group the records: all elements whose indices differ by a multiple of d1 fall into the same group, and each group is sorted by straight insertion. Then repeat with a second, smaller increment, and so on, until a final pass with increment 1 completes the sort. The increments are usually obtained by repeatedly dividing the array length n by 2 or 3.

```java
public void shellSort(int[] arr) {
    int gap = arr.length >> 1;
    while (gap > 1) {
        insertSortGap(arr, gap);
        gap /= 2;
    }
    insertSortGap(arr, 1);
}

// Insertion sort over the subsequences with stride 'gap'
private void insertSortGap(int[] arr, int gap) {
    for (int i = gap; i < arr.length; i++) {
        for (int j = i; j - gap >= 0 && arr[j] < arr[j - gap]; j -= gap) {
            swap(arr, j, j - gap);
        }
    }
}
```

The average time complexity of Shell sort depends on the increment sequence; for common sequences it is around O(n^1.3), and the worst case is O(n^2).

Because the separate gapped insertion passes can move equal elements past each other, Shell sort is unstable.

## 5. Heap sort

Heap sort first builds a max-heap, then swaps the maximum element (the first element of the array) with the last element, fixing the last position. After re-adjusting the heap, it swaps the new maximum with the last element of the unsorted interval, and repeats until the whole array is sorted.

```java
public void heapSort(int[] arr) {
    // Build the max-heap bottom-up, starting from the last parent node.
    // The heap must cover the whole array, so the bound is arr.length,
    // not arr.length - 1 (otherwise the last element is never sifted in).
    for (int i = (arr.length - 2) / 2; i >= 0; i--) {
        shiftDown(arr, i, arr.length);
    }
    // Repeatedly move the maximum to the end of the unsorted interval
    for (int i = arr.length - 1; i > 0; i--) {
        swap(arr, 0, i);
        shiftDown(arr, 0, i);
    }
}

// Sift arr[i] down inside the heap arr[0, n)
private void shiftDown(int[] arr, int i, int n) {
    while (i * 2 + 1 < n) {
        int j = i * 2 + 1;
        // Pick the larger child
        if (j + 1 < n && arr[j] < arr[j + 1]) {
            j++;
        }
        if (arr[i] >= arr[j]) {
            break;  // heap property already holds
        }
        swap(arr, i, j);
        i = j;
    }
}
```

One sift-down costs O(log n), and it is performed about n times, so the time complexity of heap sort is O(n log n).

Heap sort is unstable because the repeated heap adjustments can change the relative order of equal elements.

## 6. Merge sort

Merge sort is a classic application of divide and conquer: the array is recursively split into two halves, each half is sorted, and the two sorted halves are merged.

```java
private void mergeSortInternal(int[] arr, int l, int r) {
    if (r - l < 1)
        return;
    int mid = l + ((r - l) >> 1);
    mergeSortInternal(arr, l, mid);
    mergeSortInternal(arr, mid + 1, r);
    merge(arr, l, mid, r);
}

// Merge the sorted ranges arr[l..mid] and arr[mid+1..r]
private void merge(int[] arr, int l, int mid, int r) {
    int[] nums = new int[r - l + 1];
    for (int j = l; j <= r; j++) {
        nums[j - l] = arr[j];
    }
    int m = l;        // cursor into the left half
    int n = mid + 1;  // cursor into the right half
    for (int j = l; j <= r; j++) {
        if (m > mid) {                           // left half exhausted
            arr[j] = nums[n - l];
            n++;
        } else if (n > r) {                      // right half exhausted
            arr[j] = nums[m - l];
            m++;
        } else if (nums[m - l] <= nums[n - l]) { // on ties take the left: stable
            arr[j] = nums[m - l];
            m++;
        } else {
            arr[j] = nums[n - l];
            n++;
        }
    }
}
```

The merge step traverses the whole range, which is O(n); the recursive splitting forms a binary tree of depth O(log n), so the total time complexity is O(n log n). (Because of the auxiliary array, the space complexity is O(n).)

In the merge step, when the two candidate elements are equal the one from the left half is taken first, which preserves relative order, so merge sort is stable.

### 6.1 Optimizing merge sort

Optimization 1: when the range has fewer than 15 elements, switch to insertion sort.

Optimization 2: the merge is needed only when the last element of the left half is greater than the first element of the right half; otherwise the range is already in order.

```java
public void mergeSort(int[] arr) {
    mergeSortInternal(arr, 0, arr.length - 1);
}

private void mergeSortInternal(int[] arr, int l, int r) {
    if (r - l < 15) {
        insertSortSetion(arr, l, r);
        return;
    }
    int mid = l + ((r - l) >> 1);
    mergeSortInternal(arr, l, mid);
    mergeSortInternal(arr, mid + 1, r);
    // Merge only when the two halves are out of order relative to each other
    if (arr[mid] > arr[mid + 1])
        merge(arr, l, mid, r);
}

// Insertion sort restricted to arr[l..r]
private void insertSortSetion(int[] arr, int l, int r) {
    for (int i = l + 1; i <= r; i++) {
        for (int j = i; j > l && arr[j] < arr[j - 1]; j--) {
            swap(arr, j, j - 1);
        }
    }
}

// Same merge as in the basic version
private void merge(int[] arr, int l, int mid, int r) {
    int[] nums = new int[r - l + 1];
    for (int j = l; j <= r; j++) {
        nums[j - l] = arr[j];
    }
    int m = l;
    int n = mid + 1;
    for (int j = l; j <= r; j++) {
        if (m > mid) {
            arr[j] = nums[n - l];
            n++;
        } else if (n > r) {
            arr[j] = nums[m - l];
            m++;
        } else if (nums[m - l] <= nums[n - l]) {
            arr[j] = nums[m - l];
            m++;
        } else {
            arr[j] = nums[n - l];
            n++;
        }
    }
}
```

### 6.2 Non-recursive merge sort

```java
public void mergeSortNoRecursion(int[] arr) {
    // i is the size of the already-sorted subranges being merged
    for (int i = 1; i < arr.length; i *= 2) {
        for (int j = 0; j + i < arr.length; j += 2 * i) {
            // The right subrange may run past the end of the array,
            // so its upper bound is clamped to the last index
            merge(arr, j, j + i - 1, Math.min(arr.length - 1, j + 2 * i - 1));
        }
    }
}
```

## 7. Quicksort

Quicksort selects a pivot value v, then partitions the array so that every value less than v ends up to the left of v and every value greater than v to the right. Recursively partitioning the two sides sorts the whole array.

```java
public void quickSort(int[] arr) {
    quickSortInternal(arr, 0, arr.length - 1);
}

private void quickSortInternal(int[] arr, int l, int r) {
    if (r - l < 15) {
        insertSortSetion(arr, l, r);
        return;
    }
    int p = partation(arr, l, r);
    quickSortInternal(arr, l, p - 1);
    quickSortInternal(arr, p + 1, r);
}

private int partation(int[] arr, int l, int r) {
    int v = arr[l];  // pivot
    int j = l;       // invariant: arr[l+1..j] < v
    for (int i = l + 1; i <= r; i++) {
        if (arr[i] < v) {
            swap(arr, i, j + 1);
            j++;
        }
    }
    swap(arr, l, j); // put the pivot between the two parts
    return j;
}
```

The average time complexity of quicksort is O(n log n).

Because partitioning can move equal elements past each other, quicksort is unstable.

### 7.1 Optimizing quicksort

Method 1: always taking the first element as the pivot makes the partitions degenerate on already (nearly) sorted input, so the pivot can be chosen at random instead.

Method 2: when the range has fewer than 15 elements, switching to insertion sort improves performance.

However, in the extreme case where all elements (or a large fraction of them) are equal, quicksort still degenerates to O(n^2). That calls for two-way or three-way quicksort.
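Method 1 can be sketched as follows: the partition is the same Lomuto-style scheme as above, except that a random element is first swapped into position l. The class and method names (`RandomPivotQuickSort`, `partitionRandom`) are illustrative, and the insertion-sort cutoff is omitted for brevity.

```java
import java.util.Random;

public class RandomPivotQuickSort {
    private static final Random RND = new Random();

    static void swap(int[] arr, int i, int j) {
        int t = arr[i]; arr[i] = arr[j]; arr[j] = t;
    }

    // Same partition scheme as in the text, but the pivot is a random
    // element of arr[l..r] moved to position l before scanning
    static int partitionRandom(int[] arr, int l, int r) {
        swap(arr, l, l + RND.nextInt(r - l + 1));
        int v = arr[l];
        int j = l;  // invariant: arr[l+1..j] < v
        for (int i = l + 1; i <= r; i++) {
            if (arr[i] < v) {
                j++;
                swap(arr, i, j);
            }
        }
        swap(arr, l, j);
        return j;
    }

    static void sort(int[] arr, int l, int r) {
        if (l >= r) return;
        int p = partitionRandom(arr, l, r);
        sort(arr, l, p - 1);
        sort(arr, p + 1, r);
    }

    public static void main(String[] args) {
        int[] a = {3, 1, 4, 1, 5, 9, 2, 6};
        sort(a, 0, a.length - 1);
        System.out.println(java.util.Arrays.toString(a));
    }
}
```

With a random pivot the expected running time is O(n log n) regardless of how the input is ordered, because no fixed input can force bad partitions on every recursive call.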

### 7.2 Two-way quicksort

Building on plain quicksort, two-way quicksort scans from both ends of the array toward the middle: elements less than or equal to v stay on the left and elements greater than or equal to v stay on the right, so elements equal to v get distributed evenly over both sides. This avoids the one-sided partitions that plain quicksort produces when many elements are equal.

```java
public void quickSort2(int[] arr) {
    quickSortInternal2(arr, 0, arr.length - 1);
}

private void quickSortInternal2(int[] arr, int l, int r) {
    if (r - l < 15) {
        insertSortSetion(arr, l, r);
        return;
    }
    int p = partation2(arr, l, r);
    quickSortInternal2(arr, l, p - 1);
    quickSortInternal2(arr, p + 1, r);
}

private int partation2(int[] arr, int l, int r) {
    int v = arr[l];
    int i = l + 1;
    int j = r;
    while (true) {
        // Both scans stop on elements equal to v, so equal elements
        // get swapped to both sides evenly
        while (i <= r && arr[i] < v) {
            i++;
        }
        while (j >= l + 1 && arr[j] > v) {
            j--;
        }
        if (i > j) {
            break;
        }
        swap(arr, i, j);
        i++;
        j--;
    }
    swap(arr, l, j);
    return j;
}
```

### 7.3 Three-way quicksort

Three-way quicksort fixes the entire interval of elements equal to v in its final position in a single pass; only the elements less than v on the left and the elements greater than v on the right still need recursive sorting, which improves efficiency further when there are many duplicates.

```java
public void quickSort3(int[] arr) {
    quickSortInternal3(arr, 0, arr.length - 1);
}

private void quickSortInternal3(int[] arr, int l, int r) {
    if (r - l < 15) {
        insertSortSetion(arr, l, r);
        return;
    }
    int v = arr[l];
    int i = l;        // arr[l+1..i] < v
    int mid = l + 1;  // arr[i+1..mid-1] == v; arr[mid..j-1] is unexamined
    int j = r + 1;    // arr[j..r] > v
    while (mid < j) {
        if (arr[mid] < v) {
            swap(arr, mid, i + 1);
            i++;
            mid++;
        } else if (arr[mid] > v) {
            swap(arr, mid, j - 1);
            j--;
        } else {
            mid++;  // equal to v: leave it in the middle band
        }
    }
    swap(arr, l, i);  // move the pivot to the end of the "< v" part
    quickSortInternal3(arr, l, i - 1);
    quickSortInternal3(arr, j, r);
}
```