Contents
Bubble sort algorithm principle:
Bubble sort Java code implementation:
Time complexity, space complexity, and stability analysis of bubble sort
Selection sort Java code implementation:
Time complexity, space complexity, and stability analysis of selection sort
Java implementation of insertion sort:
Time complexity and space complexity analysis of insertion sort
Java implementation of merge sort:
Time complexity and space complexity analysis of merge sort
Idea of one-way quick sort:
Java implementation of one-way quick sort:
Idea of two-way quick sort:
Java implementation of two-way quick sort:
Three-way quick sort Java code implementation:
Time and space complexity of quick sort
The basic sorting algorithms include bubble sort, selection sort, insertion sort, merge sort, and quick sort. This article explains the central idea and the Java implementation of each of these five algorithms.
Bubble sort
Bubble sort is probably the first sorting algorithm we encounter in computer science courses, and it is the classic entry-level sorting algorithm.
Bubble sort is simple, but it becomes inefficient once n grows large, so it is rarely used in real production code. Let's look at how the algorithm works:
Bubble sort algorithm principle:
- Compare adjacent elements. If the first is larger than the second, swap them.
- Do the same for every pair of adjacent elements, from the first pair at the beginning to the last pair at the end. After this pass, the last element is the largest.
- Repeat the steps above for all elements except the last one.
- Keep repeating on fewer and fewer elements each pass, until no pair of elements remains to be compared.
The comparison process is illustrated in the figure below (image from the Internet; it will be removed on request):
Bubble sort Java code implementation:
```java
/**
 * @param arr array to be sorted
 * @param n   array length, arr.length
 */
private static void bubbleSort(int[] arr, int n) {
    for (int i = 0; i < n - 1; i++) {
        for (int j = 1; j < n - i; j++) {
            if (arr[j - 1] > arr[j]) {
                // Swap the two elements
                int temp = arr[j];
                arr[j] = arr[j - 1];
                arr[j - 1] = temp;
            }
        }
    }
}
```
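A minimal usage sketch (the class name and sample data here are illustrative, not from the original post):

```java
public class BubbleSortDemo {
    private static void bubbleSort(int[] arr, int n) {
        for (int i = 0; i < n - 1; i++) {
            for (int j = 1; j < n - i; j++) {
                if (arr[j - 1] > arr[j]) {
                    int temp = arr[j];
                    arr[j] = arr[j - 1];
                    arr[j - 1] = temp;
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] arr = {5, 3, 8, 1, 2};
        bubbleSort(arr, arr.length);
        // Prints [1, 2, 3, 5, 8]
        System.out.println(java.util.Arrays.toString(arr));
    }
}
```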
Time complexity, space complexity, and stability analysis of bubble sort
For an array of length n, bubble sort performs n(n-1)/2 comparisons. In the worst case, when the array is in reverse order, it also performs n(n-1)/2 swaps.
The average time complexity of bubble sort is therefore O(n²), and its space complexity is O(1).
Consider stability: if two adjacent elements are equal, no swap takes place, so their relative order does not change. If two equal elements are not adjacent, then even after earlier pairwise swaps bring them next to each other, they still will not be swapped with each other, so the relative order of equal elements is unchanged after sorting.
Bubble sort is therefore a stable sorting algorithm. This is the definition of stability:
Stability of a sorting algorithm: equal elements appear in the same relative order after sorting as they did before sorting.
Bubble sort summary:
- The average time complexity of bubble sort is O(n²).
- The space complexity is O(1).
- Bubble sort is a stable sort.
Selection sort
Selection sort is another simple sorting algorithm. It gets its name from how it works: each pass scans the unsorted part of the array, finds the index of its smallest element, and puts that element at the front. Because every pass selects the smallest element from the remaining array, the algorithm is called selection sort.
Idea of selection sort
The idea of selection sort is also simple:
- Find the element with the smallest key in the sequence to be sorted; initially assume the first element is the smallest.
- If the smallest element is not the first element of the unsorted sequence, swap it with the first element.
- From the remaining n-1 elements, find the element with the smallest key again, and repeat steps 1 and 2 until sorting is complete.
Diagram:
Selection sort Java code implementation:
```java
public static void sort(int[] arr) {
    int n = arr.length;
    for (int i = 0; i < n; i++) {
        // Index of the smallest value in the remaining (unsorted) part arr[i..n-1]
        int minIndex = i;
        for (int j = i + 1; j < n; j++) {
            if (arr[j] < arr[minIndex]) {
                minIndex = j;
            }
        }
        swap(arr, i, minIndex);
    }
}

/**
 * Swap two elements by index
 */
private static void swap(int[] arr, int i, int j) {
    int temp = arr[i];
    arr[i] = arr[j];
    arr[j] = temp;
}
```
Time complexity, space complexity, and stability analysis of selection sort
As the Java code above shows, no extra space is allocated apart from the element swaps, so the extra space complexity is O(1).
As for time complexity, selection sort performs n(n-1)/2 comparisons; compared with bubble sort, each pass swaps elements at most once, which is a modest optimization for the machine. Selection sort is nevertheless genuinely slow: even an already ordered array requires all n(n-1)/2 comparisons, so the time complexity is O(n²).
Regardless of how the comparisons turn out, selection sort is an unstable sorting algorithm. Take the sequence 5 8 5 2 9 as an example: in the first pass, the first 5 is swapped with 2, which breaks the relative order of the two 5s in the original sequence.
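A step-by-step sketch of that example, with 5a and 5b marking the two equal elements (the labels are only for illustration):

```java
// Initial sequence:             {5a, 8, 5b, 2, 9}
// Pass 1: the minimum is 2 (index 3); swap it with index 0:
//                               {2, 8, 5b, 5a, 9}
// 5b now precedes 5a: the relative order of the equal elements is already
// broken, even though the remaining passes sort the array correctly.
```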
Selection sort summary:
- The average time complexity of selection sort is O(n²).
- The space complexity of selection sort is O(1).
- Selection sort is an unstable sort.
Insertion sort
Insertion sort is usually introduced with the example of sorting playing cards. We pick up cards one at a time; whenever we pick up a new card, we compare it with the cards already in our hand and insert it at the appropriate position. That is exactly the central idea of direct insertion sort. Take a look at the figure below:
After watching the animation you probably have a rough idea of how insertion sort works, so let's spell the idea out.
Idea of insertion sort
- Start with the first element; it can be considered already sorted.
- Take the next element and scan the sorted sequence from back to front.
- If a sorted element is larger than the new element, move it one position to the right.
- Repeat step 3 until you find a sorted element that is less than or equal to the new element.
- Insert the new element at that position.
- Repeat steps 2 to 5.
Java implementation of insertion sort:
Let's first look at the most basic implementation:
```java
public static void sort(int[] arr) {
    int n = arr.length;
    for (int i = 0; i < n; i++) {
        // The inner loop compares arr[i] with all preceding elements:
        // while the element at j - 1 is greater than the one at j, swap them
        for (int j = i; j > 0 && arr[j - 1] > arr[j]; j--) {
            swap(arr, j - 1, j);
        }
    }
}
```
In the implementation above, every time we look for the position element i belongs in, we pay the cost of swapping the current element with its predecessor. Swapping is more expensive than assignment, since each swap takes three assignments. Think about playing cards again: we don't bubble a new card forward one position at a time; we pull the card out, find its place, and insert it there, shifting the cards after that position back by one. Can we do the same in Java? Certainly:
```java
public static void sort(int[] arr) {
    int n = arr.length;
    for (int i = 0; i < n; i++) {
        // Take out the current card (element) that is not yet sorted
        int e = arr[i];
        // Shift every sorted element larger than e one position to the right
        int j = i;
        for (; j > 0 && arr[j - 1] > e; j--) {
            arr[j] = arr[j - 1];
        }
        // When the loop ends, arr[j - 1] <= e, so index j is where e belongs
        arr[j] = e;
    }
}
```
Time complexity and space complexity analysis of insertion sort
As the code shows, the time and space complexity of insertion sort are no different from selection sort or bubble sort: it is an O(n²) algorithm with O(1) extra space. The pass lengths merely change from n-1, n-2, ..., 1 to 1, 2, 3, ..., n-1, which still sums to n(n-1)/2 comparisons in the worst case.
As for stability, insertion sort, like bubble sort, does not change the relative order of equal elements: when the element being inserted equals a sorted element, it is placed after that element, so equal elements keep their original order. Insertion sort is therefore stable.
One very important property of insertion sort is that its inner comparison terminates early (as soon as arr[j - 1] <= e). This makes the algorithm genuinely useful: for the O(n log n) algorithms below, merge sort and quick sort, switching to insertion sort once n drops below a threshold (the JDK's Arrays.sort() uses 47) is an effective optimization. Early termination is an even bigger advantage on nearly ordered arrays.
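To see the effect of early termination, here is a rough timing sketch (not a rigorous benchmark; the class name, array size, and random seed are arbitrary choices):

```java
import java.util.Random;

public class NearlySortedDemo {
    // The shift-based insertion sort from the section above
    static void insertionSort(int[] arr) {
        for (int i = 1; i < arr.length; i++) {
            int e = arr[i];
            int j = i;
            for (; j > 0 && arr[j - 1] > e; j--) {
                arr[j] = arr[j - 1];
            }
            arr[j] = e;
        }
    }

    public static void main(String[] args) {
        int n = 100_000;
        int[] nearlySorted = new int[n];
        for (int i = 0; i < n; i++) nearlySorted[i] = i;
        // Swap a handful of random positions so the array is "almost ordered"
        Random rnd = new Random(42);
        for (int k = 0; k < 10; k++) {
            int a = rnd.nextInt(n), b = rnd.nextInt(n);
            int t = nearlySorted[a];
            nearlySorted[a] = nearlySorted[b];
            nearlySorted[b] = t;
        }
        long start = System.nanoTime();
        insertionSort(nearlySorted); // inner loop exits almost immediately
        System.out.printf("nearly ordered, n = %d: %.2f ms%n",
                n, (System.nanoTime() - start) / 1e6);
    }
}
```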
Insertion sort summary:
- The average time complexity of insertion sort is O(n²).
- The space complexity of insertion sort is O(1).
- Insertion sort is a stable sort.
- Insertion sort is especially efficient on nearly ordered arrays, and it can be used to optimize the advanced sorting algorithms below.
Merge sort
Next let's look at an O(n log n) sorting algorithm: merge sort. As its name suggests, it sorts by merging:
We keep splitting the array in two, then in four, and so on, until each group holds only one or two elements. This can be understood as a recursive process: sort each smallest group, then merge neighboring sorted groups into larger sorted groups, until all elements are sorted. Again, take a look at the animation below.
Idea of merge sort
Merge sort can be implemented recursively (top-down) or iteratively (bottom-up). Both variants perform the same core merge operation on the smallest units; they differ only in how the array is divided. Let's first describe the basic merge operation:
- Allocate a buffer whose size is the sum of the two sorted sequences; it will hold the merged result.
- Set two pointers, initially at the start of each sorted sequence.
- Compare the elements the two pointers point to, put the smaller one into the merge buffer, and advance that pointer.
- Repeat step 3 until one pointer reaches the end of its sequence.
- Copy all remaining elements of the other sequence directly to the end of the merged sequence.
Suppose we are merging the arr[l...r] part of an array. Following the idea above, we split it into two parts, arr[l...mid] and arr[mid+1...r]. Note that the two parts may differ in length: when a range of odd length is divided, we end up merging parts of length 1 and 2.
Now let's write the code following this idea:
Java implementation of merge sort:
```java
/**
 * Merge the two sorted ranges arr[l..mid] and arr[mid+1..r]
 */
private static void merge(int[] arr, int l, int mid, int r) {
    // Copy the range to be merged, so arr can be overwritten while comparing
    int[] aux = new int[r - l + 1];
    for (int i = l; i <= r; i++) {
        aux[i - l] = arr[i];
    }
    int i = l;       // pointer into the left half
    int j = mid + 1; // pointer into the right half
    for (int k = l; k <= r; k++) {
        if (i > mid) {
            // The left half is exhausted; take from the right half
            arr[k] = aux[j - l];
            j++;
        } else if (j > r) {
            // The right half is exhausted; take from the left half
            arr[k] = aux[i - l];
            i++;
        } else if (aux[i - l] <= aux[j - l]) {
            // Taking the left element when the two are equal preserves stability
            arr[k] = aux[i - l];
            i++;
        } else {
            arr[k] = aux[j - l];
            j++;
        }
    }
}
```
With the animation and the implementation above, you should now understand the merge operation; if it still feels fuzzy, try running the merge process on paper with a small array. This is only the core of the algorithm, so how do we sort the whole array? As mentioned above, there are two methods: recursive division, and iterative traversal (bottom-up). Let's look at the recursive implementation first:
```java
/**
 * @param arr array to be sorted
 * @param l   index of the first element, initially 0
 * @param r   index of the last element, n - 1
 */
private static void mergeSort(int[] arr, int l, int r) {
    if (l >= r) {
        return;
    }
    // Split point (rounds down)
    int mid = (l + r) / 2;
    // Recursively sort the two halves
    mergeSort(arr, l, mid);
    mergeSort(arr, mid + 1, r);
    // If the two sorted halves are already in order, skip the merge
    if (arr[mid] <= arr[mid + 1]) {
        return;
    }
    merge(arr, l, mid, r);
}
```
If the recursive process is not yet clear, the figure below may help (image from the Internet; it will be removed on request):
Note that we merge the left half first: we descend to 8 | 6 at the bottom left of level 3, merge it, then recurse on the right, and finally merge 8 6 2 3 | 1 5 7 4.
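As noted in the insertion sort section, O(n log n) sorts often switch to insertion sort on small subarrays. Here is a minimal sketch of that optimization layered on the mergeSort above; the threshold 15 and the method names are illustrative choices, not from the original post:

```java
// A sketch of the small-subarray optimization; the threshold is illustrative
private static final int INSERTION_THRESHOLD = 15;

private static void mergeSortOptimized(int[] arr, int l, int r) {
    // For tiny ranges, insertion sort's low overhead and early
    // termination beat the cost of further recursion
    if (r - l <= INSERTION_THRESHOLD) {
        insertionSort(arr, l, r);
        return;
    }
    int mid = (l + r) / 2;
    mergeSortOptimized(arr, l, mid);
    mergeSortOptimized(arr, mid + 1, r);
    if (arr[mid] <= arr[mid + 1]) {
        return;
    }
    merge(arr, l, mid, r); // the merge method defined above
}

// Insertion sort restricted to the range arr[l..r]
private static void insertionSort(int[] arr, int l, int r) {
    for (int i = l + 1; i <= r; i++) {
        int e = arr[i];
        int j = i;
        for (; j > l && arr[j - 1] > e; j--) {
            arr[j] = arr[j - 1];
        }
        arr[j] = e;
    }
}
```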
The iterative (bottom-up) implementation divides the array differently from the recursive one. We first treat the array as groups of one element and merge them in pairs; in the second round we treat it as groups of two and merge those in pairs; and so on, until the group size reaches the length of the array. In other words, we sort locally first and gradually expand to a globally sorted array.
```java
/**
 * Bottom-up merge sort
 *
 * @param arr array to be sorted
 * @param n   array length
 */
private static void mergeSortBU(int[] arr, int n) {
    // The outer loop doubles the merge interval: sz = 1, 2, 4, 8, ...
    // sz = 1 : [0] [1] ...
    // sz = 2 : [0,1] [2,3] ...
    // sz = 4 : [0..3] [4..7] ...
    for (int sz = 1; sz <= n; sz += sz) {
        // Merge arr[i..i+sz-1] with arr[i+sz..i+2*sz-1].
        // i advances by two sz each time, because each merge already
        // handles two sz-length parts; i + sz < n guarantees the right
        // half exists.
        for (int i = 0; i + sz < n; i += sz + sz) {
            merge(arr, i, i + sz - 1, Math.min(i + sz + sz - 1, n - 1));
        }
    }
}
```
For example, the first round merges intervals of length sz = 1: the elements at i = 0 and i = 1 are merged, then those at i = 2 and i = 3, and so on, which is why the inner loop advances i by two sz each time. To keep indices in bounds and to guarantee that the right half exists, we require i + sz < n; and because the array length may be odd, the right bound of the right half is Math.min(i + sz + sz - 1, n - 1). Refer to the figure below:
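For concreteness, here is a trace sketch of mergeSortBU on the example array from the recursion figure above:

```java
// arr = {8, 6, 2, 3, 1, 5, 7, 4}, n = 8
// sz = 1: merge adjacent single elements -> {6, 8, 2, 3, 1, 5, 4, 7}
// sz = 2: merge adjacent pairs           -> {2, 3, 6, 8, 1, 4, 5, 7}
// sz = 4: merge the two halves           -> {1, 2, 3, 4, 5, 6, 7, 8}
```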
Time complexity and space complexity analysis of merge sort
The time complexity of merge sort can be derived from a recurrence. Informally: for an array of length n there are log n levels of division, and the merges on each level add up to O(n) work, so the total is O(n log n). Readers who want the full derivation can consult "Merge sort and its time complexity analysis"; we won't repeat it here.
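For reference, the usual recurrence sketch, assuming the array always splits exactly in half:

```latex
T(n) = 2\,T(n/2) + cn = 4\,T(n/4) + 2cn = \dots = 2^k\,T(n/2^k) + k\,cn
```

Setting k = log₂ n, so that n/2^k = 1, gives T(n) = cn + cn·log₂ n = O(n log n).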
As for space complexity, the implementation shows that the merge step allocates a temporary array of length up to n, so the space complexity is O(n).
Furthermore, because the merge takes the left element first whenever aux[i - l] equals aux[j - l], equal elements keep their relative order, so the algorithm is stable.
**Merge sort summary:**
- The average time complexity of merge sort is O(n log n).
- The space complexity of merge sort is O(n).
- Merge sort is a stable sort.
Quick sort
Quick sort is the most widely used sorting algorithm, famous, as its name says, for being fast. Like merge sort, quick sort uses divide and conquer: decompose the original problem into smaller subproblems of the same shape, solve the subproblems recursively, and combine their solutions into a solution of the original problem. We only need to work out how to solve the smallest case and how to recurse to obtain a correct implementation. Quick sort comes in one-way, two-way, and three-way variants, which differ in how many pointers they use to traverse the array. We explain each in turn below.
Idea of one-way quick sort:
First we pick one element of the array and move it to its proper position: every element to the left of that position is smaller than it, and every element to the right is greater than or equal to it.
- Suppose the array is arr[l...r] and the chosen pivot is its first element, int v = arr[l]. Let j mark the last element smaller than v, so that arr[l+1...j] < v and arr[j+1...i) >= v, where i is the index of the element currently being examined (see the figure above).
- If the current element e satisfies e >= v, it already belongs to the right part, so we simply do i++ and examine the next element.
- If e < v, it belongs to the < v part; we swap arr[j+1] and arr[i], then increment j (and i).
- After the last element has been examined, we swap arr[l] and arr[j], putting the pivot in its final position.
- Now arr[l...j-1] < v and arr[j+1...r] >= v hold, so we recursively sort arr[l...j-1] and arr[j+1...r].
Java implementation of one-way quick sort:
```java
private static void quickSort(int[] arr, int l, int r) {
    if (l >= r) {
        return;
    }
    // p is the final position of the pivot after partitioning,
    // i.e. the divide-and-conquer split point
    int p = partition(arr, l, r);
    quickSort(arr, l, p - 1);
    quickSort(arr, p + 1, r);
}

private static int partition(int[] arr, int l, int r) {
    // Swap a random element into the pivot position: for any input, the
    // probability that the chosen pivot is the minimum (or maximum) of the
    // current range is only 1/n, which keeps the recursion tree balanced
    int randomNum = (int) (Math.random() * (r - l + 1) + l);
    swap(arr, l, randomNum);
    int v = arr[l];
    // Invariant: arr[l+1..j] < v, arr[j+1..i) >= v
    int j = l;
    for (int i = l + 1; i <= r; i++) {
        if (arr[i] < v) {
            swap(arr, j + 1, i);
            j++;
        }
    }
    swap(arr, l, j);
    return j;
}

private static void swap(int[] arr, int i, int j) {
    int temp = arr[i];
    arr[i] = arr[j];
    arr[j] = temp;
}
```
Why does the algorithm swap a random element into the pivot position before partitioning? If the array under consideration is already sorted and we always pick the first element, the recursion tree degenerates to depth n, which we want to avoid. If instead we pick a random element in each partition, the probability that it is the smallest element of the current subarray is only 1/n, so consistently drawing a bad pivot is very unlikely.
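A sketch of the degenerate case that the randomization guards against, with a fixed first-element pivot on an already sorted array:

```java
// partition [1, 2, 3, 4, 5], pivot 1 -> left part empty, right part [2, 3, 4, 5]
// partition [2, 3, 4, 5],    pivot 2 -> left part empty, right part [3, 4, 5]
// partition [3, 4, 5],       pivot 3 -> left part empty, right part [4, 5]
// ...
// Each level peels off only one element, so the recursion depth is n and the
// total work is n + (n-1) + ... + 1 = n(n-1)/2 = O(n^2).
```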
Idea of two-way quick sort:
- Like the one-way version, two-way quick sort uses the first element of the array as the pivot (after swapping a random element into that position).
- It uses two pointers: i starts at l+1 and j starts at r, and they move toward the middle of the array. During the traversal we maintain arr[l+1...i) <= v and arr(j...r] >= v; initializing i = l+1 and j = r makes both intervals empty at the start.
- While i <= r and arr[i] <= v, we simply do i++. When arr[i] > v, i stops and waits. Then j scans the array from right to left: while arr[j] >= v we do j--; when arr[j] < v, that element must not stay on the right side, so j stops.
- Once i and j have both stopped, we check whether the loop should end, i.e. whether i > j.
- If not, we swap the elements at positions i and j, then do i++ and j-- and continue the loop.
- The traversal ends when i > j. At that point arr[j] is the last element less than v and arr[i] is the first element greater than or equal to v, so j is the final position of the pivot; after the loop we swap arr[l] and arr[j].
Java implementation of two-way quick sort:
```java
private static void quickSort(int[] arr, int l, int r) {
    if (l >= r) {
        return;
    }
    // p is the final position of the pivot after partitioning
    int p = partition(arr, l, r);
    quickSort(arr, l, p - 1);
    quickSort(arr, p + 1, r);
}

private static int partition(int[] arr, int l, int r) {
    // Swap a random element into the pivot position to reduce the chance
    // of a degenerate recursion tree on nearly sorted input
    int randomNum = (int) (Math.random() * (r - l + 1) + l);
    swap(arr, l, randomNum);
    int v = arr[l];
    int i = l + 1;
    int j = r;
    while (true) {
        while (i <= r && arr[i] <= v) i++;
        while (j >= l + 1 && arr[j] >= v) j--;
        if (i > j) break;
        swap(arr, i, j);
        i++;
        j--;
    }
    // When the loop breaks (i > j), j is the position of the last element
    // smaller than v, so that is where the pivot belongs
    swap(arr, l, j);
    return j;
}
```
Two-way quick sort is the most frequently used quick sort variant. In the JDK, Arrays.sort() for primitive types is implemented with a quick sort variant (dual-pivot quicksort), while Collections.sort() uses a merge-sort-based algorithm (TimSort).
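A minimal sketch of the two library entry points (as of modern JDKs; the class name and sample data are arbitrary):

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class LibrarySortDemo {
    public static void main(String[] args) {
        int[] primitives = {5, 3, 4, 3, 8};
        Arrays.sort(primitives); // dual-pivot quicksort for primitive types
        System.out.println(Arrays.toString(primitives)); // [3, 3, 4, 5, 8]

        List<Integer> boxed = Arrays.asList(5, 3, 4, 3, 8);
        Collections.sort(boxed); // TimSort, a merge-sort-based stable sort
        System.out.println(boxed); // [3, 3, 4, 5, 8]
    }
}
```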
Three-way quick sort
In both algorithms above, elements equal to the pivot still go through redundant swaps. If we instead divide the array into three parts, < v, == v, and > v, efficiency can improve, as shown in the figure below:
- We divide the array into three parts: arr[l+1...lt] < v, arr[lt+1...i) == v, and arr[gt...r] > v, where lt is the index of the last element of the < v part, gt is the index of the first element of the > v part, and i is the current element being examined.
- Choosing the initial values int lt = l; int gt = r + 1; int i = l + 1; makes all three parts empty at the start.
- When e > v, we swap arr[i] with arr[gt-1] and grow the > v part by one element (gt--). The pointer i does not move, because the element swapped into position i has not been examined yet.
- When e == v, we just do i++ and examine the next element.
- When e < v, we swap arr[i] with arr[lt+1], then increment lt and i.
- When the loop ends, lt sits on the last element less than v, so finally we swap arr[l] and arr[lt]; after the swap, lt is the pivot's position.
- Finally, recursively sorting arr[l...lt-1] and arr[gt...r] yields the correct result.
As shown in the figure below:
Three-way quick sort Java code implementation:
```java
private static void quickSort3(int[] arr, int length) {
    quickSort(arr, 0, length - 1);
}

private static void quickSort(int[] arr, int l, int r) {
    if (l >= r) {
        return;
    }
    // Swap a random element into the pivot position to reduce the chance
    // of a degenerate recursion tree on nearly sorted input
    int randomNum = (int) (Math.random() * (r - l + 1) + l);
    swap(arr, l, randomNum);
    int v = arr[l];
    // Three-way partition: divide the array into three sections,
    // arr[l+1..lt] < v, arr[lt+1..i) == v, arr[gt..r] > v.
    // The initial values below make all three sections empty.
    int lt = l;
    int gt = r + 1;
    int i = l + 1;
    while (i < gt) {
        if (arr[i] < v) {
            swap(arr, i, lt + 1);
            i++;
            lt++;
        } else if (arr[i] == v) {
            i++;
        } else {
            swap(arr, i, gt - 1);
            gt--;
            // Note: i is NOT incremented here, because the element just
            // swapped into position i has not been examined yet
        }
    }
    // When the loop ends, i has met gt. The pivot's final position is lt:
    // swapping arr[l] and arr[lt] leaves arr[l..lt-1] < v and arr[lt..gt-1] == v
    swap(arr, l, lt);
    quickSort(arr, l, lt - 1);
    quickSort(arr, gt, r);
}
```
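A minimal usage sketch on input with many duplicate keys, where the == v section pays off. It assumes the quickSort3 and swap methods above live in the same class; the class name and data are arbitrary:

```java
import java.util.Random;

public class ThreeWayDemo {
    public static void main(String[] args) {
        // One million elements drawn from only ten distinct values:
        // each partition swallows a whole run of equal keys at once
        Random rnd = new Random(42);
        int[] arr = new int[1_000_000];
        for (int i = 0; i < arr.length; i++) {
            arr[i] = rnd.nextInt(10);
        }
        quickSort3(arr, arr.length);
        System.out.println("sorted: " + isSorted(arr));
    }

    private static boolean isSorted(int[] arr) {
        for (int i = 1; i < arr.length; i++) {
            if (arr[i - 1] > arr[i]) return false;
        }
        return true;
    }
}
```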
Time and space complexity of quick sort
Since the two-way version is the one we use most often, let's analyze that one. For convenience, assume the pivot is not chosen randomly but is simply the first element of the array. When the pivot is swapped with the element at the partition position, it can disturb the relative order of equal elements in front of it.
For example, take the sequence 5 3 4 3 8 9 10 11.
Partitioning swaps the pivot 5 with the second 3, moving that 3 in front of the first 3 and breaking the relative order of the two 3s. Quick sort is therefore an unstable sorting algorithm, and the instability arises exactly where the pivot and arr[partition] are swapped.
The running time of quick sort depends on the depth of the recursion, which in turn depends on the pivot chosen at each step. In the best case, the pivot lands in the middle of the subarray every time, and the optimal time complexity is O(n log n). The worst case is the already sorted array analyzed above (with a non-random pivot), where each level needs n comparisons, giving O(n²). In the average case, the time complexity is O(n log n). For a detailed derivation, see "Quicksort best, worst, and average complexity analysis".
The space complexity of quick sort comes mainly from the recursion stack, so it is tied to the recursion depth: O(log n) on average, and O(n) in the worst case.
Summary
This article has summarized the implementations of the common sorting algorithms. Studying their ideas also sharpens general problem-solving skills for algorithm questions. These algorithms are worth mastering, but since Android work usually involves little heavy data processing, they should be reviewed regularly. Most of the pictures in this article come from the Internet; if there is any problem with them, or with the technical content, please contact me.
Author: like a dog
Link: https://juejin.cn/post/6844903568273571853
Source: Juejin
The copyright belongs to the author. For commercial reprint, please contact the author for authorization. For non-commercial reprint, please indicate the source.
Reference links: "Several common sorting algorithms"; "Stability and time complexity analysis of common sorting algorithms" (reposted, with changes); Bobo's data structure course on imooc.com.