# Divide and Conquer

# 1. Finding the minimum and maximum elements

Compared with a straightforward scan that checks every element against both the current minimum and the current maximum, the divide-and-conquer method needs fewer element comparisons.

Algorithm idea:

Divide the array into two halves, recursively find the minimum and maximum of each half, then compare the two minima and the two maxima to obtain the overall minimum and maximum.

The pseudo code of the algorithm is as follows:

```
Input:  A[1...n]
Output: minimum x, maximum y

minmax(1, n)

def minmax(low, high):
    if high - low == 1:            # base case: exactly two elements
        if A[low] < A[high]:
            return A[low], A[high]
        else:
            return A[high], A[low]
    else:
        mid = (low + high) / 2
        x1, y1 = minmax(low, mid)
        x2, y2 = minmax(mid + 1, high)
        x = min(x1, x2)
        y = max(y1, y2)
        return x, y
```

Let C(n) be the number of element comparisons, where n is a power of 2. When n = 2, C(n) = 1; when n > 2, C(n) = 2C(n/2) + 2 (two recursive calls, plus one comparison for the minima and one for the maxima). Solving this recurrence gives C(n) = 3n/2 - 2.
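The pseudocode above can be sketched in Python (0-based indexing; the comparison counter and the example array are my additions, used to check that C(n) = 3n/2 - 2):

```python
def minmax(A, low, high, counter):
    """Return (min, max) of A[low..high] inclusive; counter[0] counts element comparisons."""
    if high - low == 1:                # base case: exactly two elements, one comparison
        counter[0] += 1
        if A[low] < A[high]:
            return A[low], A[high]
        return A[high], A[low]
    mid = (low + high) // 2
    x1, y1 = minmax(A, low, mid, counter)
    x2, y2 = minmax(A, mid + 1, high, counter)
    counter[0] += 2                    # one comparison for the minima, one for the maxima
    return min(x1, x2), max(y1, y2)

A = [7, 2, 9, 4, 1, 8, 6, 3]           # n = 8, a power of 2
cnt = [0]
lo, hi = minmax(A, 0, len(A) - 1, cnt)
print(lo, hi, cnt[0])                  # 1 9 10, and 3*8/2 - 2 = 10
```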

# 2. Finding the median and the k-th smallest element

Algorithm idea:

(1) When the input size is below a certain threshold, sort the elements directly and return the answer.

(2) When n is above the threshold, divide the n elements into ⌊n/5⌋ groups of 5; if n is not a multiple of 5, the leftover elements are simply excluded (this does not affect correctness, since at this point we only need the pivot mm). Sort each group and pick out its median. Then call this algorithm recursively on all the group medians to pick out their median mm, the median of medians.

(3) Partition all elements into three groups A1, A2 and A3, containing the elements less than, equal to, and greater than mm respectively.

(4) Determine which of the three groups contains the k-th smallest element, in three cases:

a. If |A1| >= k, the k-th smallest element is in the first group: recursively find the k-th smallest element in A1.

b. If |A1| + |A2| >= k, the pivot mm is the k-th smallest element: return mm.

c. Otherwise, the k-th smallest element is in the third group: recursively find the (k - |A1| - |A2|)-th smallest element in A3.

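Steps (1)–(4) above can be sketched in Python; the function name, the threshold of 5, and the 1-based k are my own choices, not the original code:

```python
def select(A, k):
    """Return the k-th smallest element of A (1-based k), median-of-medians style."""
    if len(A) <= 5:                      # step (1): small input, sort directly
        return sorted(A)[k - 1]
    # Step (2): medians of the full groups of 5 (leftover elements are skipped)
    medians = [sorted(A[i:i + 5])[2] for i in range(0, len(A) - len(A) % 5, 5)]
    mm = select(medians, (len(medians) + 1) // 2)   # median of the medians
    # Step (3): three-way split around mm
    A1 = [x for x in A if x < mm]
    A2 = [x for x in A if x == mm]
    A3 = [x for x in A if x > mm]
    # Step (4): recurse into the group that contains the k-th smallest element
    if k <= len(A1):
        return select(A1, k)                        # case a
    if k <= len(A1) + len(A2):
        return mm                                   # case b
    return select(A3, k - len(A1) - len(A2))        # case c

print(select([9, 1, 8, 2, 7, 3, 6, 4, 5, 0], 4))    # 3
```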

Algorithm analysis: the pivot mm is greater than or equal to roughly 3n/10 of the elements and less than or equal to roughly 3n/10 of the elements, so the recursive call in step (4) works on at most about 7n/10 elements. This gives the recurrence T(n) <= T(n/5) + T(7n/10) + cn, which solves to T(n) = O(n): the k-th smallest element is found in linear time.

# 3. Quicksort

First, the split (partition) procedure is introduced. It rearranges the array so that the elements before the pivot are smaller than it and the elements after it are larger. The pivot is the first element of the unsorted (sub)array.

The pseudo code of the algorithm is as follows:

Version 1 (higher time complexity):

```
Input:  A[low...high]
Output: rearranged A; the new position w of the pivot
        (the first element of the unsorted array)

i = low + 1
while i <= high:
    if A[i] < A[low]:                 # A[i] already belongs to the front part
        i = i + 1
    else:
        j = i                         # find an element smaller than A[low] further back
        while j <= high and A[j] >= A[low]:
            j = j + 1
        if j <= high:
            exchange A[i] and A[j]
            i = i + 1
        else:
            break
insert A[low] at position i - 1       # an insertion, not an exchange
return A and i - 1
```

Version 2 (lower time complexity):

```
Input:  A[low...high]
Output: rearranged A; the new position w of the pivot
        (the first element of the unsorted array)

i = low
for j = low + 1 to high:      # when A[j] >= A[low], j simply moves on
    if A[j] < A[low]:         # when A[j] < A[low], extend the front part
        i = i + 1
        if i != j:
            exchange A[i] and A[j]
exchange A[low] and A[i]
w = i
return A and w
```


Next, the quicksort algorithm itself:

The pseudo code of the algorithm is as follows:

```
Input:  A[1...n]
Output: A in ascending order

quicksort(A[1...n])

def quicksort(A[low...high]):
    if low < high:
        w = split(A[low...high])    # split first
        quicksort(A[low...w-1])
        quicksort(A[w+1...high])
```
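The split (version 2) and quicksort pseudocode translates directly to Python (0-based indexing; the example array is mine):

```python
def split(A, low, high):
    """Partition A[low..high] around the pivot A[low]; return the pivot's new index."""
    i = low
    for j in range(low + 1, high + 1):
        if A[j] < A[low]:          # extend the "smaller than pivot" prefix
            i += 1
            if i != j:
                A[i], A[j] = A[j], A[i]
    A[low], A[i] = A[i], A[low]    # put the pivot between the two parts
    return i

def quicksort(A, low, high):
    """Sort A[low..high] in place."""
    if low < high:
        w = split(A, low, high)    # split first
        quicksort(A, low, w - 1)
        quicksort(A, w + 1, high)

A = [5, 3, 8, 1, 9, 2, 7]
quicksort(A, 0, len(A) - 1)
print(A)                           # [1, 2, 3, 5, 7, 8, 9]
```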

Time complexity (also the number of element comparisons):

Best case: each chosen pivot splits the array exactly in half. The recurrence is T(n) = 2T(n/2) + f(n), where 2T(n/2) covers the two halves and f(n) = n is the cost of partitioning. Expanding the recurrence: T(n) = 2T(n/2) + n = 2^m T(1) + mn, where m = log n.

Therefore T(n) = 2^(log n) T(1) + n log n = nT(1) + n log n = n + n log n. For n >= 2, n log n >= n, so the n log n term dominates.

To sum up, the best-case time complexity of quicksort is O(n log n).

Worst case: the pivot chosen at each step is the smallest/largest element of the subarray. Quicksort then degenerates into something like bubble sort (each pass fixes the position of only one element), and the number of comparisons is T(n) = (n-1) + (n-2) + … + 1 = n(n-1)/2;

To sum up, the time complexity of quicksort in the worst case is O(n^2).
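The worst case can be checked by counting element comparisons on an already-sorted array (first-element pivot with the version-2 split; the counting wrapper is my addition):

```python
def quicksort_count(A, low, high):
    """Sort A[low..high] in place; return the number of element comparisons made."""
    if low >= high:
        return 0
    comparisons = 0
    i = low
    for j in range(low + 1, high + 1):   # version-2 split, counting A[j] < A[low]
        comparisons += 1
        if A[j] < A[low]:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[low], A[i] = A[i], A[low]
    comparisons += quicksort_count(A, low, i - 1)
    comparisons += quicksort_count(A, i + 1, high)
    return comparisons

n = 16
print(quicksort_count(list(range(n)), 0, n - 1))   # 120, i.e. 16*15/2
```

On sorted input every split leaves the pivot at the front, so one recursive call covers n-1 elements and the comparison count is exactly n(n-1)/2.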

Average case: the average time complexity of quicksort is also O(n log n).