leetcode lecture on algorithm interview for large factories 2. Time and space complexity


Video tutorial (efficient learning): Click to learn

Contents:

1. Introduction

2. Time and space complexity

3. Dynamic programming

4. Greedy algorithms

5. Binary search

6. Depth-first & breadth-first search

7. Two pointers

8. Sliding window

9. Bit manipulation

10. Recursion & divide and conquer

11. Pruning & backtracking

12. Heap

13. Monotonic stack

14. Sorting algorithms

15. Linked list

16. Set & Map

17. Stack

18. Queue

19. Array

20. String

21. Trees

22. Trie (dictionary tree)

23. Union-find

24. Other question types

What is time complexity

Time complexity is a function that qualitatively describes an algorithm's running time. In software development it lets developers estimate how long a program will take to run. We usually represent the time an algorithm consumes by the number of basic operations it performs, assuming by default that each operation costs the CPU the same amount of time. If the problem size is n and the number of operations is the function f(n), then as n grows, the growth rate of the algorithm's running time tracks the growth rate of f(n). This relationship is called the asymptotic time complexity of the algorithm, time complexity for short, written O(f(n)), where n is the size of the input.

What is big O

Big O denotes an upper bound on the algorithm's execution time; it can be understood as the running time in the worst case. Both the amount of data and its order have a great impact on execution time, so here we assume the input is the one on which the algorithm runs longest.

We all say that the time complexity of insertion sort is O(n^2), yet its running time depends heavily on the input: on already-sorted data insertion sort runs in O(n), while on completely reversed data it takes O(n^2). Since the worst case is O(n^2), we say the time complexity of insertion sort is O(n^2).
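As a minimal sketch (illustrative, not from the lecture), insertion sort makes the best/worst-case gap concrete: the inner shifting loop never runs on sorted input and runs maximally on reversed input.

```javascript
// Insertion sort: O(n) best case (already sorted), O(n^2) worst case (reversed).
function insertionSort(arr) {
  for (let i = 1; i < arr.length; i++) {
    const cur = arr[i];
    let j = i - 1;
    // Shift larger elements one slot to the right. On sorted input this
    // loop body never executes, so the whole sort is a single O(n) pass.
    while (j >= 0 && arr[j] > cur) {
      arr[j + 1] = arr[j];
      j--;
    }
    arr[j + 1] = cur;
  }
  return arr;
}
```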

Quicksort is said to be O(nlogn), but its worst-case time complexity is O(n^2) and its average case is O(nlogn). Strictly by the definition of big O, quicksort's time complexity should therefore be stated as O(n^2); nevertheless, saying that quicksort is O(nlogn) is the accepted convention in the industry.
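A simple non-in-place quicksort sketch (illustrative only) shows where the two bounds come from: each recursion level does O(n) partitioning work, and the number of levels is about log n for balanced splits but n for maximally unbalanced ones (e.g. sorted input with a last-element pivot).

```javascript
// Quicksort: O(n log n) on average, O(n^2) in the worst case.
function quickSort(arr) {
  if (arr.length <= 1) return arr;
  const pivot = arr[arr.length - 1]; // last element as pivot
  const left = [], right = [];
  // Partition: O(n) work at each recursion level.
  for (let i = 0; i < arr.length - 1; i++) {
    (arr[i] < pivot ? left : right).push(arr[i]);
  }
  return [...quickSort(left), pivot, ...quickSort(right)];
}
```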

The time complexity of binary search is O(logn). Each step halves the data size until it shrinks to 1, so the number of steps is the power to which 2 must be raised to reach n, i.e. about logn halvings.
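The halving argument can be checked directly with a small helper (illustrative), which counts how many times n is divided by 2 before reaching 1:

```javascript
// Count how many times n can be halved before reaching 1:
// this equals floor(log2(n)), the number of steps binary search needs.
function halvings(n) {
  let count = 0;
  while (n > 1) {
    n = Math.floor(n / 2);
    count++;
  }
  return count;
}
```

For example, halvings(8) returns 3 and halvings(1024) returns 10, matching log2.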

The time complexity of merge sort is O(nlogn): the top-down splitting divides the data from size n down to 1 across O(logn) levels, and merging back up costs O(n) per level, so the overall time complexity is O(nlogn).
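A merge sort sketch (illustrative) makes the two factors visible: the recursion depth contributes the log n, and the per-level merge contributes the n.

```javascript
// Merge sort: log n levels of splitting, O(n) merging per level => O(n log n).
function mergeSort(arr) {
  if (arr.length <= 1) return arr;
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));
  // Merge the two sorted halves in O(n).
  const merged = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}
```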

Traversing a tree is generally O(n), where n is the number of nodes, and selection sort is O(n^2). We will analyze the complexity of each data structure and algorithm step by step in the corresponding chapters. For more on analyzing and deriving time complexities, see the master theorem.
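For comparison, a selection sort sketch (illustrative): the nested minimum scan performs n + (n-1) + ... + 1, about n^2/2 comparisons on any input, hence O(n^2).

```javascript
// Selection sort: O(n^2) comparisons regardless of input order.
function selectionSort(arr) {
  for (let i = 0; i < arr.length - 1; i++) {
    let minIdx = i;
    // Scan the unsorted tail for the minimum element.
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[j] < arr[minIdx]) minIdx = j;
    }
    [arr[i], arr[minIdx]] = [arr[minIdx], arr[i]]; // swap into place
  }
  return arr;
}
```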

Some rules for analyzing complexity

  • When adding time complexities that all depend on n, keep only the dominant term, for example: O(nlogn + n) = O(nlogn), O(nlogn + n^2) = O(n^2).
  • When adding time complexities over independent variables, no term can be dropped, for example O(AlogA + B) and O(AlogA + B^2) cannot be simplified further.
  • For two loops executed one after the other, take the higher complexity; for nested loops, multiply the complexities.
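A quick operation-counting sketch (helper names are illustrative) confirms the add-versus-multiply rules for sequential and nested loops:

```javascript
// Sequential loops add their costs: O(n) + O(n) is still O(n).
function sequentialOps(n) {
  let ops = 0;
  for (let i = 0; i < n; i++) ops++; // O(n)
  for (let i = 0; i < n; i++) ops++; // O(n)
  return ops; // 2n => O(n)
}

// Nested loops multiply their costs: O(n) * O(n) = O(n^2).
function nestedOps(n) {
  let ops = 0;
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) ops++;
  }
  return ops; // n^2 => O(n^2)
}
```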

Common time complexity:

  • O(1): constant complexity
let n = 100;
  • O(logn): logarithmic complexity
// Binary search, non-recursive
var search = function (nums, target) {
  let left = 0,
    right = nums.length - 1;
  while (left <= right) {
    let mid = Math.floor((left + right) / 2);
    if (nums[mid] === target) {
      return mid;
    } else if (target < nums[mid]) {
      right = mid - 1;
    } else {
      left = mid + 1;
    }
  }
  return -1;
};
  • O(n): linear time complexity
for (let i = 1; i <= n; i++) {
  console.log(i);
}
  • O(n^2): Square
for (let i = 1; i <= n; i++) {
  for (let j = 1; j <= n; j++) {
    console.log(i);
  }
}

for (let i = 1; i <= n; i++) {
  for (let j = 1; j <= 30; j++) { // the inner bound is independent of n, so this is O(n), not O(n^2)
    console.log(i);
  }
}
  • O(2^n): exponential complexity
for (let i = 1; i <= Math.pow(2, n); i++) {
  console.log(i);
}
  • O(n!): factorial
function factorial(n) {
  return n <= 1 ? 1 : n * factorial(n - 1);
}
for (let i = 1; i <= factorial(n); i++) {
  console.log(i);
}

Time complexity of basic operations of common data structures

Recursive time complexity

The time complexity of recursion depends on the number of recursive calls made and the work done in each call.

// Recurses n levels with O(1) work per call: time complexity O(n)
function sum2(n) {
  if (n === 0) {
    return 0;
  }
  return n + sum2(n - 1);
}

// Binary search recurses log n levels: O(log n)
var search = function (nums, target) {
  return search_interval(nums, target, 0, nums.length - 1);
};

function search_interval(nums, target, left, right) {
  if (left > right) {
    return -1;
  }
  let mid = left + Math.floor((right - left) / 2);
  if (nums[mid] === target) { // compare the target with the middle element
    return mid;
  } else if (nums[mid] < target) { // recurse into the half that may contain the target
    return search_interval(nums, target, mid + 1, right);
  } else {
    return search_interval(nums, target, left, mid - 1);
  }
}

// Fibonacci numbers computed recursively: the recursion tree has depth n,
// and a binary tree of height n has at most 2^n - 1 nodes, i.e. at most that
// many recursive calls, so the time complexity is O(2^n).
// The recursion tree contains a great deal of repeated computation.
// 0, 1, 1, 2, 3, ...
var fib = function (N) {
  if (N == 0) return 0;
  if (N == 1) return 1;
  return fib(N - 1) + fib(N - 2);
};
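The repeated work noted in the comments above can be eliminated by caching already-computed values; a memoized sketch (illustrative, using a Map) computes each value only once, dropping the time complexity from O(2^n) to O(n) at the cost of O(n) extra space.

```javascript
// Memoized Fibonacci: each value from 0 to N is computed exactly once => O(N) time.
var fibMemo = function (N, memo = new Map()) {
  if (N <= 1) return N;
  if (memo.has(N)) return memo.get(N); // reuse a cached subresult
  const val = fibMemo(N - 1, memo) + fibMemo(N - 2, memo);
  memo.set(N, val);
  return val;
};
```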

Time complexity optimization

  • Use a better algorithm. For example, to compute 1+2+3+...+n you can loop (for i up to n: sum += i), or use the closed-form formula n(n+1)/2 directly. Likewise, some search problems can be solved with binary search instead of a linear scan.
  • Trade space for time. Time is precious: a very long-running computation can be lost to a sudden power failure or other accident, while memory is comparatively cheap — at worst you buy a server with more RAM. There are many examples in the following chapters, such as using a Set or Map to speed up lookups, or a binary search tree or trie to speed up searching strings.
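As a small illustration of trading space for time (function name is hypothetical), duplicate detection with a Set takes one O(n) pass with O(n) extra space, instead of comparing all pairs in O(n^2) time with O(1) space:

```javascript
// Detect duplicates in one pass: O(n) time, O(n) space.
function hasDuplicate(nums) {
  const seen = new Set();
  for (const x of nums) {
    if (seen.has(x)) return true; // Set lookup is O(1) on average
    seen.add(x);
  }
  return false;
}
```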

An example of time complexity analysis

There is an array of strings. Sort the characters of each string alphabetically, then sort the whole array in lexicographic (dictionary) order. What is the time complexity of the entire operation?

If I say the time complexity is O(n*nlogn + nlogn) = O(n^2logn), is that right? Take a moment to think about it.

Let's analyze. Suppose the longest string has length s and the array contains n strings. Sorting one string costs O(slogs), so sorting every string in the array alphabetically costs O(n * slogs). Sorting the whole array lexicographically costs O(s * nlogn), because each comparison of two strings can take up to O(s). The final time complexity is therefore O(n * slogs) + O(s * nlogn) = O(nslogs + nslogn) = O(ns * (logs + logn)).
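The operation analyzed above can be sketched as follows (illustrative code; function name is hypothetical). JavaScript's default Array.prototype.sort compares strings lexicographically, and each such comparison can touch up to s characters.

```javascript
// Sort each string's characters (O(n * s log s) total), then sort the array
// lexicographically (O(s * n log n) total) => O(n*s*(log s + log n)) overall.
function sortStrings(strs) {
  const sorted = strs.map((w) => w.split('').sort().join(''));
  sorted.sort(); // default sort compares strings in dictionary order
  return sorted;
}
```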

Space complexity

Space complexity refers to the amount of storage an algorithm occupies while it runs. It is written S(n) and is likewise expressed in big O notation. Knowing a program's space complexity lets us estimate in advance how much memory it will need.

Common space complexities

  • A one-dimensional array storing n elements: space complexity O(n)
  • A two-dimensional array of n arrays, each storing n elements: space complexity O(n^2)
  • Constant space: O(1)

Space complexity of recursion

// O(1): an iterative sum uses only a constant number of variables
function sum1(n) {
  let ret = 0;
  for (let i = 0; i <= n; i++) {
    ret += i;
  }
  return ret;
}

// O(n): recursing n levels deep uses O(n) call-stack space
function sum2(n) {
  if (n === 0) {
    return 0;
  }
  return n + sum2(n - 1);
}

// O(log n): recursing log n levels deep uses O(log n) call-stack space
var search = function (nums, target) {
  return search_interval(nums, target, 0, nums.length - 1);
};

function search_interval(nums, target, left, right) {
  if (left > right) {
    return -1;
  }
  let mid = left + Math.floor((right - left) / 2);
  if (nums[mid] === target) { // compare the target with the middle element
    return mid;
  } else if (nums[mid] < target) { // recurse into the half that may contain the target
    return search_interval(nums, target, mid + 1, right);
  } else {
    return search_interval(nums, target, left, mid - 1);
  }
}

Tags: Java Javascript

Posted on Sat, 20 Nov 2021 23:53:20 -0500 by parth