First, define the time complexity:
When analyzing an algorithm, the total number of statement executions T(n) is a function of the problem scale n. We then analyze how T(n) changes as n grows and determine the order of magnitude of T(n). The time complexity of the algorithm, that is, its time measure, is
recorded as T(n) = O(f(n)).
It means that as the problem scale n increases, the growth rate of the algorithm's execution time is the same as that of f(n). This is called the asymptotic time complexity of the algorithm, or time complexity for short, where f(n) is a function of the problem scale n.
To put it in plain words:
Time complexity is how we measure an algorithm: under the same environmental conditions, the one with lower complexity is simpler and faster. So what we pursue is optimizing algorithms with high time complexity into ones with low time complexity.
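To make the counting concrete, here is a minimal sketch (my own illustration, not from the original text; the class name StatementCount is hypothetical) that counts how many times each statement runs for a simple summation loop:

// Hypothetical sketch: counting statement executions T(n) for a simple loop.
public class StatementCount {
    public static void main(String[] args) {
        long n = 100;      // problem scale n
        long count = 0;    // T(n): total number of statement executions

        count++;           // "long sum = 0;" runs 1 time
        long sum = 0;
        for (long i = 1; i <= n; i++) {
            count++;       // "sum += i;" runs n times
            sum += i;
        }
        count++;           // the final print runs 1 time

        System.out.println("n = " + n + ", T(n) = " + count + ", sum = " + sum);
    }
}

For n = 100 this prints T(n) = 102, i.e. T(n) = n + 2, which has the same growth rate as f(n) = n, so the time complexity is O(n).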
Calculating time complexity:
Calculate the time complexity of the following functions:
1. f(n) = n^2
2. f(n) = n^2 + 2
3. f(n) = 2n^2 + 2
4. f(n) = 3n^3 + 2
Calculation principles (a small growth demo follows the list):
1. Constant terms in the function can be ignored, e.g. the +2 at the end of Nos. 2, 3 and 4.
2. The constant factor of the highest power can be ignored, e.g. the 2 in front of n^2 in No. 3 and the 3 in front of n^3 in No. 4.
3. The lower the highest power in the function, the more efficient the algorithm: n^2 grows more slowly than n^3, so No. 3 is more efficient than No. 4.
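To see why the constant +2 and the factors 2 and 3 can be ignored, the small demo below (a hypothetical sketch, not part of the original text; the class name GrowthDemo is made up) prints the four functions above for growing n; the highest power quickly dominates everything else:

// Hypothetical demo: how the four functions above grow as n increases.
public class GrowthDemo {
    public static void main(String[] args) {
        for (long n : new long[]{10, 100, 1000}) {
            System.out.println("n=" + n
                    + "  n^2=" + (n * n)
                    + "  n^2+2=" + (n * n + 2)
                    + "  2n^2+2=" + (2 * n * n + 2)
                    + "  3n^3+2=" + (3 * n * n * n + 2));
        }
        // The +2 and the factors 2 and 3 barely change the picture;
        // the power of n decides the growth.
    }
}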
Time complexity representation:
Definition of big O notation:
When analyzing an algorithm, the total number of statement executions T(n) is a function of the problem scale n. We analyze how T(n) changes with n and determine the order of magnitude of T(n). The time complexity of the algorithm is its time measure, recorded as T(n) = O(f(n)). It means that as the problem scale n increases, the growth rate of the algorithm's execution time is the same as that of f(n). This is called the asymptotic time complexity of the algorithm, or time complexity for short, where f(n) is a function of the problem scale n, such as the four functions listed above.
Important definitions:
Number of executions ≈ execution time: we count statement executions instead of measuring actual running time.
Generally, as the input scale n increases, the algorithm whose T(n) grows most slowly is the optimal algorithm.
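As a hypothetical comparison (the numbers are my own, not from the original text), suppose one algorithm costs T1(n) = 100n and another costs T2(n) = n^2. T2 looks fine for small n, but T1 wins as soon as n exceeds 100, because it grows more slowly:

// Hypothetical comparison: the slower-growing T(n) eventually wins.
public class CompareGrowth {
    public static void main(String[] args) {
        for (long n : new long[]{10, 100, 1000, 10000}) {
            long t1 = 100 * n;   // T1(n) = 100n  -> O(n)
            long t2 = n * n;     // T2(n) = n^2   -> O(n^2)
            System.out.println("n=" + n + "  T1=100n -> " + t1 + "  T2=n^2 -> " + t2);
        }
    }
}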
So how do we derive the big O notation?
(In fact, it's similar to the principles above.)
1. Replace all additive constants in the running time with the constant 1;
2. In the modified count of executions, keep only the highest-order term;
3. If the highest-order term exists and its constant factor is not 1, remove that constant factor.
Direct example:
f(n) = 3 .......................................... O(1)
f(n) = n + 2 ....................................... O(n)
f(n) = 2n^2 + 2 .................................... O(n^2)
f(n) = 3n^3 + 2 .................................... O(n^3)
It looks clearer in code:
// You can copy this into your IDE and run it.
public class CountOn {
    /**
     * Examples:
     * f(n) = 3 .......................................... O(1)
     * f(n) = n + 2 ...................................... O(n)
     * f(n) = 2n^2 + 2 ................................... O(n^2)
     * f(n) = 3n^3 + 2 ................................... O(n^3)
     *
     * @param args
     */
    public static void main(String[] args) {
        long result = function1(100);
        long result2 = function2(100);
        long result3 = function3(100);
        long result4 = function4(100);
        System.out.println("Sum 1 to 100 (Gauss formula), time complexity O(1): " + result);
        System.out.println("Sum 1 to 100 (sequential addition), time complexity O(n): " + result2);
        System.out.println("Sum 1 to 100, repeated 100 times, time complexity O(n^2): " + result3);
        System.out.println("Increment 100*100*100 times, time complexity O(n^3): " + result4);
    }

    // Time complexity f(n) = 3 .......................... O(1)
    private static long function1(long num) {
        return (1 + num) * num / 2;
    }

    // Time complexity f(n) = n .......................... O(n)
    private static long function2(long num) {
        long sum = 0;
        for (long i = 1; i <= num; i++) {
            sum += i;
        }
        return sum;
    }

    // Time complexity f(n) = n^2 ........................ O(n^2)
    private static long function3(long num) {
        long sum = 0;
        for (long i = 1; i <= num; i++) {
            for (long j = 1; j <= num; j++) {
                sum += i;
            }
        }
        return sum;
    }

    // Cubic order ....................................... O(n^3)
    private static long function4(long num) {
        long sum = 0;
        for (long i = 1; i <= num; i++) {
            for (long j = 1; j <= num; j++) {
                for (long f = 1; f <= num; f++) {
                    sum++;
                }
            }
        }
        return sum;
    }
}
The print result of the run is:
Sum 1 to 100 (Gauss formula), time complexity O(1): 5050
Sum 1 to 100 (sequential addition), time complexity O(n): 5050
Sum 1 to 100, repeated 100 times, time complexity O(n^2): 505000
Increment 100*100*100 times, time complexity O(n^3): 1000000
Do an exercise to check your understanding:
// Time complexity exercise:
// Search for a number in the array; return true if it is found, false otherwise.
// What is the time complexity of this code?
public boolean findNum(int num) {
    // 9 numbers here, but in general n numbers
    int[] arr = {1, 56, 45, 23, 88, 99, 456, 0, 2};
    // loop over the length of arr
    for (int i = 0; i < arr.length; i++) {
        if (num == arr[i]) {
            return true;
        }
    }
    return false;
}
The time complexity here is analyzed in three cases:
1. Best case: the very first element checked is num ................ O(1)
2. Worst case: only the last element is num ........................ O(n)
3. Average case .................................................... O(n)
It's easy to understand: checking the elements one by one is essentially the same process as the 1-to-n sequential addition at the beginning.
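As a quick illustration (the driver class FindNumDemo and the chosen inputs are hypothetical, not from the original text), calling findNum with different targets shows the best case, the worst case, and the not-found case:

// Hypothetical driver for the exercise above (findNum copied here as a static method).
public class FindNumDemo {
    static boolean findNum(int num) {
        int[] arr = {1, 56, 45, 23, 88, 99, 456, 0, 2};
        for (int i = 0; i < arr.length; i++) {
            if (num == arr[i]) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(findNum(1));    // best case: first element  -> O(1)
        System.out.println(findNum(2));    // worst case: last element  -> O(n)
        System.out.println(findNum(777));  // not found: scans all n    -> O(n)
    }
}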
Summary:
Understanding time complexity is the entry point to understanding algorithms and one of the criteria for analyzing the relative strengths and weaknesses of an algorithm.