Data Structures and Algorithms Learning Notes: Complexity Analysis
We all know that data structures and algorithms are like a programmer's two legs: only through continuous accumulation and study, with strong "legs", can we go further and further. In the field of data structures and algorithms I have to admit that I am a rookie and need to keep learning. In the process of learning I often form views of my own and come across other people's unique insights, and I take notes on them one by one in order to make progress.
1. What is complexity analysis?
1. Data structures and algorithms are about "how to make the computer solve problems faster and with less space"; time and space complexity, as the essence of data structures and algorithms, intuitively describe how fast the code runs and how much space it saves.
2. We can evaluate the performance of data structures and algorithms in terms of execution time and memory footprint, known respectively as time complexity and space complexity, and collectively referred to as complexity.
3. Complexity describes the growth of algorithm execution time (or footprint) versus data size.
2. Why complexity analysis?
1. Test results are affected by unstable factors in the test environment (the same code runs much faster on an i7 than on an i3) and by the scale of the test data (some algorithms are better suited to large-scale data). Complexity analysis, by contrast, is independent of the execution environment, low cost, efficient, easy to apply, and a strong guide.
2. Mastering complexity analysis will enable you to write code with better performance, which will help reduce the cost of system development and maintenance.
3. How to perform complexity analysis?
1. Big O notation
1) The total execution time T(n) of a piece of code is proportional to the total number of times its statements are executed, f(n).
T(n) = O(f(n))
where T(n) represents the total execution time of the code, f(n) represents the total number of times all lines of code are executed, and n represents the size of the data.
Big O time complexity does not represent the actual execution time of the code; it represents the trend of how the execution time grows as the data size increases. It is also called asymptotic time complexity, or time complexity for short.
Constant terms, low-order terms, and coefficients do not determine this growth trend, so they are ignored in time complexity analysis.
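For instance (a minimal sketch in Python, with function and variable names of my own choosing), in the code below the loop body runs n times and a couple of statements run once, so f(n) is roughly 2n + 2, and after dropping constants and coefficients the asymptotic time complexity is O(n):

```python
def sum_to_n(n):
    # executed once: constant work
    total = 0
    # the loop body runs n times, so the total number of executed
    # statements grows linearly with n
    for i in range(1, n + 1):
        total += i
    return total

# f(n) is roughly 2n + 2; dropping constants and coefficients
# gives T(n) = O(n).
print(sum_to_n(10))  # 55
```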
2. Complexity analysis rules
1) For a single piece of code, look at the highest-frequency part: for example, loops.
2) For multiple pieces of code, take the maximum: for example, if a piece of code contains both a single loop and nested loops, take the complexity of the nested loops.
3) For nested code, multiply: for example, recursion and nested loops.
4) For multiple data scales, add: for example, if a method has two parameters that each control a separate loop, the two complexities are added (see the sketch after this list).
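A small illustrative sketch of rules 3) and 4) (my own example, in Python):

```python
def rules_demo(m, n):
    s = 0
    # rule 4 (addition): two independent loops controlled by
    # different scales m and n -> O(m) + O(n) = O(m + n)
    for i in range(m):
        s += i
    for j in range(n):
        s += j

    # rule 3 (multiplication): nested loops -> O(m * n)
    for i in range(m):
        for j in range(n):
            s += i * j

    # combined: O(m + n + m * n)
    return s
```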
4. Common complexity levels?
Polynomial order: as the data size increases, the execution time and space consumption of the algorithm grow polynomially. These include O(1) (constant order), O(logn) (logarithmic order), O(n) (linear order), O(nlogn) (linear-logarithmic order), O(n^2) (quadratic order), and O(n^3) (cubic order).
Non-polynomial order: as the data size grows, the execution time and space consumption of the algorithm explode, and algorithms of this kind perform poorly. Typical examples are O(2^n) (exponential order) and O(n!) (factorial order).
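To make the contrast concrete, here is a tiny sketch (my own illustration, in Python) that prints rough operation counts for one polynomial and two non-polynomial orders:

```python
import math

# Rough operation counts for n^2, 2^n and n!, to show how the
# non-polynomial orders explode even for small n.
for n in (10, 20, 30):
    print(f"n={n:<2}  n^2={n**2:<6}  2^n={2**n:<12}  n!={math.factorial(n):.3e}")
```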
5. How to master the complexity analysis method?
The key to complexity analysis lies in more practice.
6. Best-case, worst-case, average, and amortized time complexity
1. Concepts:
1. Best-case time complexity: the time complexity of the code when it executes in the best (most favorable) case.
2. Worst-case time complexity: the time complexity of the code when it executes in the worst case.
3. Average time complexity: expressed as the weighted average of the number of times the code executes over all possible cases.
4. Amortized time complexity: among all the cases in which the code executes, most have low-order complexity, a few have high-order complexity, and the low- and high-order cases occur in a regular temporal sequence. The occasional high-order cost can then be spread evenly over the many low-order cases, and the amortized result is generally equal to the low-order complexity.
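The classic illustration is searching an unordered array (a sketch of my own, in Python): the same code can have very different costs depending on where, or whether, the target appears.

```python
def find(arr, x):
    """Return the index of x in arr, or -1 if absent (illustration only)."""
    for i, v in enumerate(arr):
        if v == x:
            return i          # may return early
    return -1

# Best case:  x is the first element      -> O(1)
# Worst case: x is absent (or the last)   -> O(n)
# The average and amortized cases are discussed below.
```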
Why introduce these four concepts?
1. The time complexity of the same code will vary in different cases. In order to describe the time complexity of the code more comprehensively and accurately, these four concepts are introduced.
2. It is only necessary to distinguish these four complexities when the complexity of a piece of code differs by orders of magnitude in different situations; in most cases there is no need to analyze them separately.
How to analyze average and amortized time complexity?
1. Average time complexity
If the complexity of the code differs by orders of magnitude in different cases, it is expressed as the weighted average (expected value) of the number of times the code executes over all possible cases.
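As a sketch under my own simplifying assumption that the target is equally likely to be at any of the n positions or absent altogether, the weighted average cost of the find function sketched earlier still grows linearly, i.e. O(n):

```python
def average_comparisons(n):
    # n + 1 equally likely cases: x at position 0..n-1, or x absent.
    # Finding x at position i costs i + 1 comparisons; absence costs n.
    cases = [i + 1 for i in range(n)] + [n]
    return sum(cases) / len(cases)   # weighted average with equal weights

for n in (10, 100, 1000):
    print(n, average_comparisons(n))  # grows roughly like n/2 -> O(n)
```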
2. Amortized time complexity
It applies when two conditions are met: 1) in most cases the code has low-order complexity and only in a few cases high-order complexity; and 2) the low-order and high-order cases occur in a regular, sequential pattern. The amortized result is generally equal to the low-order complexity.
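A textbook example is inserting into an array that doubles its capacity when full; the sketch below (my own, in Python) shows why the occasional O(n) copy amortizes to O(1) per insertion:

```python
class DynamicArray:
    """Minimal dynamic array to illustrate amortized analysis (not production code)."""
    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.data = [None] * self.capacity

    def insert(self, value):
        if self.size == self.capacity:
            # Rare O(n) case: copy all elements into a larger array.
            self.capacity *= 2
            new_data = [None] * self.capacity
            new_data[:self.size] = self.data
            self.data = new_data
        # Common O(1) case: write into the next free slot.
        self.data[self.size] = value
        self.size += 1

# Every O(n) copy is preceded by about n cheap O(1) insertions,
# so the cost spreads out to amortized O(1) per insert.
```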
Some people say: before a project we already run performance tests, so isn't it redundant to also analyze the time complexity and space complexity of the code? And isn't it a waste of time to analyze the time and space complexity of every piece of code?
I don't think it's unnecessary. Asymptotic time and space complexity analysis gives us a good direction for theoretical analysis, and it is independent of the host platform. It gives us a general understanding of our program or algorithm, for example how efficient the program is in the worst case. It also provides a good bridge for communication: we can say that the time complexity of algorithm 1 is O(n) and the time complexity of algorithm 2 is O(logN), and we immediately have a sense of the "efficiency" of the different algorithms.
Of course, asymptotic time and space complexity analysis is only a theoretical model and can only provide a rough estimate. We cannot directly conclude that an O(logN) algorithm must be better than an O(n) one: on different host environments, with different data sets and different data sizes, the real performance in actual applications may differ. Personally, I think it is necessary to run performance benchmarks for the different real-world situations, for example horizontal benchmarking on a uniform batch of mobile phones (the same hardware, system, etc.), and then select the algorithm best suited to the specific application scenario.
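As one possible way to run such a benchmark (a sketch of my own using Python's standard timeit module; the data set and function names are illustrative), comparing an O(n) scan with an O(logn) binary search:

```python
import bisect
import timeit

data = list(range(1_000_000))
target = 999_999

def linear_search(arr, x):          # O(n)
    for i, v in enumerate(arr):
        if v == x:
            return i
    return -1

def binary_search(arr, x):          # O(logn), requires sorted input
    i = bisect.bisect_left(arr, x)
    return i if i < len(arr) and arr[i] == x else -1

# Wall-clock comparison on this machine and data set only; results will
# differ across hardware, data sizes, and data distributions.
print("linear:", timeit.timeit(lambda: linear_search(data, target), number=10))
print("binary:", timeit.timeit(lambda: binary_search(data, target), number=10))
```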
To sum up, asymptotic time and space complexity analysis and performance benchmarking do not conflict; they complement each other. A program with a lower order of time complexity is very likely to outperform one with a higher order, so in practical programming it always pays to keep the theoretical time and space complexity model in mind, which helps produce efficient programs. At the same time, because asymptotic analysis only provides a rough model, it does not take up much time; the point is to bring this kind of complexity-analysis thinking to everyday programming.
Author: Dawnzhang
Source: www.cnblogs.com/clwydjgs/p/9718754.html
Copyright: This article belongs to the author
The author reserves the right to pursue legal liability for unauthorized use of this article.