How to understand the time complexity and space complexity of C language data structure

2025-01-29 Update From: SLTechnology News&Howtos



This article mainly introduces how to understand the time complexity and space complexity of C language data structures. Many people have doubts about this topic in their daily work, so the editor has consulted various materials and put together a simple, easy-to-follow explanation. I hope it helps answer your questions. Now, please follow the editor and let's study!

Catalogue

I. What are time complexity and space complexity?

1.1 The definition of algorithm efficiency

1.2 The concept of time complexity

1.3 The concept of space complexity

II. How to calculate the time and space complexity of common algorithms

2.1 Calculating time complexity

2.2 Calculating space complexity

2.3 Quickly deriving the big-O asymptotic expression

III. Some special cases

I. What are time complexity and space complexity?

1.1 The definition of algorithm efficiency

There are two kinds of algorithm efficiency: time efficiency, measured by time complexity, and space efficiency, measured by space complexity.

1.2 The concept of time complexity

Time complexity, in short, measures how many steps your code takes to solve a problem, and therefore how long it runs. A simple analogy: given the 10 numbers 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, find where 7 is. One classmate writes a brute-force search that traverses from the first number onward; his algorithm needs 7 comparisons to find 7. Another classmate writes a binary search, which needs only a handful of comparisons. That is a comparison of time complexity.

The number of times the basic operations in an algorithm are executed is the algorithm's time complexity.

1.3 The concept of space complexity

Space complexity measures the amount of storage space an algorithm temporarily occupies while it runs. For example: suppose we are asked to write some code. One classmate declares a lot of variables, and the program runs; another uses very few variables, and the program also runs. The two differ in how many variables they use and how much memory the code occupies while running. Space complexity counts the number of variables.

II. How to calculate the time and space complexity of common algorithms

We use big-O asymptotic notation (an estimation) when calculating time and space complexity.

2.1 Calculating time complexity

Let's take a simple function as an example.

The code is as follows:

void func1(int n) {
    int i = 0;
    int j = 0;
    int k = 0;
    int count = 0;
    for (i = 0; i < n; i++) {
        for (j = 0; j < n; j++) {
            count++;
        }
    }
    for (k = 0; k < 3 * n - 1; k++) {
        count++;
    }
}

Question: if this function is called, how many times does it run?

We can clearly see that i goes around n times, and for each i the inner loop runs n times, so the first for loop finishes after n^2 executions; the second for loop executes 3n - 1 times. In total the function executes n^2 + 3n - 1 times.

So is the time complexity here n^2 + 3n - 1? The answer is no.

As we said before, time complexity and space complexity both use big-O asymptotic notation, which is an estimation. The value we take is the term with the greatest influence on the expression.

Take n^2 + 3n - 1 as an example and let f(n) = n^2 + 3n - 1:

n = 1, f(n) = 1 + 3 - 1 = 3
n = 10, f(n) = 100 + 30 - 1 = 129
n = 100, f(n) = 10000 + 300 - 1 = 10299
n = 1000, f(n) = 1002999
…

It is easy to see that the term with the greatest influence on f(n) is n^2. Let g(n) = n^2:

n = 1, g(n) = 1
n = 10, g(n) = 100
n = 100, g(n) = 10000
n = 1000, g(n) = 1000000
…

The larger n is, the closer g(n) gets to f(n). So the time complexity here, written in big-O asymptotic notation, is O(n^2).

2.2 Calculating space complexity

Before learning how to find space complexity, we should know that space complexity is also expressed in big-O asymptotic notation, and that what we count is not the number of bytes occupied but the number of variables. First look at a piece of code.

The code is as follows (example):

#include <assert.h>
#include <stddef.h>

// swap: exchanges two ints (a helper assumed by the original code).
void swap(int* x, int* y) {
    int tmp = *x;
    *x = *y;
    *y = tmp;
}

void bubblesort(int* a, int n) {
    assert(a);
    for (size_t end = n; end > 0; --end) {
        int exchange = 0;
        for (size_t i = 1; i < end; ++i) {
            if (a[i - 1] > a[i]) {
                swap(&a[i - 1], &a[i]);
                exchange = 1;
            }
        }
        if (exchange == 0)
            break;
    }
}

In the code above, we created three variables: size_t end, int exchange, and size_t i. Although the function goes through many loop iterations, these three variables are reused; the space they occupy is reused, and the amount of space stays unchanged. Here lies the difference from time complexity: time is cumulative — every loop iteration is counted — while space is not cumulative, because space can be reused.

As we said above, the calculation of space complexity also uses big-O asymptotic notation, and for constants we uniformly write O(1) (for details of big-O asymptotic notation, see part 1 on time and space complexity).

Ps1: assert is usually used to diagnose potential bugs in a program. With assert(condition), when the condition is false the program terminates early, which helps locate the bug.

Ps2: size_t is a type; think of it as unsigned long int.
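To make the "count the variables, not the bytes" idea concrete, here is a minimal sketch (the function names are illustrative, not from the original article): both functions compute the same sum, but the first reuses a fixed set of variables (O(1) space), while the second allocates a buffer whose size grows with n (O(n) space).

```c
#include <stdlib.h>

/* O(1) extra space: the same few variables are reused on every iteration. */
long sum_constant_space(int n) {
    long sum = 0;
    for (int i = 1; i <= n; i++) {
        sum += i;
    }
    return sum;
}

/* O(n) extra space: an extra buffer of n ints grows with the input size. */
long sum_linear_space(int n) {
    int* values = malloc(n * sizeof(int));
    if (values == NULL) {
        return -1;
    }
    long sum = 0;
    for (int i = 0; i < n; i++) {
        values[i] = i + 1;
    }
    for (int i = 0; i < n; i++) {
        sum += values[i];
    }
    free(values);
    return sum;
}
```

Both return the same answer; only the amount of memory they temporarily occupy differs.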

Let's look at another piece of code:

The code is as follows (example):

// Calculate the space complexity of the recursive Factorial
long long Factorial(size_t n) {
    return n < 2 ? n : Factorial(n - 1) * n;
}

This code is a very simple recursive factorial. So what is its space complexity? Suppose we pass in 10.

The function will recurse 10 times, and each recursive call creates a function stack frame (that is, a chunk of space): 10 frames in total, each with space complexity O(1). Replace 10 with n: n recursive calls, each creating a stack frame, so the space complexity is O(n).

Ps: some friends may ask, isn't a function's stack frame destroyed when the recursion returns? Note: space complexity here is calculated for the worst case. No matter what the function is, its stack frame is destroyed after use; space complexity is measured at the moment the most space is in use.
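As a contrast (a sketch, not from the original article): if the factorial is written iteratively, no chain of stack frames builds up; only a fixed number of variables is used, so the space complexity drops to O(1).

```c
#include <stddef.h>

/* Iterative factorial: no recursion, so no growing chain of stack frames;
 * only a fixed number of variables are used, i.e. O(1) space. */
long long factorial_iter(size_t n) {
    long long result = 1;
    for (size_t i = 2; i <= n; i++) {
        result *= i;
    }
    return result;
}
```

The time complexity is still O(n) — the loop runs n - 1 times — but the space no longer grows with n.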

2.3 Quickly deriving the big-O asymptotic expression

1. Replace all additive constants with the constant 1.

2. Keep only the highest-order term (the same idea as taking a limit in calculus).

3. If the highest-order term exists and its coefficient is not 1, drop the coefficient: for example, 3 * n^9 becomes n^9.

Let's look at two more pieces of code for practice.

Code 1 is as follows:

void func2(int n) {
    int i = 0;
    int k = 0;
    int count = 0;
    for (i = 0; i < 3 * n; i++) {
        count++;
    }
    for (k = 0; k < 6; k++) {
        count++;
    }
}

Here f(n) = 3n + 6, and its big-O asymptotic expression is O(n).

Code 2 is as follows:

void func3(int n) {
    int i = 0;
    int count = 0;
    for (i = 0; i < 1000; i++) {
        count++;
    }
}

You can see at a glance that the loop runs 1000 times. What does that mean? As I said earlier, the constant 1 replaces the constants in all addition operations, so no matter how large the constant is, as long as it is a constant, it is written O(1).

Some considerations:

O(1) means the estimate does not change with n. In plain language: no matter how large an n you pass in, the algorithm takes the same number of steps.

A time complexity of O(n) changes as n changes.

A popular analogy: if you have a function O(x) = 1, then no matter what x is, the value of the function is 1.

Suppose instead a function O(x) = x; then the value of the function changes as x changes.

III. Some special cases

The time complexity of some algorithms has best, average, and worst cases:

Worst case: the maximum number of operations for any input of size n (upper bound)

Average case: the expected number of operations for any input of size n

Best case: the minimum number of operations for any input of size n (lower bound)

Without further ado, here is an example:

The code is as follows:

const char* strchr(char* str, char c) {
    while (*str != '\0') {
        if (*str == c) {
            return str;
        }
        ++str;
    }
    return NULL;
}

The code above is a simple character-searching function. For example, suppose we are given a string of n characters, "aaaaba…aaac" (the ellipsis stands for more a's).

Searching for a here, we find it in one step. Searching for b takes longer, and c longer still. Searching for d? Sorry, it isn't there at all.

So the best case here is finding the character in one step: O(1).

Average case: O(n/2), which the rules above reduce to O(n).

Worst case: O(n).

Regarding the worst case here, some students may want to ask why it is O(n). The explanation: finding c takes n comparisons, and even though d cannot be found at all, you still have to look n times to be sure it is not there.
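The three cases can be checked with a small step-counting sketch (illustrative, not from the original article): it scans a string the same way the search function above does, but returns how many characters were inspected before stopping.

```c
#include <stddef.h>

/* Count how many characters are inspected before the search stops:
 * stops early when c is found, otherwise scans the whole string. */
size_t search_steps(const char* str, char c) {
    size_t steps = 0;
    while (*str != '\0') {
        steps++;
        if (*str == c) {
            return steps;   /* found: stop early */
        }
        ++str;
    }
    return steps;           /* not found: all n characters inspected */
}
```

For the string "aaac": searching for a stops after 1 step (best case), searching for c takes all 4 steps, and searching for d also takes 4 steps before concluding it is absent (worst case).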

At this point, the study of "how to understand the time complexity and space complexity of C language data structures" is over. I hope it has resolved your doubts. Pairing theory with practice helps you learn better, so go and try it! If you want to keep learning more related knowledge, please continue to follow this site; the editor will keep working hard to bring you more practical articles!
