
How to understand time and space complexity of algorithms and data structures

2025-01-18 Update From: SLTechnology News&Howtos

Shulou(Shulou.com)06/03 Report--

This article introduces how to understand the time and space complexity of algorithms and data structures. Many people run into trouble with these concepts in practice, so let's work through them together. I hope you read carefully and come away with something useful!

Before we begin

Some people may complain: what is the use of learning algorithms? At most they come up when interviewing at a big tech company, and those interview algorithms are just a stepping stone. "I'm not applying to a big company, so I don't have to learn them." Is that really the case?

Certainly not. In the computer industry, whether you work on the front end or the back end, algorithms are a hurdle you cannot avoid. It is fair to say that without algorithms you will never become a qualified senior engineer. If you want to join a big company you really do need to know some algorithms, but they are not just for interviews; they are closely tied to the programs we write. Some people say the front end does not need algorithms? Then what about the famous virtual DOM (Virtual DOM)? It is an embodiment of algorithms and data structures. You may say, "but I don't write a virtual DOM." Well, if you want to take the technical route and earn a high salary, algorithms are the cornerstone.

There are many algorithm articles and courses online, such as "learn algorithms and data structures in 7 days" or "master algorithms and data structures in 30 days". Don't be fooled: there are no shortcuts; algorithms can only be accumulated little by little. First, accept that you are not a genius. Second, keep practicing every day. When time is tight, four or five problems a week is fine; one problem a day is a good pace, and once you are proficient you can even do several a day. Even very smart people forget algorithms after not touching them for a while, so consistency is what matters.

Next, let's learn algorithms from scratch. If you have never practiced them, this article will show you what an algorithm is and what a data structure is. While practicing, we need to judge whether a solution is good enough, and that judgment is made along two dimensions: time and space. This article focuses on exactly that, the prerequisite knowledge for algorithm practice. If it sets you on your algorithm path, great; if you already know this material, skim it and see whether anything expands your knowledge. And of course, if there are mistakes, corrections are welcome.

What is the data structure?

In fact, the data structure is the way that the program stores and organizes the data. A data structure is organized by the data elements in the program according to some logical relationship, and is the combination of several data elements.

The data structure is the basic unit of processing data in the program, which is used as a whole in the program.

For example, a number 1 or a character A is a data structure.

For example, a date such as December 14, 2020, which consists of three parts (year, month, day), is also a data structure.

For example, an array [1, 'a', '2020-12-14'], a combination of a number, a character, and a date, is also a data structure.

Generally speaking, data organized in a certain format is a data structure. The more common data structures are strings, arrays, objects, stacks, heaps, linked lists, trees, hash tables, and so on. You may not recognize some of these; don't worry, you don't need to know what they all are right away. On the front end we use JavaScript, a language we love and hate, which has very few built-in data structures, so it is important to know that structures such as linked lists and trees are simulated with objects.
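To make the last point concrete, here is a minimal sketch of simulating a linked list with plain JavaScript objects (the helper names `createNode` and `toArray` are my own, for illustration only):

```javascript
// Each node is just an object holding a value and a reference
// to the next node (or null at the end of the list).
function createNode(value) {
  return { value, next: null }
}

// Build the list 1 -> 2 -> 3 by hand.
const head = createNode(1)
head.next = createNode(2)
head.next.next = createNode(3)

// Walk the list from the head and collect the values.
function toArray(head) {
  const result = []
  let node = head
  while (node !== null) {
    result.push(node.value)
    node = node.next
  }
  return result
}

console.log(toArray(head)) // [ 1, 2, 3 ]
```

Trees are simulated the same way, except each node object holds references to several children instead of a single `next`.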

In much of programming, the choice of data structure is a basic design consideration. The difficulty of implementing a system and the quality of the result depend heavily on whether the best data structure is chosen. The right choice can improve running efficiency and save storage space, but know that there is no perfect data structure: each has its limitations and advantages, so choose the one that fits the requirements.

What is an algorithm?

What is an algorithm? We all know that 1 + 2 + 3 = 6. Why? You might say: 1 plus 2 equals 3, and 3 plus 3 equals 6. That is arithmetic any primary-school student knows, and broadly speaking, it is an algorithm.

In fact, the algorithm is the complete step description of solving a problem, which refers to the accurate and complete step description of completing a task.

The design of the algorithm often depends on the data structure, and the implementation of the algorithm depends more on the data structure adopted.

Designing an algorithm for a problem is a process from abstract to concrete:

Analyze the problem, select the data structure, and get the preliminary solution.

Break up the solution reasonably and organize it into many steps

Select reasonable loop variables for repetitive steps

Succinctly describe algorithms using natural languages that can be easily converted into programs

Now that we know what an algorithm is, let's look at time and space complexity and measure the pros and cons of different algorithms. We usually look at it from two dimensions.

Time dimension: refers to the time it takes to execute the code, that is, time complexity

Spatial dimension: refers to the space consumed by code execution, that is, space complexity.

Then we begin to analyze the complexity of time and space step by step, let's start with time complexity.

Time complexity

Before we talk about time complexity, we must first understand the concept of code execution times, which can also be called statement frequency or time frequency, expressed by T (n).

Let's use examples to illustrate step by step. First, let's look at the function fn1.

```javascript
function fn1() {
  console.log("end of sentence")
  console.log("isboyjc")
}
```

Let's see how many times the statements in this function will be executed.

Obviously there are only two statements inside this function, namely console.log ("end of sentence") and console.log ("isboyjc"), so we say that the number of code execution in this function is 2.

Let's look at the function fn2.

```javascript
function fn2(n) {
  for (let i = 0; i < n; i++) {
    console.log("end of sentence")
  }
}
```

Let's first look at the executable statements in the function fn2:

```javascript
let i = 0
i < n
i++
console.log("end of sentence")
```

Let's assume that n = 3, and then take it in turn to see how many times each execution statement has been executed.

let i = 0 — this declaration statement is executed only once, when the for loop starts.

i < n — the number of executions of this condition depends on the size of the parameter n: it runs once per iteration plus one final check that ends the loop, so it runs n + 1 times (4 times when n = 3).

i++ — this statement also depends on n: it runs once per iteration, that is, n times (3 times when n = 3).

console.log("end of sentence") — this statement likewise runs once per iteration, that is, n times (3 times when n = 3). Adding these up:

1 + (n + 1) + n + n = 3n + 2

Then the function fn2 is executed for 3n + 2 times.

The total execution count of a piece of code is usually written T(n), so the totals for the two functions fn1 and fn2 above are:

T(n) = 2      // fn1
T(n) = 3n + 2 // fn2
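The count T(n) = 3n + 2 for fn2 can be checked empirically. Here is a sketch that hand-instruments fn2 with a counter (the function name `countFn2` and counter `t` are mine; the increments mirror the statement counts derived above):

```javascript
// Count every executable statement in fn2 as it runs.
function countFn2(n) {
  let t = 0
  t++ // "let i = 0" runs once
  for (let i = 0; i < n; i++) {
    t++ // "i < n" (the checks that pass), n times
    t++ // console.log(...), n times
    t++ // "i++", n times
  }
  t++ // the final "i < n" check that ends the loop
  return t
}

console.log(countFn2(3)) // 11, i.e. 3*3 + 2
```

Running it for a few values of n confirms the formula, including the edge case n = 0, where only the declaration and the single failed check run (T(0) = 2).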

The n above refers to the problem size, that is, the size or amount of the input data. You can simply think of T as the function itself and n as its parameter. That is to say:

The function fn1 is executed 2 times in any case

The total number of times the function fn2 executes will change according to the size of n.

Let's think about it, can we use the total number of times a piece of code is executed to measure the speed of execution?

The answer is no, or rather, it is impractical. When there are many lines of code, computing the total execution count one by one is far too troublesome, for example when a function calls another function or loops are nested; accurately counting executions becomes very tedious.

Therefore, in algorithm analysis we generally use a simplified estimate of T(n) to express how fast code runs, written with the capital letter O: big O notation. Because it is an estimate, it expresses the trend of execution time as the data size grows, and so it is also called asymptotic time complexity, or time complexity for short.

After clarifying this concept, we can calculate the time complexity, or use the above two fn1/fn2 function examples.

T(n) = 2      // fn1
T(n) = 3n + 2 // fn2

First, the function fn1: its total execution count is 2, a constant. No matter what, this function executes 2 statements, a fixed value, so when expressing it in big O notation we estimate it directly as 1; that is, the time complexity is O(1).

Now the function fn2. Its execution count T(n) is 3n + 2, that is, constant * n + constant. As n grows, only the constant * n part changes while the trailing constant stays the same, so we can ignore the trailing constant when expressing the time complexity. In constant * n, the coefficient can also be treated as 1; that is, the constant coefficient is ignored too. We are left with n, so the time complexity of fn2 is O(n).

"PS: in case you have handed your math knowledge back to the teacher, a quick refresher."

"constant:" A constant is an ordinary fixed value.

"coefficient:" The coefficient is the numeric factor in a monomial of an algebraic expression. For example, in 1 * a the coefficient is 1; in 2 * b the coefficient is 2.

Let's take another example: suppose the following polynomial is the T(n) of some function; calculate its time complexity.

T(n) = 10n^4 + 100n^2 + 1000

In fact, for polynomials we only need to keep the highest power of n. In this example we keep n^4; the coefficients and constants can be ignored. The final time complexity is O(n^4).
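A quick numerical sketch of why the lower-order terms can be dropped: as n grows, T(n) = 10n^4 + 100n^2 + 1000 is dominated by its highest-power term, so T(n) / n^4 approaches the leading coefficient 10 (the function name `T` here is just for illustration):

```javascript
// The polynomial from the example above.
function T(n) {
  return 10 * n ** 4 + 100 * n ** 2 + 1000
}

// The ratio to n^4 tends toward the leading coefficient, 10.
for (const n of [10, 100, 1000]) {
  console.log(n, (T(n) / n ** 4).toFixed(4))
}
```

At n = 10 the ratio is still 11.1, but by n = 1000 it is 10.0001: the n^2 and constant terms have become negligible, which is exactly what O(n^4) captures.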

"conclusion:"

When T(n) is a constant, the time complexity is O(1); otherwise, the time complexity is O(the highest-order term of T(n) with its coefficient removed).

Next, let's look at a few examples to determine the time complexity of the next few pieces of code.

"example 1:"

```javascript
function fn01() {
  console.log("look at this")
  console.log("this is an output")
  console.log("ha ha")
  let a = 1
  return a
}
```

The function fn01 above contains 5 statements, each executed exactly once, so its execution count is a constant 5 that never changes. The time complexity is O(1): constant time complexity.

"example 2:"

```javascript
function fn02(n) {
  for (let i = 0; i < n; i++) {
    console.log("is this an output?")
  }
}
```

As above, the function fn02 is the same as the example fn2 above, a loop, the time complexity is O (n), which is linear time complexity.

"example 3:"

```javascript
function fn03(n) {
  for (let i = 0; i < n; i++) {
    console.log("outer loop")
    for (let j = 0; j < n; j++) {
      console.log("inner loop")
    }
  }
}
```

This one differs from the previous examples. Look at the inner loop alone first: it executes about n times, and the outer loop also executes n times, giving n * n = n^2. That is, the time complexity of fn03 is O(n^2): quadratic time complexity. With a three-level loop the time complexity would be O(n^3), cubic time complexity.

From this we can see that the number of nested loop levels in a piece of code determines the power of n in its time complexity, so try to avoid deeply nested loops.

"example 4:"

Let's take a look at the following function fn04

```javascript
function fn04(n) {
  for (let i = 0; i < n; i++) {
    console.log("outer loop")
    for (let j = 0; j < n; j++) {
      console.log("inner loop")
    }
  }
  for (let i = 0; i < n; i++) {
    console.log("")
  }
}
```

This function has a double loop and a single loop, so the execution count is about n^2 + n. Following the rule above, keeping only the highest-order term, the time complexity of fn04 is O(n^2).
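We can verify the n^2 + n count by tallying just the loop bodies in fn04 (the function name `workCount` is mine; it counts the log calls instead of printing):

```javascript
// Count how many times fn04's loop bodies run for a given n.
function workCount(n) {
  let count = 0
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) {
      count++ // inner loop body: contributes n * n
    }
  }
  for (let i = 0; i < n; i++) {
    count++ // single loop body: contributes n
  }
  return count
}

console.log(workCount(10)) // 110, i.e. 10^2 + 10
```

For n = 10 the n^2 term already contributes 100 of the 110 executions, which is why only the n^2 term survives in the big O estimate.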

"example 5:"

Algorithms are certainly not limited to the plain loops above; let's look at the following one.

```javascript
function fn05(n) {
  for (let i = 0; i < n; i++) {
    console.log("outer loop")
    for (let j = i; j < n; j++) {
      console.log("inner loop")
    }
  }
}
```

When we encounter something like this, we can simply substitute a few values and try it out to find the pattern.

When i = 0, the inner loop executes n times.

When i = 1, the inner loop executes n - 1 times.

When i = 2, the inner loop executes n - 2 times.

When i = 3, the inner loop executes n - 3 times.

At this point we notice that each time i increases by 1, the inner loop executes one time fewer, so the rest is simple:

When i = n - 2, the inner loop executes twice.

When i = n - 1, the inner loop executes once.

Finally, we add up the inner-loop execution counts across all n values of i to get the total execution count:

T(n) = n + (n-1) + (n-2) + ... + 3 + 2 + 1

As above, this is an arithmetic sequence, and yes, it can be summed.

If, from the second term onward, the difference between each term and the one before it equals the same constant, the sequence is called an arithmetic sequence, and this constant is called the common difference, usually denoted by the letter d.

For example, 1, 3, 5, 7, 9, ... (2n - 1) is an arithmetic sequence. Its general term is S(n) = S1 + (n - 1) * d, and the sum of the first n terms is given by:

S(n) = n * S1 + n * (n - 1) * d / 2
// or
S(n) = n * (S1 + Sn) / 2

In our sequence above the common difference d is -1, and we can substitute into either formula; the second is simpler, and both give the same result.

// n * (S1 + Sn) / 2
n * (n + 1) / 2 = (n^2 + n) / 2 = (1/2)n^2 + (1/2)n

Finally we get (1/2)n^2 + (1/2)n; keeping the highest-order term and removing its coefficient as above, the time complexity is O(n^2).
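The arithmetic-series result for fn05 is easy to verify with a counter (the function name `innerCount` is mine): the inner loop runs n + (n-1) + ... + 1 times in total, which should equal n * (n + 1) / 2.

```javascript
// Count fn05's inner-loop executions for a given n.
function innerCount(n) {
  let count = 0
  for (let i = 0; i < n; i++) {
    for (let j = i; j < n; j++) {
      count++ // inner loop body
    }
  }
  return count
}

// Compare against the closed-form sum n * (n + 1) / 2.
for (const n of [5, 10, 100]) {
  console.log(innerCount(n) === (n * (n + 1)) / 2) // true
}
```

For n = 100 that is 5050 executions, matching the formula exactly.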

"example 6:"

Let's look at another example.

```javascript
function fn06(n) {
  for (let i = 1; i < n; i *= 2) {
    console.log("isboyjc")
  }
}
```

As usual, if you are not sure how to approach it, try a few parameter values and look for a pattern.

We can try n = 2, n = 4, n = 8, n = 16 and observe the number of prints in the loop; of course, you can also just run the code. Skipping the details, here are the results:

n = 2   prints 1 time   T(n) = 1
n = 4   prints 2 times  T(n) = 2
n = 8   prints 3 times  T(n) = 3
n = 16  prints 4 times  T(n) = 4

What drives the execution count is the i *= 2 in the loop statement: on each pass, i doubles. Observing carefully, we can see the pattern:

n = 2   prints 1 time   T(n) = 1  // 2^1 = 2
n = 4   prints 2 times  T(n) = 2  // 2^2 = 4
n = 8   prints 3 times  T(n) = 3  // 2^3 = 8
n = 16  prints 4 times  T(n) = 4  // 2^4 = 16

From this pattern it is not hard to see that 2 ^ (execution count) = n, that is, 2^T(n) = n. Solving for T(n) gives the logarithm of n to base 2: T(n) = log_2 n.

"PS: it's time to do math again."

Logarithm: if a^n = b, that is, the n-th power of a equals b, then to express n we write n = log_a b, the logarithm of b to base a. Here a is the base, b is the argument, and n is the logarithm.

You may wonder why we only counted the prints inside the loop instead of every execution. As noted above, the fixed constants and coefficients are negligible. Let's run through it one last time:

The output in the middle runs log_2 n times; let i = 1 executes only once; and the i < n and i *= 2 statements each run about log_2 n times. Dropping constants and coefficients, and noting that logarithms of different bases differ only by a constant factor so the base is usually omitted, the time complexity of fn06 is O(log n).
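The log_2 n pattern for fn06 can be checked directly against Math.log2 (the function name `fn06Count` is mine; it counts the loop body instead of printing):

```javascript
// Count how many times fn06's loop body runs for a given n.
function fn06Count(n) {
  let count = 0
  for (let i = 1; i < n; i *= 2) {
    count++ // i takes the values 1, 2, 4, 8, ...
  }
  return count
}

// For n a power of two, the count is exactly log2(n).
for (const n of [2, 4, 8, 16]) {
  console.log(n, fn06Count(n), Math.log2(n))
}
```

For n = 16, i takes the values 1, 2, 4, 8, so the body runs 4 times: exactly log_2 16.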

"example 7:"

Finally, let me give you an example.

```javascript
function fn07(n, m) {
  for (let i = 0; i < n; i++) {
    let j = 1
    while (j < m) {
      console.log("do you understand")
      j = j * 2
    }
  }
}
```

As shown above, this function takes two parameters, corresponding to the outer and inner loops. Start with the inner loop; does it look familiar? It is the same as the loop in fn06, just a while instead of a for. Following the previous analysis, the inner loop's time complexity is O(log m).

Let's take a look at the outer loop, which is also explained above, and the time complexity of the outer loop is O (n).

The two loops are nested, so their counts multiply: the time complexity of fn07 is O(n * log m), conventionally written O(n log n) when both parameters are treated as the problem size. This is linear-logarithmic time complexity.
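Here is a counter sketch of fn07's total work (the function name `fn07Count` is mine, and it assumes the inner counter starts at 1 as in fn06): the outer loop runs n times and the inner doubling loop runs about log2(m) times, so the body executes roughly n * log2(m) times, and exactly that when m is a power of two.

```javascript
// Count how many times fn07's inner body runs for given n and m.
function fn07Count(n, m) {
  let count = 0
  for (let i = 0; i < n; i++) {
    let j = 1
    while (j < m) {
      count++ // inner body: about log2(m) per outer pass
      j = j * 2
    }
  }
  return count
}

console.log(fn07Count(3, 8)) // 9, i.e. 3 * log2(8)
```

This multiplicative structure is the general rule: nested, independent loops multiply their counts, which is why the complexities multiply too.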

Just like this function output, do you understand it?

"A chart says it best:"

The common time-complexity growth curves are easiest to compare in a chart (the original article includes one here): the x-axis is n, and each curve shows how the number of operations, and hence the execution time, grows as n increases.

Common orders of magnitude of time complexity are:

Constant O(1)

Logarithmic O(log n)

Linear O(n)

Linear-logarithmic O(n log n)

Quadratic O(n^2)

Cubic O(n^3)

K-th power O(n^k)

Exponential O(2^n)

From top to bottom, the time complexity grows larger and execution efficiency gets lower; the most commonly used ones are listed above.

So in the program or doing exercises, we should try to use the method with low time complexity.

Time complexity ends here. We also list common time complexity. Next, let's take a look at space complexity.

Space complexity

Space complexity is actually an expression of the amount of storage space occupied by an algorithm or a piece of code during operation.

We talked about time complexity above, so space complexity will be much simpler.

Space complexity is denoted S(n) and is also expressed in big O notation, so let's go straight to an example.

"example 1:"

```javascript
function fn001(n) {
  for (let i = 0; i < n; i++) {
    console.log("space complexity")
  }
}
```

First, we know that space complexity is about storage, and what determines storage? Declared variables. Look at the variable declarations in fn001: there is only one, i. The function opens up just one piece of space for i to use, so the space complexity S(n) is O(1). You may object that i keeps changing; yes, it does, but it is still a single number occupying the same amount of space.

It works like time complexity: when the variables declared in the code are a constant number and do not grow with the input, the space complexity is O(1).

"example 2:"

```javascript
function fn002(n) {
  let arr = []
  while (n--) {
    arr.push(n)
  }
}
```

In this example we declare an array, which can hold values of various types. In the loop we push elements into arr according to the size of n, so the array ends up with n elements and the space complexity is S(n) = O(n).
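To see that fn002's storage really grows with n, here is a variant that returns the array so its size can be inspected (returning `arr` is my addition for illustration):

```javascript
// Same loop as fn002, but returning the array to inspect its size.
function fn002(n) {
  let arr = []
  while (n--) {
    arr.push(n) // one stored element per loop pass
  }
  return arr
}

console.log(fn002(5).length) // 5
```

After the loop the array holds exactly n elements, one per pass, so the extra storage is proportional to the input size: O(n).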

"Space complexity Summary"

Only two examples are listed for space complexity because, in general, that is all we need: space complexity is usually O(1), O(n), or occasionally O(n^2), and anything else is rare. Once you know how to analyze time complexity, analyzing space complexity is easy, so we won't belabor it.

To analyze space complexity, start from the declared variables: look at the variables declared in the function body and how they change with the input.

One case not listed above is recursion. "Please note": recursive functions nest like Russian dolls, and each level of recursion opens a stack frame; the variables and other space consumed at each level live in that frame whether or not you declare any variables yourself. That space exists as long as the recursion runs, so with recursion the space complexity is basically at least O(n). Prefer iteration over recursion when you can.
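An illustrative contrast (both functions are my own examples): the two functions below compute the same sum of 1..n, but the recursive one keeps O(n) stack frames alive at the deepest point, while the iterative one uses O(1) extra space.

```javascript
// Recursive: one stack frame per level, so O(n) space.
function sumRecursive(n) {
  if (n <= 0) return 0
  return n + sumRecursive(n - 1)
}

// Iterative: only two variables regardless of n, so O(1) space.
function sumIterative(n) {
  let total = 0
  for (let i = 1; i <= n; i++) {
    total += i
  }
  return total
}

console.log(sumRecursive(100) === sumIterative(100)) // true
```

For large enough n the recursive version will even overflow the call stack, which the iterative version never does; this is the practical cost of that hidden O(n) space.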

Time VS space

At the beginning we said we evaluate an algorithm's efficiency along the two dimensions of time and space, but in practice they are usually a trade-off: you often cannot have both. An algorithm problem may have two solutions, one with lower time complexity but slightly higher space complexity, and one the other way around.

What then? In development we usually favor time over space. Think about it: a user does not care how much memory a program occupies, but does care how fast it runs. Moreover, storage can be expanded by spending money, while time cannot be bought so easily, so in many cases trading space for time is acceptable.

Although trading space for time is feasible, we should not do it blindly: reduce complexity along both dimensions as much as possible, and trade space for time only when the two genuinely conflict.
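A classic example of the space-for-time trade (the example is mine, not from the article): caching already-computed Fibonacci results. The naive recursion takes exponential time; adding an O(n) cache makes each value computed once, for linear time.

```javascript
// Fibonacci with memoization: O(n) extra space buys O(n) time
// instead of the naive recursion's exponential time.
function fibMemo(n, cache = new Map()) {
  if (n <= 1) return n
  if (cache.has(n)) return cache.get(n)
  const result = fibMemo(n - 1, cache) + fibMemo(n - 2, cache)
  cache.set(n, result) // spend space to avoid recomputation
  return result
}

console.log(fibMemo(30)) // 832040
```

Without the cache, fib(30) would recompute the same subproblems over a million times; with it, each of the 30 values is computed exactly once.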

When practicing algorithms, solving the problem is not the end: we should also analyze the time and space complexity of our solutions, compare the complexity of several approaches, and find the optimal one.

In an interview, if the interviewer gives you an algorithm question and you know several solutions, you could ask whether they prefer time-optimal or space-optimal — "I can solve it either way" — and then they may just ask you to write all of them!

Just kidding. Do try to write multiple solutions in an interview and explain the differences in time and space complexity of each; it makes a great impression.

This is the end of "how to understand time and space complexity of algorithms and data structures". Thank you for reading!
