How to master time complexity and space complexity

2025-01-29 Update, from SLTechnology News&Howtos


This article explains how to master time complexity and space complexity. The content is simple and clear, and easy to learn and understand.

Preface

An algorithm is a set of methods for manipulating data and solving programming problems. Algorithms are not only a staple of interviews at large tech companies, but also an essential skill for every senior programmer. Many different algorithms can solve the same problem, yet they may differ greatly in efficiency and in the storage space they require.

So, what metrics can we use to measure the quality of an algorithm? The efficiency mentioned above is described by the algorithm's time complexity, and the storage space it occupies is described by its space complexity.

Time complexity: evaluates the time a program takes to execute, and lets you estimate how heavily the program uses the processor.

Space complexity: evaluates the memory a program occupies during execution, and lets you estimate how heavily the program uses computer memory.

In practice and in interviews, we should not only be able to write a concrete algorithm but also understand its time complexity and space complexity, so that we can judge the algorithm's strengths and weaknesses. When time complexity and space complexity cannot both be optimized at the same time, we need to choose a balance between them.

An algorithm usually has three cases: best, average, and worst, and we generally focus on the worst case. The worst case gives an upper bound on the algorithm's running time. For some algorithms the worst case occurs quite often, which means the average case can be nearly as bad as the worst case.
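A classic illustration of best, average, and worst cases is linear search. The sketch below (not from the original article, just a minimal example) shows a search whose cost depends on where, or whether, the target appears:

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if absent."""
    for i, value in enumerate(items):
        if value == target:
            return i  # best case: target is first, one comparison
    return -1         # worst case: target absent, len(items) comparisons

# Best case: 1 comparison. Worst case: n comparisons, the upper bound
# on the running time. On average, about n/2 comparisons.
assert linear_search([3, 1, 4, 1, 5], 4) == 2
assert linear_search([3, 1, 4], 9) == -1
```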

Generally speaking, time complexity causes problems more often than space complexity, so more research focuses on time complexity. Unless stated otherwise in an interview, "complexity" also refers to time complexity.

Time complexity

The most intuitive way to obtain an algorithm's time complexity is to actually run the program and measure it. In practice, however, measurement is limited by the test environment, the data scale, and other factors, so direct testing is either hard to carry out or carries a large error. In theory there is no need to test every algorithm; we only need an evaluation metric that captures the basic trend of the time the algorithm consumes.

Time frequency

In general, the time an algorithm takes is proportional to the number of statements it executes: the more statements executed, the more time it takes. We call the number of statement executions in an algorithm its time frequency, denoted T(n).
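To make time frequency concrete, here is a minimal sketch (the function name is illustrative, not from the article) that counts how many times each statement executes for an input of size n:

```python
def sum_to_n(n):
    total = 0                    # executes 1 time
    for i in range(1, n + 1):
        total += i               # executes n times
    return total                 # executes 1 time

# Time frequency: T(n) = n + 2. The dominant term is n, so the
# asymptotic time complexity is O(n).
assert sum_to_n(10) == 55
```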

Asymptotic time complexity

In the time frequency T(n), n denotes the size of the problem. As n changes, T(n) changes with it. If we want to know what law T(n) follows as n grows, we need to introduce the concept of time complexity.

In general, the number of times an algorithm's basic operations are repeated is a function of the problem size n, expressed by the time frequency T(n). If there is a function f(n) such that the limit of T(n)/f(n) is a nonzero constant as n tends to infinity, then f(n) is a function of the same order of magnitude as T(n), written T(n) = O(f(n)). O(f(n)) is called the asymptotic time complexity of the algorithm, or time complexity for short.

Asymptotic time complexity is written with a capital O, so it is also called big-O notation. The time complexity function of an algorithm is T(n) = O(f(n)).

T(n) = O(f(n)) means there exists a constant C such that T(n) ≤ C · f(n) always holds as n tends to positive infinity. Simply put, T(n) is roughly as large as f(n) for large n; that is, C · f(n) is an upper bound on T(n). Although there is no restriction on f(n), we usually take it to be as simple a function as possible.
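The role of the constant C can be checked numerically. As a small sketch (the particular T(n) = 3n + 2 is an invented example, not from the article), take f(n) = n and C = 4; then T(n) ≤ C · n holds for every n ≥ 2, so T(n) = O(n):

```python
def T(n):
    """An example time frequency: 3n + 2 statement executions."""
    return 3 * n + 2

# With f(n) = n and C = 4, the bound T(n) <= C * f(n) holds for n >= 2,
# which is exactly what T(n) = O(n) requires.
C = 4
assert all(T(n) <= C * n for n in range(2, 10_000))
```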

The common time complexities are O(1) constant, O(log n) logarithmic, O(n) linear, O(n log n) linearithmic, O(n²) quadratic, O(n³) cubic, O(n^k) k-th power, and O(2^n) exponential.
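A few of these classes can be illustrated with short functions. The sketch below (the function names are illustrative) shows O(1), O(log n), and O(n²) side by side; binary search is the standard O(log n) example because it halves the search range on every step:

```python
def first_item(items):
    """O(1): a single step, regardless of input size."""
    return items[0]

def binary_search(items, x):
    """O(log n): halves the sorted search range each iteration."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == x:
            return mid
        if items[mid] < x:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def pair_sums(items):
    """O(n^2): nested iteration produces n * n sums."""
    return [a + b for a in items for b in items]
```

For a list of one million sorted elements, `binary_search` needs at most about 20 comparisons, while `pair_sums` would build a trillion entries, which is the practical gap these classes describe.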

[Figure omitted: growth curves of the common complexity functions.] As the problem size n increases, functions of higher time complexity grow faster, and the algorithm's execution efficiency becomes correspondingly lower.

The common time complexities, ordered from smallest to largest, are: O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2^n).
