
How to understand time and space complexity in web development



This article explains how to understand time and space complexity. Many people have questions about the topic, so the aim here is a simple, practical walkthrough. Let's get started!

An algorithm is a set of methods for manipulating data and solving programming problems. For the same problem, different algorithms can produce the same final result: sorting, for example, has ten or so classical algorithms plus a few joke ones, and while they all yield the same sorted output, the time and resources they consume can differ enormously; compare quicksort with bogosort ("monkey sort") :).

So how should we measure the advantages and disadvantages of different algorithms?

We measure them mainly along two dimensions: the "time" and the "space" an algorithm consumes.

Time dimension: the time it takes to execute the algorithm, usually described by its "time complexity".

Space dimension: the amount of memory the algorithm needs while running, usually described by its "space complexity".
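For a concrete, hypothetical illustration of the space dimension (the functions below are my own, not from the original article): both compute the same sum, but the first uses a constant amount of extra memory, while the second first copies its input into a new vector of size n.

#include <cstdio>
#include <numeric>
#include <vector>

// O(1) extra space: one accumulator variable, regardless of v.size().
long long sumInPlace(const std::vector<int>& v) {
    long long total = 0;
    for (int x : v) total += x;
    return total;
}

// O(n) extra space: makes a full copy of the input before summing it.
long long copyAndSum(const std::vector<int>& v) {
    std::vector<int> copy(v);
    return std::accumulate(copy.begin(), copy.end(), 0LL);
}

int main() {
    std::vector<int> v{1, 2, 3, 4, 5};
    std::printf("%lld %lld\n", sumInPlace(v), copyAndSum(v));
}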

Big O notation for time complexity

Big O notation: an algorithm's time complexity is usually written with the big O symbol, defined as T(n) = O(f(n)); it says that the function T(n) is bounded by f(n), i.e. T(n) grows no faster than f(n).

If the size of a problem is n and the time an algorithm needs to solve it is T(n), then T(n) is called the "time complexity" of that algorithm.
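As a small worked example (my own, under the definition above): suppose an algorithm performs T(n) = 3n + 2 elementary operations on an input of size n. Then T(n) = 3n + 2 ≤ 4n for every n ≥ 2, so T(n) = O(n): the constant factor 3 and the lower-order term 2 are absorbed into the bound, and only the growth rate remains.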

The Landau symbols used above were first introduced by the German mathematician Paul Bachmann in his 1892 book Analytic Number Theory and popularized by another German mathematician, Edmund Landau. Landau symbols describe the behavior of complicated functions in terms of simple ones, giving an upper or lower bound. When analyzing the complexity of an algorithm, usually only the big O symbol is used; the little o and Theta symbols of the Landau system appear less often. The O was originally a capital Greek omicron, though nowadays the capital Latin letter O is used; little o is likewise the lowercase Latin letter o, and Theta remains the uppercase Greek letter Θ.

The big O symbol is a relative representation of the complexity of an algorithm.

Several words in this sentence are important and deliberately chosen:

Relative: you can only compare like with like. You cannot meaningfully compare an algorithm that performs arithmetic multiplication with an algorithm that sorts a list of integers. However, comparing the arithmetic operations done by two algorithms (one doing multiplication, the other addition) does tell you something meaningful.

Representation: big O (in its simplest form) reduces the comparison between algorithms to a single variable, and that variable is chosen on the basis of observations or assumptions. For example, sorting algorithms are usually compared by the number of comparison operations (comparing two nodes to determine their relative order), which assumes that comparisons are computationally expensive. But what if comparisons are cheap and swap operations are expensive? That changes the comparison, as the sketch below illustrates.
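A hypothetical sketch of that point (the function and counters here are my own, not from the article): selection sort instrumented to count comparisons and swaps separately. Its comparisons grow roughly as n², but it performs at most n - 1 swaps, so which count matters depends on which operation is expensive.

#include <cstddef>
#include <cstdio>
#include <utility>
#include <vector>

// Selection sort instrumented with counters, to show that "cost" depends on
// which operation you decide to count: comparisons grow as O(n^2), swaps as O(n).
void selectionSort(std::vector<int>& a, long& comparisons, long& swaps) {
    comparisons = swaps = 0;
    for (std::size_t i = 0; i + 1 < a.size(); ++i) {
        std::size_t minIdx = i;
        for (std::size_t j = i + 1; j < a.size(); ++j) {
            ++comparisons;
            if (a[j] < a[minIdx]) minIdx = j;
        }
        if (minIdx != i) { std::swap(a[i], a[minIdx]); ++swaps; }
    }
}

int main() {
    std::vector<int> v{5, 3, 8, 1, 9, 2};
    long comparisons = 0, swaps = 0;
    selectionSort(v, comparisons, swaps);
    std::printf("comparisons=%ld swaps=%ld\n", comparisons, swaps);
}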

Complexity: if it takes me 1 second to sort 10,000 elements, how long will it take to sort 1 million? Complexity in this sense is a measure relative to something else.
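A rough, illustrative calculation (my own numbers, assuming the running time really follows the stated order): going from 10,000 elements to 1,000,000 elements multiplies n by 100.

With an O(n²) algorithm, the time grows by about 100² = 10,000×, so roughly 10,000 seconds (close to 3 hours).

With an O(n log n) algorithm, the time grows by about (10⁶ · log₂10⁶) / (10⁴ · log₂10⁴) ≈ 100 × (20 / 13.3) ≈ 150×, so roughly 150 seconds.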

Common orders of magnitude of time complexity

Let's start with the common orders of time complexity in big O notation (a combined loop-shape sketch follows this list):

Constant order O(1)

Linear order O(n)

Square order O(n²)

Logarithmic order O(log n)

Linear logarithmic order O(n log n)
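Here is a combined sketch (my own illustrative loops, not from the article) of the loop shapes that typically produce the orders listed above; the next two sections then walk through O(1) and O(n) in detail.

#include <cstdio>

// O(n^2): a loop nested inside a loop over the same input size.
void quadratic(int n) {
    long count = 0;
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j)
            ++count;
    std::printf("O(n^2) iterations: %ld\n", count);
}

// O(log n): the remaining problem size halves on every step.
void logarithmic(int n) {
    long count = 0;
    for (int i = n; i > 1; i /= 2)
        ++count;
    std::printf("O(log n) iterations: %ld\n", count);
}

// O(n log n): an O(log n) loop executed n times.
void linearithmic(int n) {
    long count = 0;
    for (int i = 0; i < n; ++i)
        for (int j = n; j > 1; j /= 2)
            ++count;
    std::printf("O(n log n) iterations: %ld\n", count);
}

int main() {
    quadratic(1000);
    logarithmic(1000);
    linearithmic(1000);
}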

O(1)

As long as a piece of code contains no loop or recursion whose length depends on the input size, it runs in constant time no matter how many lines it has, and its time complexity is O(1).

void swapTwoInts(int &a, int &b) {
    int temp = a;
    a = b;
    b = temp;
}

O(n)

In the following code, the body of the for loop executes n times, so the time it takes grows with n, and its time complexity is O(n).

int sum(int n) {
    int ret = 0;
    for (int i = 0; i <= n; i++) {
        ret += i;
    }
    return ret;
}
