Case Analysis of Time Complexity and Space Complexity in Java


This article mainly explains case analysis of time complexity and space complexity in Java. The approach introduced here is simple, fast, and practical; interested readers may wish to take a look and follow along.

1. Algorithm efficiency

There are two kinds of algorithm efficiency: time efficiency and space efficiency. Time efficiency is measured by time complexity, while space efficiency is measured by space complexity. Time complexity mainly measures how fast an algorithm runs, while space complexity mainly measures the extra space an algorithm needs. In the early days of computing, storage capacity was very small, so space complexity mattered a great deal. With the rapid development of the computer industry, however, storage capacity has reached a very high level, so we no longer need to pay as much special attention to the space complexity of an algorithm.

2. Time complexity

1. The concept of time complexity

The time an algorithm takes is directly proportional to the number of statements executed, so the number of basic operations in the algorithm is taken as its time complexity. In other words, when we look at the time complexity of a piece of code, the main task is to work out how many times its most frequently executed statement runs.

2. Big O asymptotic notation

Look at the figure (not reproduced here) and analyze:

As the value of N becomes larger and larger, the 2N and 10 terms can be ignored.
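A minimal sketch, assuming the function being counted performs F(N) = N^2 + 2N + 10 basic operations (consistent with the 2N and 10 mentioned above); the function name func1 is hypothetical:

// Hypothetical example: count++ runs N*N + 2*N + 10 times,
// so the basic-operation count is F(N) = N^2 + 2N + 10.
void func1(int N) {
    int count = 0;
    for (int i = 0; i < N; i++) {       // N * N iterations
        for (int j = 0; j < N; j++) {
            count++;
        }
    }
    for (int k = 0; k < 2 * N; k++) {   // 2N iterations
        count++;
    }
    int M = 10;
    while (M-- > 0) {                   // 10 iterations
        count++;
    }
    System.out.println(count);
}

For N = 100, F(N) = 10000 + 200 + 10 = 10210, which is already dominated by the N^2 term.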

In practice, when we calculate the time complexity, we do not need to count the exact number of executions, only the approximate order of magnitude, so we use the Big O asymptotic notation.

Big O notation: a mathematical notation used to describe the asymptotic behavior of a function.

1. Replace all additive constants in the running time with the constant 1.

2. In the modified count of runs, keep only the highest-order term.

3. If the highest-order term exists and its coefficient is not 1, remove that coefficient. The result is the Big O order.

From the above, we can see that Big O asymptotic notation removes the terms that have little impact on the result, and concisely expresses the number of executions.
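Applying these rules to the assumed F(N) = N^2 + 2N + 10 from the sketch above: rule 1 turns the additive constant 10 into 1, rule 2 keeps only the highest-order term N^2, and rule 3 leaves it unchanged because its coefficient is already 1, so the Big O order is O(N^2).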

In addition, some algorithms have best-case, average-case, and worst-case time complexities:

Worst case: the maximum number of runs for any input size (upper bound)

Average case: the expected number of runs for any input size

Best case: the minimum number of runs for any input size (lower bound)

For example: searching for a value x in an array of length N

Best case: found after 1 comparison

Worst case: found after N comparisons

Average case: found after about N/2 comparisons

In practice, we generally focus on the worst-case behavior of an algorithm, so the time complexity of searching for data in the array is O(N).
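The original article does not show the search code; a straightforward linear search sketch that matches the best/worst/average analysis above could be:

// Linear search in an unsorted array: O(N) worst-case time.
static int search(int[] array, int x) {
    for (int i = 0; i < array.length; i++) {
        if (array[i] == x) {
            return i;   // best case: hit on the first comparison
        }
    }
    return -1;          // worst case: all N elements were compared
}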

3. Calculating time complexity

Example 1:

The basic operation is performed 2N+10 times. By the Big O derivation rules, the time complexity is O(N).
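The code for this example is not reproduced in this version of the article; a sketch consistent with the description (the counter is incremented 2N + 10 times; the name func2 is hypothetical):

// Hypothetical func2: count++ runs 2N + 10 times, so the time complexity is O(N).
void func2(int N) {
    int count = 0;
    for (int k = 0; k < 2 * N; k++) {   // 2N iterations
        count++;
    }
    int M = 10;
    while (M-- > 0) {                   // 10 iterations
        count++;
    }
    System.out.println(count);
}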

Example 2:

The basic operation is performed M+N times. There are two unknowns, M and N, so the time complexity is O(N+M).
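Again the original code is missing; a sketch matching the description (two independent loop bounds M and N; the name func3 is hypothetical):

// Hypothetical func3: count++ runs M + N times, so the time complexity is O(N + M).
void func3(int N, int M) {
    int count = 0;
    for (int k = 0; k < M; k++) {   // M iterations
        count++;
    }
    for (int k = 0; k < N; k++) {   // N iterations
        count++;
    }
    System.out.println(count);
}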

Example 3:

The basic operation is performed 100 times, a constant. By the Big O derivation rules, the time complexity is O(1).
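A sketch matching the description (a fixed 100 iterations regardless of N; the name func4 is hypothetical):

// Hypothetical func4: count++ runs a constant 100 times, so the time complexity is O(1).
void func4(int N) {
    int count = 0;
    for (int k = 0; k < 100; k++) {
        count++;
    }
    System.out.println(count);
}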

Example 4: calculating the time complexity of bubble sorting

The basic operation is performed N times in the best case and N*(N-1)/2 times in the worst case. By the Big O derivation rules, and since time complexity is generally taken as the worst case, the time complexity is O(N^2).
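The bubble sort being analyzed is not shown here; a standard implementation with an early-exit flag (which gives the best case of about N comparisons on already-sorted input) looks like this:

// Bubble sort: worst case N*(N-1)/2 comparisons -> O(N^2);
// best case (already sorted, with the early-exit flag) about N comparisons -> O(N).
static void bubbleSort(int[] array) {
    for (int end = array.length; end > 0; end--) {
        boolean sorted = true;
        for (int i = 1; i < end; i++) {
            if (array[i - 1] > array[i]) {
                int tmp = array[i - 1];
                array[i - 1] = array[i];
                array[i] = tmp;
                sorted = false;
            }
        }
        if (sorted) {
            break;   // no swaps in this pass: the array is already sorted
        }
    }
}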

Example 5: time complexity of binary search

The basic operation is performed once in the best case and logN times in the worst case, so the time complexity is O(logN). Note: in algorithm analysis, logN means the base-2 logarithm of N; some places write it as lgN. (It is recommended to explain where the logN comes from with the paper-folding analogy: each binary search step eliminates half of the unsuitable values, so after one split N/2 values remain, after two splits N/2/2 = N/4 remain, and so on until only one value is left, which takes about logN splits.)
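The binary search code is likewise not reproduced; a standard iterative version on a sorted array:

// Binary search on a sorted array: each step halves the range -> O(logN) time.
static int binarySearch(int[] array, int value) {
    int left = 0;
    int right = array.length - 1;
    while (left <= right) {
        int mid = left + (right - left) / 2;   // avoids overflow of (left + right)
        if (array[mid] < value) {
            left = mid + 1;
        } else if (array[mid] > value) {
            right = mid - 1;
        } else {
            return mid;
        }
    }
    return -1;   // not found
}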

Example 6: calculating the time complexity of factorial recursion

The time complexity of recursion = the number of recursive calls × the number of basic operations performed in each call

Through calculation and analysis, we find that the basic operation recurses N times, so the time complexity is O(N).
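A typical recursive factorial matching this analysis (one call per value of N, constant work per call):

// Factorial recursion: factorial(N) makes N recursive calls,
// each doing a constant amount of work -> O(N) time.
static long factorial(int N) {
    return N < 2 ? 1 : factorial(N - 1) * N;
}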

Example 7: calculating the time complexity of Fibonacci recursion

Through calculation and analysis, we find that the basic operation recurses about 2^N times, so the time complexity is O(2^N).

Pattern:

2^0 + 2^1 + 2^2 + 2^3 + ... + 2^(n-1)

Sum of a geometric series

With first term a1 = 2^0 = 1 and common ratio q = 2, the sum is a1(1 - q^n) / (1 - q) = (1 - 2^n) / (1 - 2) = 2^n - 1, which is on the order of 2^n, so the time complexity is O(2^n).
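A typical recursive Fibonacci matching this analysis (each call spawns two more calls):

// Recursive Fibonacci: the call tree roughly doubles at every level,
// giving on the order of 2^N calls -> O(2^N) time.
static long fibonacci(int N) {
    return N < 2 ? N : fibonacci(N - 1) + fibonacci(N - 2);
}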

3. Space complexity

Space complexity is a measure of the additional storage space an algorithm temporarily occupies while it runs. Space complexity is not the number of bytes the program occupies, because that is not very meaningful; instead, space complexity counts the number of variables. Its calculation rules are basically similar to those of time complexity, and the Big O asymptotic notation is also used.

Example 1: calculating the space complexity of bubble sort

A constant amount of extra space is used, so the space complexity is O(1).

Example 2: calculating the space complexity of Fibonacci

N extra spaces are dynamically allocated, so the space complexity is O(N).
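The code for this example is not shown in this version; a sketch consistent with the description (an iterative Fibonacci that stores every intermediate value in a dynamically allocated array; the name fibonacciIterative is hypothetical):

// Iterative Fibonacci that keeps all intermediate results:
// it allocates an array of N + 1 longs, so the space complexity is O(N).
static long[] fibonacciIterative(int n) {
    long[] fibArray = new long[n + 1];
    fibArray[0] = 0;
    if (n > 0) {
        fibArray[1] = 1;
        for (int i = 2; i <= n; i++) {
            fibArray[i] = fibArray[i - 1] + fibArray[i - 2];
        }
    }
    return fibArray;
}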

Example 3: calculating the space complexity of factorial recursion

The function is called recursively N times, opening up N stack frames, and each stack frame uses a constant amount of space, so the space complexity is O(N).

At this point, I believe you have a deeper understanding of case analysis of time complexity and space complexity in Java. Try working through the examples in practice, follow us, and keep learning!
