
Why do we need quantum computing?

2025-04-01 Update | From: SLTechnology News&Howtos


This article comes from the WeChat official account (ID: fanpu2019); author: innocence.

Author's note

There are already plenty of popular science articles about quantum computing, and I have written some myself, but this one takes a slightly different perspective: why do we need quantum computing? Why has it attracted so much attention in recent years? If we can answer these questions, some readers may be relieved to find that quantum computing is not a flight of fancy by scientists, but a product of this era. Just as quantum mechanics and the theory of relativity are glorious imprints left by humanity in the 20th century, quantum computing may become another lasting imprint left by humanity in the 21st century.

We live in the age of computing.

The human desire for computing power is never-ending. Ever since the days of keeping records by tying knots, improvements in computing power have been closely bound to the progress of civilization; the Pythagorean school of ancient Greece even regarded number as truth itself. Today we are so used to the benefits of computation that most of us overlook its greatness. When we swipe across the screen, type in a keyword, and the search engine pops up the results we want within a fraction of a second, how many people know how much "computation" is going on behind the scenes? When we are happily scrolling through videos, how many people realize that a machine is working flat out to decide which video to push to us next? When the epidemic was at its most severe and each of us cooperated by scanning health codes and taking nucleic acid tests, how many people could sense the contribution of "computation" to the fight against the epidemic? Our computing power is now at a peak: machines have conquered Go, the last intellectual bastion humans were proud of, and are now setting out to conquer autonomous driving and the metaverse. It is fair to say that we live in an age of computing.

Knot records of the Inca civilization

Today's chips owe their extraordinary computing power to a nonlinear element called the transistor. It is made of silicon, the most ordinary material in nature, yet it condenses the finest of human ingenuity. It can be found in every corner around us, yet it is born in the cleanest dust-free factories. It is changing our lives at breathtaking speed, yet today we in China find ourselves constrained by others because of it. Such is the chip.

In top silicon semiconductor chips, tens of billions of transistors follow a binary logic known as Boolean algebra. This logic is not especially efficient, but it is so flexible and universal that, after more than fifty years of exponential growth under Moore's Law, it has wiped out all of its rivals and become almost the only tool of computation.

It has been more than fifty years since Moore's Law was proposed, and it still holds today, with computing power growing exponentially alongside it. That Moore's Law must end sooner or later, as transistors shrink toward the nanometer scale, is by now a cliché. What I want to point out is that in today's Internet age, even if Moore's Law remained in force for a long time, the growth of computing power would still fall far behind the expansion of data on the Internet. The amount of information we can mine from the Internet by computation is pitifully small compared with the amount of information the Internet actually contains. If we picture the data as a mine and computing power as the mining machine, the machine keeps shrinking relative to the mine. Against this backdrop, the need for new computing power beyond the current paradigm is looming, and we can understand why companies like Google pay so much attention to quantum computing and do not hesitate to step into the arena themselves: they own the mines. Imagine sitting on a gold mine with no tools but your bare hands to dig.

Fifty years of Moore's Law

Quantum computing steps into reality

Having said all that, we finally come to quantum computing. When many people hear the word "quantum", they immediately associate it with mysterious phenomena: if it is not something that is both a wave and a particle, it is something that teleports in an instant. When I talk to people about quantum physics, what I fear most is getting dragged into discussions of nihilism or theories of cognition, because I am an experimentalist, not a philosopher. I prefer to look at quantum mechanics from a pragmatic point of view: it accurately describes how matter behaves at the most fundamental level, and so far it has remained remarkably accurate. So let us see what extraordinary things can be done under quantum rules. Quantum computing is surely one of the boldest ideas of the last century, because at that time the ability to control the quantum world was so far from what it is now that the first important quantum algorithms, including the Shor algorithm and the Grover algorithm, were actually developed by mathematicians, who studied them as mathematical toys and hardly expected them to be physically realized.

Entering the 21st century, the situation changed greatly. The 2012 Nobel Prize in Physics was awarded to Serge Haroche and David J. Wineland for their "ground-breaking experimental methods that enable measuring and manipulation of individual quantum systems". For the first time, they trapped atoms and used the interaction between light and atoms to manipulate and measure atomic quantum states, which was in fact the beginning of ion-trap quantum computing. This work opened the door to manipulating and reading out quantum states and lit a fire of hope for the physical realization of quantum computing. From then on, qubits, quantum gates and quantum computing no longer remained purely mathematical and theoretical constructs.

Around the turn of the century there was another breakthrough, no less important than the work of the 2012 Nobel laureates. The team of Jaw-Shen Tsai at the Institute of Physical and Chemical Research (RIKEN) in Japan observed quantum coherent oscillations on a superconducting "island" for the first time. The biggest difference from the work of Haroche and Wineland is that this was a "macroscopic quantum system": a macroscopic number of electrons take part in the quantum process. This "superconducting Cooper pair box" was the forerunner of superconducting quantum computing, today one of the most closely watched candidate platforms. A macroscopic quantum system is easy to manipulate and easy to read out, and its fabrication process is largely compatible with semiconductor chips, which led to an explosion of vitality over the following decade.

Macroscopic qubit: the Cooper pair box. Source: Nakamura, Y., Pashkin, Yu. A. & Tsai, J. S. Coherent control of macroscopic quantum states in a single-Cooper-pair box. Nature 398, 786-788 (1999).

The early superconducting qubits, including the "Cooper pair box" mentioned above as well as flux qubits and phase qubits, solved many technical problems of manipulation, coupling and readout, but they remained trapped by one crucial metric: the decoherence time, the quantum "lifetime". The decoherence time is the characteristic time over which a system's quantum properties fade and it relaxes toward classical behavior. We know that no system can be completely isolated, otherwise it might as well not exist; a qubit that is supposed to "compute" can be isolated even less, for it must interact with the outside world, or how else could we manipulate and measure it? That interaction inevitably leaks quantum information away. Natural particles such as atoms can have long lifetimes because they interact only very weakly with photons, but this is a double-edged sword: the weak interaction preserves their quantum character, yet the same weak interaction makes them hard to manipulate and measure. This partly explains why Haroche and Wineland won the Nobel Prize for this work: it is genuinely hard.

The situation of superconducting qubits is exactly the opposite. The energy levels that form the qubit arise from the collective behavior of a macroscopic number of Cooper pairs, in a solid-state system whose environment is far harsher than that of an isolated atom. Stray photons from who-knows-where, leftover electrons, and charge and magnetic-field fluctuations driven by external electromagnetic disturbances all affect the qubit. Moreover, because it is a macroscopic degree of freedom, its coupling to these external degrees of freedom is also strong, so the qubit loses its information in a very short time. But for the same reason, by tuning electromagnetic fields we can also manipulate and read it out in a very short time, so fast that there is no time even to say "pull up the radish"...

The decoherence-time problem took a turn for the better around 2007. Researchers in the field had already noticed that increasing capacitance suppresses charge noise, and Koch at Yale University and Jianqiang You in China systematically studied the effect of adding a shunt capacitance on the decoherence time in the Cooper pair box system and the flux qubit system, respectively. The former became the now-popular transmon qubit. Since then, the decoherence time of superconducting qubits has climbed quickly to the range of 10 to 100 microseconds, which is very long compared with gate operation times of about 10 nanoseconds. Soon afterwards, the Martinis group at the University of California, Santa Barbara proposed a scalable architecture based on transmon qubits together with a systematic electronics solution, laying the foundation for the engineering of superconducting quantum computing. The rest of the story is that this group joined Google and built the "Sycamore" chip, producing the sensational milestone of "quantum supremacy". That story deserves an article of its own, so let us set it aside for now.

Google's Sycamore chip (source: wikipedia.org)

In short, quantum computing has gradually turned from a mathematician's toy and a theoretical physicist's fantasy into reality. Behind this lie the efforts of a great many experimental physicists and engineers, hardships that outsiders can scarcely appreciate. In any case, with these experiments, technical advances and accumulation, we are now qualified to talk about the future of quantum computing and to boast about how it will crush traditional computing. So next, let the boasting begin!

The qubit, the magic weapon of quantum computing

The concept of the bit originated with Shannon's information theory, though records suggest the term was coined by a mathematician somewhat earlier, in the 1940s. It denotes the minimum unit of information under binary (Boolean) logic. In a traditional computer, information is encoded, processed, transmitted and read out in bits. In the quantum world, the smallest unit of information becomes the qubit; information is still encoded, processed, transmitted and read out, but now in the quantum domain. Logically, a qubit is a coherently superposed two-state system; physically, it is a distinguishable (quasi-)two-level system. Several qubits together form a composite system, and if they can be entangled, it is time to witness a miracle.
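The "coherently superposed two-state system" above can be made concrete with a few lines of linear algebra. The sketch below is my own illustration, not part of the original article: it represents a qubit as a normalized two-component complex vector and reads off the probabilities of measuring 0 or 1.

```python
# A minimal numerical picture of a single qubit: a normalized vector
# alpha|0> + beta|1>, whose measurement statistics follow the Born rule.
import numpy as np

alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)    # an equal superposition
psi = np.array([alpha, beta])

assert np.isclose(np.vdot(psi, psi).real, 1.0)   # normalization: |alpha|^2 + |beta|^2 = 1

p0 = abs(psi[0]) ** 2                            # probability of measuring 0
p1 = abs(psi[1]) ** 2                            # probability of measuring 1
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")       # 0.50, 0.50
```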

Claude Shannon, founder of information theory (source: Internet)

Entanglement is unique to the quantum world. It conceals very profound physics that we still cannot fully understand, but its existence has been confirmed by a large number of experiments. Take the composite system formed by two qubits as an example: the system can be in a definite quantum state, and viewed as a whole it is quantum; but once you look at one particular qubit on its own, it no longer behaves quantum-mechanically. In other words, the composite system can only be understood as a whole; inspecting its subsystems alone yields no information. Mathematically, an entangled system lives in a larger direct-product space, whose dimension grows exponentially with the number of bits. Here are a few frightening numbers: at N = 50, the dimension of this space is roughly the number of operations today's most advanced supercomputers perform in one second; by N = 300, the dimension already exceeds the total number of atoms in the known universe (for comparison, a glass of water contains about 10^23 atoms).
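To put numbers on that exponential growth, here is a quick check. This is my own illustration; the comparison constants (roughly 10^18 operations per second for an exascale supercomputer and roughly 10^80 atoms in the observable universe) are common order-of-magnitude estimates, not figures from the article.

```python
# The dimension of the state space of N qubits is 2**N.
SUPERCOMPUTER_OPS_PER_SECOND = 1e18   # order of magnitude for an exaflop machine
ATOMS_IN_OBSERVABLE_UNIVERSE = 1e80   # commonly quoted order of magnitude

print(f"N = 50:  2^50  = {2**50:.2e}")    # ~1.1e15, within a few orders of an exaflop-second
print(f"N = 300: 2^300 = {2**300:.2e}")   # ~2.0e90, far beyond the atom count above
print(2**300 > ATOMS_IN_OBSERVABLE_UNIVERSE)  # True
```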

This terrifying dimensional expansion brought by entanglement provides an enormous encoding space for computational problems, so that some problems can be given far more efficient solutions in higher dimensions. After more than a hundred years of development, traditional computers and their theory can already solve many problems efficiently, but many others remain out of reach: weather forecasts, stock prices, cancer drugs. If such problems could be computed exactly, our world would become wonderfully pleasant, and perhaps rather boring; we could, for instance, compute exactly by how many goals the Chinese football team will lose its next match. Unfortunately, quantum computing cannot solve these problems either. Then why are we trying so hard?! Don't worry: we have found that some problems can be solved with astonishing efficiency within the framework of quantum computing, and these problems matter a great deal.

One of them is the famous Shor algorithm. On today's Internet, when we browse the web and type in a user name and password, how do we make sure no one is peeking? How do we keep others from stealing our bank card passwords? Some people say: just cover it. In fact, on the Internet, without the protection of an encryption system this information is almost transparent. Another feature of the Internet is that information reaches any corner of the earth instantly: the person peeking at your password may be lounging in Mauritius, drinking coconut water and picking his feet. Traditional point-to-point encryption does not suit the Internet: as the number of nodes grows, merely storing the keys becomes a disaster. An asymmetric encryption system, the RSA cipher, solves this problem effectively. "Asymmetric" means that the keys used for encryption and decryption are different: a private key for decryption and a public key for encryption. The public key is public and anyone can obtain it. If Li Si wants to send some unspeakable material to Zhang San, he encrypts it with the public key Zhang San has published; when Zhang San receives it, he opens it with his private key and enjoys it. If some Wang Wu secretly covets the material, sorry: he may hold the public key, but without the private key it cannot be opened. Because everyone who wants to communicate with Zhang San can share the same public key, this scheme greatly reduces the key resources required.
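To make the asymmetry concrete, here is a toy RSA demonstration with absurdly small primes. This is my own illustration, not from the article; real deployments use keys of 1024 bits and up, padding schemes, and vetted libraries.

```python
# Toy RSA with tiny primes, for illustration only.
from math import gcd

p, q = 61, 53                 # two "large" primes (laughably small here)
n = p * q                     # public modulus
phi = (p - 1) * (q - 1)       # Euler's totient of n
e = 17                        # public exponent, coprime with phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)           # private exponent: modular inverse of e mod phi

message = 42                           # must be smaller than n
ciphertext = pow(message, e, n)        # Li Si encrypts with Zhang San's public key (n, e)
recovered = pow(ciphertext, d, n)      # Zhang San decrypts with his private key d

print(n, ciphertext, recovered)        # 3233, ..., 42
assert recovered == message
```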

This encryption system has protected the Internet for many years and rarely goes wrong. Its security rests on a mathematical fact: the difficulty of factoring large numbers. Take two known large primes and multiply them into a larger number; a careful junior high school student can work out the result. But go the other way: tell someone the product and ask which two primes were multiplied, and even top mathematicians are left dumbfounded. To date, humanity's proudest achievement on this front is the factoring of RSA-768:

1230186684530117755130494958384962720772853569595334792197322452151726400507263657518745202199786469389956474942774063845925192557326303453731548268507917026122142913461670429214311602221240479274737794080665351419597459856902143413

=

33478071698956898786044169848212690817704794983713768568912431388982883793878002287614711652531743087737814467999489

×

36746043666799590428244633799627952632279158164343087642676032283815739666511279233373417143396810270092798736308917

RSA-1024 and RSA-2048 are widely used today; the number after "RSA" indicates the key length in bits. Because the difficulty of factoring explodes with the size of the problem, classical computers can only keep a respectful distance.

The Shor algorithm benefits from the exponential speed-up of the quantum Fourier transform and can solve this problem with (quasi-)polynomial effort: a cracking task that would otherwise take millions of years is cut down to the order of seconds, a true dimensionality-reduction strike. The power of the Shor algorithm is terrifying, but in the 20th century it was not yet a worry: with the technology of the time, implementing it was harder than going to Mars.
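For readers curious where the quantum Fourier transform enters, here is a classical sketch (my own illustration) of the number-theoretic reduction Shor's algorithm relies on: factoring N reduces to finding the order r of a random base a modulo N. The brute-force order search below only works for tiny N; it is exactly this step that the quantum Fourier transform performs efficiently for huge N.

```python
# Factoring via order finding: the classical skeleton of Shor's algorithm.
import math
import random

def find_order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r = 1 (mod n), found by brute force (tiny n only)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_reduction(n: int) -> tuple[int, int]:
    """Try random bases until the order-finding trick yields a nontrivial factor of n."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:                      # lucky: a already shares a factor with n
            return g, n // g
        r = find_order(a, n)
        if r % 2:                      # need an even order
            continue
        y = pow(a, r // 2, n)
        if y == n - 1:                 # a**(r/2) = -1 (mod n) gives nothing
            continue
        p = math.gcd(y - 1, n)
        if 1 < p < n:
            return p, n // p

print(shor_reduction(15))              # e.g. (3, 5)
```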

The situation is different now, for the reasons discussed above, and everyone is nervous, because one of the most unsettling facts in cryptography is that you can never be sure whether your cipher has already been broken. Moreover, ciphertext that cannot be broken today can simply be stored; even if it is only cracked twenty years later, it may still be damaging enough. The appearance of the Shor algorithm, and especially the prospect of actually implementing it, therefore forces people to look actively for new forms of encryption. China favors quantum communication and leads the world in it, the Americans are quietly working away at quantum cryptography, and the Europeans have no intention of being left behind. In any case, this is a problem that urgently needs solving: if any party is first to master the means of cracking, the international balance will be upset in an instant, with unimaginable consequences.

Another useful quantum algorithm is the Grover algorithm: searching for a target in an unstructured array is about √N times faster than the classical approach, where N is the length of the array. Compared with the Shor algorithm this acceleration looks modest, but the algorithm may be even more useful, because search underlies the solution of many problems and is an important means of mining information. When N is very large, the gain is very significant. And doesn't the vast amount of data generated on the Internet at every moment correspond exactly to this very-large-N situation?
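A rough sense of that √N gain, with my own back-of-the-envelope numbers (this is a query-count comparison, not a quantum simulation): a classical unstructured search needs on the order of N queries, while Grover's algorithm needs roughly (π/4)·√N oracle calls.

```python
# Classical vs Grover query counts for unstructured search.
import math

for n in (10**6, 10**12, 10**18):
    classical = n // 2                                   # expected classical queries
    grover = math.ceil(math.pi / 4 * math.sqrt(n))       # Grover iterations ~ (pi/4) * sqrt(N)
    print(f"N = {n:.0e}: classical ~ {classical:.1e}, Grover ~ {grover:.1e}")
```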

After this long journey we must face reality: the two algorithms above, and the algorithms derived from them, demand extremely low error rates in manipulation and readout, almost requiring the qubits to be perfect and error-free. The problem is that every physical system makes mistakes, and no actual operation is perfectly accurate. We can correct errors by building in redundancy, which was also an important topic in the early days of traditional computing. Interestingly, in today's semiconductor chips the probability of error is so low that error correction has become completely unnecessary. Just as these theoretical legacies of error correction were about to be lost, quantum computing inherited them.
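As a reminder of what "correcting errors by creating redundancy" means in the simplest classical setting, here is a sketch (my own illustration) of the three-bit repetition code with majority-vote decoding. Quantum codes such as the surface code are far more elaborate, but the basic idea of spending extra (qu)bits to detect and undo errors is the same.

```python
# Three-bit repetition code: triplicate a bit, send it through a noisy channel,
# then recover it by majority vote.
import random

def encode(bit: int) -> list[int]:
    return [bit, bit, bit]                               # triplicate the logical bit

def noisy_channel(bits: list[int], p: float) -> list[int]:
    return [b ^ (random.random() < p) for b in bits]     # flip each bit with probability p

def decode(bits: list[int]) -> int:
    return int(sum(bits) >= 2)                           # majority vote

p = 0.05
trials = 100_000
errors = sum(decode(noisy_channel(encode(0), p)) != 0 for _ in range(trials))
print(f"raw error rate: {p}, logical error rate: {errors / trials:.4f}")
# With p = 0.05 the logical error rate drops to roughly 3*p**2 ~ 0.007.
```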

Quantum error correction is a major hurdle on the way to quantum computing and will be hard to achieve in the short term, even though topological error-correcting codes such as the surface code have been found that bring the required error rates down to levels today's technology can accept. This is a very complex problem at the intersection of science and engineering. Only when the number of qubits reaches the thousand scale, and manipulation, isolation and readout technologies advance in step, will we perhaps truly be in a position to confront it. (See "The Next Super Challenge of Quantum Computing".)

In the meantime, should we simply wait patiently for a breakthrough in quantum error correction? That is not, in fact, what people do. At present, scientists and engineers across the field are focusing on "noisy intermediate-scale quantum" (NISQ) computing. The idea is to look for quantum algorithms or quantum simulation methods of practical value that fit the current level of quantum hardware and tolerate the presence of noise. The current research hotspots are therefore classical-quantum hybrid schemes such as the variational quantum eigensolver (VQE) and the quantum approximate optimization algorithm (QAOA), with applications ranging from quantum chemistry to financial portfolio optimization and artificial intelligence. Once quantum advantage is achieved in some real application, confidence in quantum computing can keep attracting funding and talent, and then the harder obstacles such as quantum error correction can be overcome.
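To show what "classical-quantum hybrid" means in the VQE setting, here is a minimal sketch. It is my own illustration with a fake, classically simulated "quantum processor" and a toy one-qubit Hamiltonian; on real hardware the energy would come from repeated measurements on a noisy chip, while the parameter update would still run on a classical computer.

```python
# A minimal variational loop in the spirit of VQE, simulated with numpy.
import numpy as np

Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
H = Z + 0.5 * X                                  # toy Hamiltonian

def ansatz(theta: float) -> np.ndarray:
    """State prepared by a single Ry(theta) rotation acting on |0>."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta: float) -> float:
    psi = ansatz(theta)
    return float(psi @ H @ psi)                  # <psi(theta)|H|psi(theta)>

theta = 0.0                                      # initial variational parameter
for _ in range(200):                             # classical optimization loop
    grad = (energy(theta + 1e-3) - energy(theta - 1e-3)) / 2e-3
    theta -= 0.1 * grad                          # gradient-descent update

print(f"theta ~ {theta:.3f}, energy ~ {energy(theta):.3f}")
# The exact ground-state energy of H is -sqrt(1.25) ~ -1.118.
```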

The road ahead is long and far; I shall search for it high and low. Quantum computing is a hard road; we have rushed to the front, yet we cannot see the way ahead. Perhaps we will stumble into a maze, sword drawn, looking around in bewilderment and truly at a loss; perhaps we will cut through the fog and see the road before us clearly. Some regard this as a contest between nations, but I see it as a sparkle of the human spirit. We may fail, but we will not bow our heads.
