This article comes from the WeChat official account (ID: fanpu2019); author: innocence.
Quantum computing is a new computing paradigm that promises computing power on a scale previously unimaginable, with far-reaching consequences for many fields and even for our daily lives. With technology giants joining the race, quantum computers can feel as though they are just around the corner. This article tries to answer the following questions: What is quantum computing? What can it do? How can it be realized physically? How can superconductors be used to build a quantum computer? And how far are we from a universal quantum computer? I hope that, after reading it, readers will come away with an accurate picture of quantum computing and reasonable expectations.
1. Our senses are Newtonian, but our world is quantum. Humans began to understand the world about 60,000 years ago: perceiving it through eyes, ears, nose, tongue and body, and building models of it in the brain, so that we could adapt to this complex and dangerous planet and thrive for generations.
Our bodies evolved senses such as eyes and ears to perceive macroscopic objects, so when Newton discovered the three laws of motion and the law of universal gravitation more than 300 years ago, people grew ambitious and thought they had found the fulcrum with which to move the Earth. It was not until the "two dark clouds" appeared over the edifice of physics more than a hundred years ago that we finally realized that the real world is quantum at its foundation, and that quantum phenomena defy our common sense.
Another hundred years on, as we have become more and more adept at manipulating and measuring quantum systems, scientists have finally asked the question of the century: if we were in charge of the quantum world, what could we do that was previously impossible? Using quantum systems to compute is one of the boldest of these ideas.
2. What is quantum computing? In the computers, mobile phones and tablets we use today, the underlying chips are all built on 0/1 circuits made of semiconductor transistors. Logically, we call such a unit, which has two states, 0 and 1, and can switch between them, a bit. To distinguish them from the qubits we are about to discuss, let us call them "classical bits". Every program running on a computer today takes a large number of classical bits, applies various logic-gate operations to them in a certain order, and then reads out the final result. Once the basic operating principle of classical computing is understood, quantum computing is not hard to grasp: replace the classical bits with qubits, the classical logic gates with quantum logic gates, the classical measurements with quantum measurements, and, of course, the classical algorithm with a quantum algorithm. The method of computation built on such a system, governed by the basic principles of quantum mechanics, is quantum computing.
A qubit is a quantum two-level system, and it differs from a classical bit in two important ways. First, a classical bit can represent only two states, 0 or 1, whereas a qubit, because a quantum state can be a superposition, can in fact represent infinitely many states. We can map these states onto a sphere of radius 1, where every point on the sphere represents a possible state. This sphere is called the "Bloch sphere" in honor of Felix Bloch (1905-1983), one of the physicists who pioneered nuclear magnetic resonance.
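To make the Bloch-sphere picture concrete, here is a minimal numpy sketch (my own illustration, not from the original article) that takes an arbitrary single-qubit state and computes the point on the Bloch sphere it corresponds to; the angles theta and phi are arbitrary illustrative values.

```python
import numpy as np

# A general single-qubit state |psi> = alpha|0> + beta|1>, parameterized by the
# usual Bloch-sphere angles theta and phi (illustrative values below).
theta, phi = 1.0, 0.7
alpha = np.cos(theta / 2)
beta = np.exp(1j * phi) * np.sin(theta / 2)
psi = np.array([alpha, beta])

# Pauli matrices
sx = np.array([[0, 1], [1, 0]])
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]])

# The Bloch-vector components are the expectation values of the Pauli operators.
bloch = [np.real(psi.conj() @ s @ psi) for s in (sx, sy, sz)]
print("Bloch vector:", np.round(bloch, 3))
print("length:", np.round(np.linalg.norm(bloch), 3))  # = 1 for any pure state
```

Every pure state lands exactly on the surface of the unit sphere, which is why any point on the sphere is a valid qubit state.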
Classical bit (left) vs. qubit (right)
That is the difference between a single qubit and a single classical bit. When there are many bits, another important difference appears: qubits can be entangled. Entanglement is a particularly striking quantum phenomenon: a composite system made of several quantum subsystems can be in certain "quantum-correlated" states. In these states we cannot extract the information by looking at each particle separately; we have to look at them together. Someone once offered an analogy: if a book is classical, we all know how to read it, sentence by sentence and page by page, and we naturally learn what it is about. But if the book is quantum, the situation is very different: every individual page looks like gibberish, and only when you take the whole book together do you find that the story is hidden in the connections between the pages.
Let us look at a concrete example. Take two particles, qubit 1 and qubit 2, each with two energy levels, labeled g1, e1 and g2, e2 respectively. Together they can be in a state whose wave function is (|g1 e2⟩ + |e1 g2⟩)/√2 (up to a normalization constant). If we look only at qubit 1, we find it is in g1 or e1 with 50/50 probability, completely at random, and the same is true of qubit 2 on its own. But taken together, a pattern emerges: if qubit 1 is in g1, qubit 2 must be in e2, and vice versa; it is impossible for qubit 1 to be in g1 while qubit 2 is also in g2. This state is in fact the famous Bell state, a maximally entangled state.
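As an illustration (my own sketch, not from the original article), the same Bell state can be written out numerically and sampled: each qubit on its own looks like a fair coin toss, yet the two outcomes are always perfectly anti-correlated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state (|g1 e2> + |e1 g2>) / sqrt(2), written in the basis
# {|g1 g2>, |g1 e2>, |e1 g2>, |e1 e2>}.
bell = np.array([0, 1, 1, 0]) / np.sqrt(2)
probs = np.abs(bell) ** 2

shots = 10_000
outcomes = rng.choice(4, size=shots, p=probs)
q1 = outcomes // 2      # 0 -> g1, 1 -> e1
q2 = outcomes % 2       # 0 -> g2, 1 -> e2

print("P(q1 = e1) =", q1.mean())            # ~0.5: each qubit alone looks random
print("P(q2 = e2) =", q2.mean())            # ~0.5
print("P(q1 != q2) =", (q1 != q2).mean())   # = 1.0: perfectly anti-correlated
```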
To fully describe a multi-particle entangled state, we have to represent it in a very high-dimensional Hilbert space, whose dimension grows exponentially with the number of bits. A general two-particle entangled state needs 4 parameters (think of them as coordinates) to describe; a three-particle state needs 8; a four-particle state needs 16; a five-particle state, 32; and soon there are far too many to write down.
The coding space of qubits grows exponentially with their number.
If the number of particles keeps increasing, the size of the state space grows exponentially, and this is a rather terrifying number. When N reaches 50, the size of the state space is comparable to the number of operations per second of a present-day supercomputer; when N reaches 300, it exceeds the total number of atoms in the observable universe.
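A quick back-of-the-envelope check (my own sketch, assuming 16 bytes per double-precision complex amplitude) shows why storing a general N-qubit state on a classical machine quickly becomes hopeless:

```python
# Memory needed to store a general N-qubit state vector classically:
# 2**N complex amplitudes, 16 bytes each (double-precision complex).
for n in (10, 30, 50, 300):
    amplitudes = 2 ** n
    bytes_needed = amplitudes * 16
    print(f"N = {n:3d}:  2^N = {amplitudes:.3e} amplitudes  ->  {bytes_needed:.3e} bytes")
```

Already at N = 50 this is on the order of 10^16 bytes, and at N = 300 the count of amplitudes alone (about 10^90) dwarfs the roughly 10^80 atoms in the observable universe.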
This is the power of the exponential. Einstein is often quoted as saying that the most powerful force in the world is not the atomic bomb but compound interest plus time. There is a classic story in which Native Americans sold the island of Manhattan to Dutch settlers for 24 dollars. Some three hundred years later Manhattan had become the most famous business district in New York, worth well over 100 billion dollars. Yet if the sellers had put that money in the bank at compound interest, after three centuries they could not only have bought Manhattan back but would have had plenty of money left over!
There is another famous story, about an Indian king who loved chess so much that he promised to reward its inventor. The wise man asked only that one grain of wheat be placed on the first square of the chessboard, two on the second, four on the third, and so on until the whole board was filled. The king agreed, not realizing that this is an astronomical quantity: in the end he would owe about 1.8 × 10^19 grains of wheat (the last square alone holds 2^63, roughly 9.2 × 10^18 grains). Given India's grain output at the time, even after three hundred years the poor king could not have paid this reward. This is the catastrophe of the exponential. Many problems in real life are exponentially hard in the same way: interacting many-body problems, Internet routing, weather prediction, the neural structure of the brain, and so on. Even when we know the form of the equations and all the parameters exactly, we cannot solve these problems precisely, because they are simply too complex. A system of entangled qubits provides a coding space that grows exponentially, so can we use this property to attack these exponentially complex problems?
The answer is yes. The first scientist to propose using quantum systems to compute, or to simulate quantum physics, was Feynman, though at the time it was only a concept. The algorithms that have been mathematically proven to give a speedup are two famous ones put forward in the 1990s: Shor's algorithm and Grover's algorithm. Shor's algorithm shows that, with a quantum logic circuit, the time needed to factor a large number can be reduced from (sub-)exponential to polynomial. The hardness of factoring large numbers is the mathematical basis of the most widely used asymmetric encryption on the Internet, the public-key cryptosystem (such as RSA). It has been estimated that cracking a 2048-bit public key would take more than a million years with the best classical algorithm, whereas a sufficiently large quantum computer running Shor's algorithm would need only minutes.
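The quantum heart of Shor's algorithm is period finding; the rest is classical number theory. The toy sketch below (my own illustration; the period is found by brute force here, which is exactly the step a quantum computer would do exponentially faster) shows how a known period yields the factors:

```python
from math import gcd

def factor_with_period(N, a):
    """Classical post-processing step of Shor's algorithm. The period (order) of a
    modulo N is found by brute force here; on a quantum computer this is the part
    done efficiently with the quantum Fourier transform."""
    r, x = 1, a % N
    while x != 1:            # smallest r > 0 with a**r % N == 1
        x = (x * a) % N
        r += 1
    if r % 2 != 0:
        return None          # need an even period; try another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None          # trivial case; try another a
    return gcd(y - 1, N), gcd(y + 1, N)

print(factor_with_period(15, 7))   # 7 has period 4 mod 15 -> (3, 5)
print(factor_with_period(21, 2))   # 2 has period 6 mod 21 -> (7, 3)
```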
Grover's algorithm is a search algorithm: it reduces unstructured search over N items from O(N) to O(√N) operations. Although the speedup is not as dramatic as Shor's, search is such a basic primitive that it can be mapped onto many practical problems, and when N is very large the effect is still significant. The enormous amounts of data generated on the Internet every moment, from which useful information must be extracted, correspond exactly to this very-large-N regime. Search algorithms can also be used to attack passwords.
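For a handful of qubits, Grover's algorithm can be simulated directly with matrices. The sketch below (my own small illustrative simulation, with an arbitrarily chosen marked item) amplifies the marked state in roughly π/4·√N iterations:

```python
import numpy as np

n = 4                       # number of qubits (illustrative)
N = 2 ** n                  # size of the search space
marked = 11                 # index of the item we are searching for (arbitrary)

state = np.full(N, 1 / np.sqrt(N))     # uniform superposition |s>

oracle = np.eye(N)
oracle[marked, marked] = -1            # oracle flips the sign of the marked state

s = state.reshape(-1, 1)
diffusion = 2 * (s @ s.T) - np.eye(N)  # inversion about the mean: 2|s><s| - I

iterations = int(round(np.pi / 4 * np.sqrt(N)))   # ~ pi/4 * sqrt(N) Grover steps
for _ in range(iterations):
    state = diffusion @ (oracle @ state)

print(f"after {iterations} iterations, P(marked) = {state[marked] ** 2:.3f}")  # ~0.96 for N = 16
```

With only 3 iterations the marked item is found with about 96% probability, whereas random classical guessing would succeed with probability 1/16 per try.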
Beyond these two algorithms, quantum computing can simulate the Hamiltonians of complex systems, enabling quantum-chemistry calculations and the study of many-body physics problems that are otherwise intractable; and by exploiting entanglement it can evaluate multi-parameter cost functions quickly, improving the efficiency of optimization and offering potential applications in artificial intelligence.
Quantum computing is not omnipotent!
Here is a diagram of the sets of computational problems. In the theory of computation, all computational problems can be divided into decidable and undecidable problems.
What computers can solve are only the decidable, that is, computable, problems, and among these three classes matter most. The first is the class P: problems a Turing machine can solve in polynomial time, which can therefore be computed efficiently on conventional computers, such as scheduling problems. The second is the class NP, problems of nondeterministic polynomial complexity: their solutions are easy to verify but not necessarily easy to find. P is contained in NP, and the hope of quantum computing is to efficiently solve some problems believed to lie outside P, such as the integer factorization attacked by Shor's algorithm. Within NP there is also the class of NP-complete problems, whose known solution times grow exponentially; these include the classic maximum-cut problem and the traveling salesman problem. As the figure above shows, quantum computing is not currently able to solve these problems efficiently. So quantum computing may not solve every problem efficiently, but in any case this new computing paradigm greatly expands our existing computing power and lets us tackle ever more complex problems.
3. How do we realize quantum computing? Now that we know quantum computing is so promising, let us turn to the second question: how do we actually implement it?
Mathematically, a quantum computation can be broken into the following steps. First, we need a good set of qubits and the ability to initialize them, for example by putting them all in the ground state. Next, the initial conditions of the problem are encoded into these bits. Then the algorithm is executed; this corresponds to a sequence of quantum-gate operations, which can be written as a single overall unitary matrix U. Finally, after the algorithm has run, all the qubits are measured to obtain the result of the computation.
Note that because the collapse of the quantum state upon measurement is completely random, the whole procedure has to be repeated N times (with N much greater than 1) in order to accurately recover the distribution of 0s and 1s in the final state. In Google's "quantum supremacy" experiment, for example, the U in the middle was a randomly chosen sequence of quantum gates, and the circuit was executed a million times before the final result was obtained. A million runs sounds like a lot, but a quantum processor needs only about 200 microseconds per run, most of which is actually spent waiting for the qubits to "cool" back down, so the total execution time was only about 200 seconds. Simulating the same computation on a supercomputer was estimated to take tens of thousands of years. Recently the Chinese scientist Zhang Pan used tensor-network methods to cut the simulation time to five days on only 60 GPUs, which shows that "quantum supremacy" is itself relative.
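A toy illustration of why this repetition is needed (my own example, with an arbitrary made-up final state): a single run yields just one random bit string, and the output distribution only emerges after many shots.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy final state of a 3-qubit computation (arbitrary illustrative amplitudes
# over the 8 basis states |000>, |001>, ..., |111>).
final_state = np.array([0.1, 0.0, 0.6, 0.0, 0.0, 0.75, 0.0, 0.27])
final_state /= np.linalg.norm(final_state)
probs = np.abs(final_state) ** 2

# Each run of the circuit ends in ONE random measurement outcome; the
# distribution is reconstructed only by repeating the whole circuit many times.
for shots in (10, 1_000, 100_000):
    samples = rng.choice(8, size=shots, p=probs)
    estimate = np.bincount(samples, minlength=8) / shots
    print(f"{shots:>6} shots:", np.round(estimate, 3))
print("  exact:", np.round(probs, 3))
```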
Physically, doing quantum computing is extremely challenging. Real physical systems can never be as perfect as the mathematical model: they suffer from noise and are constrained by all sorts of physical conditions, and on top of that we have to manipulate and measure extremely fragile quantum states. That is why, although the theory and algorithms of quantum computing appeared in the 1980s and 1990s, the experimental effort did not really accelerate until around 2000.
In the real world there are many systems that can be used for quantum computing, including natural atoms, trapped ions, photons, two-dimensional electron gases, NV (nitrogen-vacancy) centers, nuclear spins, cold atoms, superconducting qubits, and so on. These physical systems differ greatly. Some, such as natural atoms, trapped ions and photons, maintain good quantum coherence at room temperature, whereas superconducting qubits and quantum dots based on two-dimensional electron gases must be kept at extremely low temperatures, close to absolute zero, to stay coherent. But good coherence is not the only criterion for a good qubit. Here is what it takes to be an excellent qubit and to build a practical quantum computer.
These requirements are the five criteria put forward in 2000 by David DiVincenzo, then a scientist at IBM, and known as the DiVincenzo criteria. Let us go through them one by one.
First, we must be able to construct a scalable set of qubits, note: not just one or two, but a large number, and we must be able to characterize them well and determine their Hamiltonian. This already excludes many candidates, such as nuclear-magnetic-resonance systems and NV centers, which are coherent enough and can be manipulated well but are hard to scale up.
Second, this set of qubits must be initializable, for example by putting every qubit in its ground state. This is self-evident: if the initial state cannot be fixed, the result is bound to be uncertain.
Third, the qubits must have a long enough decoherence time, generally much longer than the time needed to perform the quantum-gate operations. Otherwise the qubits will have decohered by the time the algorithm finishes, the measurements will be almost pure noise, and the result will be meaningless.
Fourth, a universal set of quantum gates must be implemented, including two-qubit gates such as the CNOT and controlled-phase gates as well as arbitrary single-qubit gates; these are what make logical operations possible. The most important of these is the two-qubit gate, also called the entangling gate, because a quantum computer with only single-qubit gates is easy to simulate on a classical computer and offers no quantum speedup (a small numerical illustration follows the list of criteria below).
Finally, the final state of the qubits must be measurable with high quality. This too is obvious: if you cannot measure, you cannot obtain the result.
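Here is the promised illustration of why the entangling gate matters (a minimal numpy sketch of my own, using the standard Hadamard and CNOT matrices): a single-qubit gate followed by one CNOT already turns |00⟩ into a Bell state, something no sequence of single-qubit gates alone can do.

```python
import numpy as np

# Standard matrix forms of the single-qubit Hadamard gate and the two-qubit CNOT gate.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start both qubits in |00>, apply H to the first qubit, then the CNOT.
ket00 = np.array([1, 0, 0, 0])
state = CNOT @ (np.kron(H, I) @ ket00)
print(np.round(state, 3))   # [0.707 0 0 0.707] = (|00> + |11>)/sqrt(2): an entangled Bell state
```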
Very few real physical systems satisfy all five conditions at once. Superconducting quantum computing, based on superconducting qubits, does well on all five points, which is why it has stood out and become the technical route favored by most of the big technology companies. Let us see what makes this kind of qubit so attractive.
4. How do we realize quantum computing with superconductors? Superconductivity has a history of more than a hundred years since its discovery (see "A Hundred Years of Superconductivity, Now Reaching Room Temperature"); it is a very rare physical phenomenon, with a very sharp transition and great robustness. BCS theory tells us that superconductivity arises when, at low temperature, the conduction electrons condense collectively into the ground state in the form of Cooper pairs. This phase transition opens an energy gap near the Fermi energy, and any low-energy process below the gap cannot effectively excite the electronic system, which provides excellent protection for all kinds of low-energy quantum behavior.
The superconducting state is a typical macroscopic quantum phenomenon, because superconductivity is the collective behavior of a macroscopic number of highly overlapping Cooper pairs. This macroscopic quantum character is what makes superconducting qubits intrinsically easy to couple, easy to manipulate and easy to read out. But superconductivity by itself is not enough for quantum computing: in a bulk superconductor the phase of the wave function is uniform and the number of particles is essentially infinite, so it cannot provide the discrete energy levels and the nonlinearity that a qubit requires. We therefore have to impose additional constraints on the superconductor.
This is where the Josephson junction, the key device of superconducting qubits, comes in (see "The Hard-Core Technology behind Quantum Computing: the Josephson Parametric Amplifier"). A Josephson junction consists of a very thin insulating layer sandwiched between two superconductors. The superconductors on the two sides each have their own phase, separated by the insulator in the middle; but thanks to quantum tunneling, the Cooper pairs on either side have a small probability of passing through the insulator to the other side. This tunneling makes the superconducting wave functions on the two sides interfere and gives rise to some very peculiar properties, summarized in two relations: the Josephson current relation and the Josephson voltage relation, often called in electronics the DC and AC Josephson effects.
The Josephson current relation tells us that a supercurrent can flow through the junction; this current varies sinusoidally with the phase difference between the two superconductors (often denoted φ) and can never exceed the critical current Ic. The Josephson voltage relation says that the rate of change of this phase difference is proportional to the voltage across the junction. Clearly the device is nonlinear, because the current relation contains a sine function. This nonlinearity leads to many curious effects, such as Josephson oscillations; here we only care about its role as a nonlinear, lossless circuit element. Putting the two relations together, we find that the Josephson junction behaves like an inductor, but one whose inductance is not fixed: it varies with the phase difference across the junction. We can picture the phase of a Josephson junction as a particle moving in a "washboard potential". Within each well the particle oscillates back and forth, and in the quantum regime, that is, when the temperature is low enough, the oscillation is quantized. Because the inductance is nonlinear, the spacings between these energy levels are unequal; this is exactly the nonlinearity we referred to.
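For reference (the two formulas referred to in the text do not survive in this copy), here are the Josephson relations in their standard textbook form, together with the effective inductance obtained by combining them:

```latex
% Standard Josephson relations, with \varphi the phase difference across the junction
\begin{aligned}
  I &= I_c \sin\varphi
      && \text{(current relation, DC Josephson effect)}\\[2pt]
  \frac{d\varphi}{dt} &= \frac{2eV}{\hbar} = \frac{2\pi V}{\Phi_0}
      && \text{(voltage relation, AC Josephson effect)}
\end{aligned}

% Eliminating d\varphi/dt between the two gives a phase-dependent, nonlinear inductance:
L_J(\varphi) = \frac{\Phi_0}{2\pi I_c \cos\varphi}
```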
So the Josephson junction, built from two superconductors, provides both the discrete energy levels and the nonlinearity. But a Josephson junction alone is not enough: its intrinsic capacitance and inductance usually do not have convenient values, so to get a practical qubit we also need to add a certain amount of ordinary capacitance and inductance.
a) A superconducting Josephson junction; b) the circuit symbol of a Josephson junction; c) the circuit model of a Josephson junction (the RSJ model). The lower panel shows the Josephson potential energy as a function of phase. The phase can be pictured as a particle rolling on this potential-energy curve; in one of the potential wells (near a minimum of the potential) discrete energy levels form, and transitions between different levels correspond to different phase-oscillation frequencies. A qubit can be built from the two lowest levels of this nonlinear level structure.
The qubits commonly used for quantum computing are therefore circuits made of Josephson junctions together with suitable capacitors or inductors. Together they form a rather complicated ladder of levels, but as a qubit we use only the two lowest, the ground state and the first excited state. Since the first superconducting qubit appeared at the end of the last century, more than twenty years of development have produced many varieties of superconducting qubits with a profusion of names. The underlying idea, however, is always the same: by adjusting the capacitance, the inductance, the Josephson inductance, the loop flux and other circuit parameters, one engineers different level structures.
A) The generic equivalent circuit of a superconducting qubit; B) a typical level structure; C) the distribution of various superconducting qubits in parameter space.
Depending on the choice of parameters, the early superconducting qubits fall roughly into three classes: charge qubits (also known as Cooper-pair boxes), flux qubits and phase qubits. All three were plagued by different kinds of noise, which made their decoherence times very short. After a great deal of research, physicists finally tracked down these noise sources, including charge fluctuations, flux fluctuations, quasiparticle noise and so on. Once the sources of noise were identified, ways could be found to suppress them, and the decoherence times increased enormously.
The transmon (and its variant the Xmon), by far the most popular qubit today, is a classic case of greatly extending the decoherence time by suppressing the charge fluctuations of the charge qubit. The full name of the transmon is the "transmission-line shunted plasma oscillation qubit", a mouthful, but in essence it is simply a charge qubit with a large shunt capacitance added. This capacitance greatly flattens the charge dispersion, meaning that the energy of the system becomes very insensitive to charge fluctuations, which naturally suppresses charge noise. Of course, this comes at a price: the nonlinearity of the system drops sharply. But compared with the gain in decoherence time, the loss is tolerable. The other major innovation of the transmon is that the gate capacitance originally used to control the charge qubit is replaced by a dispersive coupling to a transmission-line resonator. This design provides a very useful way of measuring the qubit, the "quantum non-demolition measurement", which is crucial for quantum error correction; we will not go into it here. (For quantum error correction, see "The Next Grand Challenge of Quantum Computing".)
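The claims about charge-noise insensitivity and reduced anharmonicity can be checked numerically. The sketch below (my own illustration, with illustrative parameter values) diagonalizes the standard charge-basis Hamiltonian H = 4E_C(n − n_g)² − E_J cos φ and compares the Cooper-pair-box regime (E_J/E_C ≈ 1) with the transmon regime (E_J/E_C ≈ 50): in the latter the transition frequency barely depends on the offset charge n_g, while the anharmonicity shrinks to roughly −E_C.

```python
import numpy as np

def transmon_levels(Ej, Ec, ng=0.0, ncut=30, nlevels=3):
    """Lowest eigenenergies of the charge-basis Hamiltonian
    H = 4*Ec*(n - ng)^2 - (Ej/2) * (|n><n+1| + |n+1><n|)."""
    n = np.arange(-ncut, ncut + 1)
    H = np.diag(4 * Ec * (n - ng) ** 2)
    off = -Ej / 2 * np.ones(2 * ncut)
    H = H + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.eigvalsh(H)[:nlevels]

Ec = 0.25  # charging energy in GHz (illustrative value)
for ratio in (1, 50):                  # Cooper-pair box vs transmon regime
    for ng in (0.0, 0.5):              # extreme offset-charge values
        e0, e1, e2 = transmon_levels(ratio * Ec, Ec, ng)
        print(f"Ej/Ec = {ratio:>2}, ng = {ng}: "
              f"f01 = {e1 - e0:7.3f} GHz, anharmonicity = {(e2 - e1) - (e1 - e0):7.3f} GHz")
```

Running this shows f01 swinging strongly with n_g at E_J/E_C = 1 but staying essentially fixed at E_J/E_C = 50, which is exactly the trade-off described above: charge noise suppressed, anharmonicity reduced but still usable.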
The Xmon is essentially a transmon in which one plate of the capacitor is replaced by the ground plane, so it is fundamentally the same device, but easier to scale up. Martinis's group pioneered a five-qubit Xmon device, which soon attracted the attention of the Google Quantum AI team; the whole group then joined Google, and a few years later came the landmark "quantum supremacy" experiment.
The most popular qubits at present: the transmon / Xmon.
A multi-qubit superconducting quantum chip, with different circuit elements shown in different colors, including qubits, readout resonators, the readout bus transmission line, and so on.
As mentioned above, superconducting qubits have developed many structures over the years, and in this process scientists have gradually solved the problems that the DiVincenzo criteria say must be solved: scalability, decoherence, gate operations and readout. In 2013 two famous quantum-computing experts, Yale professors Devoret and Schoelkopf, wrote an outlook article laying out a roadmap for universal quantum computing, in which superconducting quantum computing had already reached the third or fourth stage while many other approaches were still at the first or second. In the same year a team at MIT plotted a "Moore's law" for the decoherence time of superconducting qubits, rising from less than 3 nanoseconds to around 300 microseconds, an improvement of five orders of magnitude in less than twenty years, which shows how fast this field has been moving. Today several famous technology companies, including Google, IBM and Intel, have entered quantum-computing research and development, and they have all chosen the superconducting route. It is fair to say that superconducting quantum computing has become one of the most promising approaches to quantum computing.
5. How far is quantum computing from us? In October 2019 Google published a paper in Nature presenting the experiment that verified quantum supremacy. The experiment caused a sensation worldwide, and Google's CEO Pichai compared its significance to the Wright brothers' first airplane. But just as the first commercial airliner did not arrive until some fifty years after that first flight, there is still a long way to go before quantum computers enter ordinary households.
Here is a picture of Mount Qomolangma (Everest) as an analogy. If building a universal quantum computer is like reaching the summit of Everest, we still have two mountains to climb first: constructing error-corrected logical qubits, and then realizing a universal set of quantum gates on those logical qubits. Next to these two mountains, quantum supremacy is really just a small foothill. But do not underestimate that foothill: its elevation is already more than 5,000 meters.
Let's take a look at a more quantitative picture.
The green region in the figure marks the hardware requirements for an error-corrected, universal quantum computer. The dashed line is the error-rate threshold for surface-code quantum error correction, which we have only just crossed; quantum circuits with up to roughly 50 qubits can still be simulated by classical computers, and we have only just passed that point too. How far is this from a universal quantum computer? In terms of the number of qubits, the gap is still more than five orders of magnitude. To get a feeling for how big that gap is, we can extrapolate from the historical pace of transistor integration. Moore's law says the number of transistors has doubled roughly every two years for decades; if qubits developed at the same pace, gaining five orders of magnitude would take about 26 years (by which time, as it happens, I will just have retired). Should we therefore keep a wait-and-see attitude toward quantum computing? Not at all. There is a great deal that can be done within the huge gap between 50 and a million qubits. Academia and industry are now actively searching for near-term applications, asking what can be developed on noisy intermediate-scale quantum computers that has real economic value. For example, Google recently used the chip that achieved quantum supremacy to simulate the ground states of some simple molecules, exploring the possibility of accelerating quantum-chemistry calculations, and it has even begun to work with pharmaceutical companies on using quantum computing to speed up drug discovery.
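The arithmetic behind that estimate is easy to reproduce (a rough sketch; the answer depends entirely on the doubling period one assumes, and doubling periods of 18 and 24 months bracket the figure quoted above):

```python
import math

# How long to grow the qubit count by five orders of magnitude at a Moore's-law pace?
orders_of_magnitude = 5
doublings = orders_of_magnitude * math.log2(10)   # about 16.6 doublings for a factor of 100,000
for months_per_doubling in (18, 24):
    years = doublings * months_per_doubling / 12
    print(f"doubling every {months_per_doubling} months -> about {years:.0f} years")
```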
6. Conclusion. Technically, we not only have to overcome the many problems caused by increasing the number of qubits, but also keep improving the decoherence time of each qubit and the fidelity of the operations on it. In other words, quantum resources have to be expanded along two dimensions at once, one spatial and one temporal. We must keep solving the wiring and crosstalk problems that come with more qubits; on the Sycamore chip, Google used flip-chip technology to solve the layout problem of a two-dimensional qubit array. Microwave wiring and packaging is another very hard problem: controlling and reading out qubits requires microwaves, but today's microwave components are so bulky that large-scale packaging at extremely low temperature is almost impossible, so the microwave circuitry has to be miniaturized, and some microwave components may even have to move onto the chip itself. Beyond that, the integration and scaling of control electronics, chip-to-chip quantum state transfer and entanglement, and other such technologies are all very challenging.
It is very hard, but there is also plenty of hope. Optimists believe quantum computers are developing much faster than the estimate I just gave. Google, for example, has published a ten-year roadmap: it has achieved its first small goal, quantum supremacy, and by 2029 it plans to build an error-corrected quantum computer with more than a million qubits. Along the way it will keep exploring applications of noisy intermediate-scale quantum computers, including quantum simulation, optimization, sampling, quantum artificial intelligence and so on. In other words, perhaps within three to five years quantum computing will already have begun to change the way we live and work. That is only an optimistic estimate, of course, but in any case quantum computing holds endless possibilities, waiting for us to explore!