Shulou (SLTechnology News & Howtos), 06/03 report; updated 2025-01-28.
AI, big data, cloud, and IoT are just a few of the terms we cannot avoid when describing the technology landscape of the second decade of the 21st century; it is difficult to open an essay on technological trends without them.
In the next decade, quantum computing may join this list of revolutionary technologies. Artificial intelligence matured in much the same way: neural network algorithms were fed by big data, powered by the abundant compute of the cloud, and, through integration with the Internet of Things, brought intelligence to everything.
So, what kind of "violent reaction" can the interaction between artificial intelligence and quantum computing produce?
In 2017, Turing Award winner Professor Yao Qizhi (Andrew Yao) said in a speech: "If we can put quantum computing and AI together, we may do things that even nature did not think of."
Such a world-remaking prospect may feel remote to most ordinary people, but research on quantum computing and quantum machine learning has already taken root in the laboratories of research institutions and technology giants around the world. These labs provide basic algorithmic tools and resources to outside developers and technicians, letting the public glimpse the combined power of quantum physics and machine intelligence.
Recently, Baidu officially announced Paddle Quantum, a quantum machine learning toolkit built on its PaddlePaddle deep learning platform, making PaddlePaddle the only deep learning platform in China that supports quantum machine learning. Paddle Quantum provides an open set of tools with which researchers can build and test quantum neural networks and study quantum artificial intelligence.
How much progress Paddle Quantum will actually bring to quantum machine learning remains to be tested by time, but it has already helped popularize the field in China and offers AI and quantum computing researchers a new path for learning.
If this stirs your enthusiasm, you may want a glimpse of the intersection of quantum computing and artificial intelligence, to weigh the possibility of joining the game yourself.
Why is "quantum computing" so compatible with "artificial intelligence"?
Before we get to the point, a brief review of the basics of quantum computing will make the rest easier to follow.
Let's see why quantum computing is possible.
One of the great physical discoveries of the twentieth century was quantum mechanics, whose central finding is that elementary particles exhibit two remarkable behaviors: superposition and entanglement. In everyday terms, superposition means that a quantum system is both "this" and "that" at the same time; once observed or measured, it collapses into one of the possibilities. This is the famous "uncertainty." Entanglement means that two paired particles, even if separated at opposite ends of the universe, remain mysteriously correlated, as if communicating by some tacit understanding. This is the famous "quantum entanglement."
"Superposition" underlies quantum parallel computation, and "entanglement" underlies quantum communication. Applied to computation, these properties make it possible to handle extremely complex calculations.
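Both properties can be reproduced in a few lines of plain NumPy (a classical simulation, of course; the matrices below are the standard textbook gate definitions):

```python
import numpy as np

# Single-qubit basis state and standard gates.
zero = np.array([1, 0], dtype=complex)                 # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Superposition: H|0> = (|0> + |1>)/sqrt(2), equal amplitude on both states.
plus = H @ zero

# Entanglement: CNOT applied to (H|0>) x |0> gives the Bell state
# (|00> + |11>)/sqrt(2); measuring one qubit fixes the other.
bell = CNOT @ np.kron(plus, zero)
print(bell.round(3))  # amplitude ~0.707 on |00> and |11>, zero elsewhere
```

The Hadamard gate puts one qubit into an equal superposition, and the CNOT gate then ties the two qubits together into the Bell state, the simplest example of entanglement.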
We know that the basic unit of classical computation is the bit, which can only be in one of two states, 0 or 1. Classical computation therefore proceeds linearly through sequences of 0s and 1s, although modern hardware is so fast that it performs billions or more of such operations per second.
The qubit, the basic unit of quantum computing, can represent 0 and 1 at the same time thanks to superposition. As the number of qubits grows, the representational power of the system increases exponentially.
That is, a single qubit can be in two states (0 and 1) simultaneously; two interacting qubits can store all four two-bit values simultaneously; and, in general, n qubits can simultaneously represent 2^n classical binary values.
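This exponential scaling is easy to make concrete: the sketch below (plain NumPy, variable names ours) simply counts the amplitudes a classical simulator must track for an n-qubit register.

```python
import numpy as np

# An n-qubit register is described by 2**n complex amplitudes, so the memory a
# classical computer needs to simulate it doubles with every added qubit.
for n in (1, 2, 10, 30):
    print(f"{n:>2} qubits -> {2**n:,} amplitudes")

# Concretely: a 3-qubit state |000> as an explicit state vector of length 2^3.
n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # all amplitude on |000>
```

At 30 qubits the state vector already holds over a billion complex numbers, which is why classical simulation of quantum systems hits a wall so quickly.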
To understand the difference between classical and quantum computing, consider a cornfield maze. Classical computing is like one person looking for the exit: they walk down one path, backtrack on hitting a dead end, try another, and repeat until they find the way out. Quantum computing is like having many avatars that explore every path of the maze at the same time, finding the exit in one pass.
In this way, the massive parallelism that arises from superposition, together with the correlations provided by entanglement, offers new possibilities for the data processing and model training that artificial intelligence, and machine learning in particular, requires.
Quantum computing and artificial intelligence thus have an "entangled," inseparable relationship of their own, but this does not mean that combining quantum computing with machine learning will immediately produce significant results.
Quantum machine learning is still in its infancy
Quantum machine learning (Quantum ML) is an interdisciplinary technology field where quantum computing and machine learning intersect, and the combination of the two can produce mutually beneficial results.
On the one hand, one of the most important goals of quantum computing is to develop high-performance quantum machine learning algorithms with the help of quantum characteristics, thus accelerating or broadening the application scenarios of artificial intelligence. On the other hand, quantum computing still has many very difficult scientific and engineering problems to be solved before large-scale application, which requires a large number of advanced computing tools, especially AI technology to help break through the bottleneck of quantum computing research and development.
Dividing the field along two axes, with the algorithm and the data each being classical (C) or quantum (Q), yields four categories: C-C, Q-C, C-Q, and Q-Q. C-C is traditional machine learning; Q-Q belongs to the open research domain of fully quantum computing; and C-Q applies machine learning algorithms to quantum physics problems, such as modeling the control of quantum systems and identifying parameters such as disturbances and noise, thereby advancing quantum computing itself.
Q-C uses quantum theory to improve machine learning, that is, to quantize machine learning algorithms. One approach turns problems that are intractable in classical computing into tractable ones via quantum computation, greatly reducing the computational complexity of machine learning algorithms; another combines the parallel acceleration of quantum computing with machine learning to produce entirely new quantum machine learning models.
A traditional neural network can store its patterns only in a single network at a time; a quantum neural network, thanks to the parallelism of quantum superposition, can in effect hold many networks and their patterns at once. Implementing quantum neural networks is not easy, however, because ultimately these algorithms need real quantum computers (processors) to run on.
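As an illustration only (not any group's actual model), a single "quantum neuron" can be sketched by simulating one qubit: the input and weight become rotation angles, and the probability of measuring |1> plays the role of a nonlinear activation. The function names and the encoding scheme here are our own assumptions:

```python
import numpy as np

def ry(theta):
    """Standard single-qubit rotation about the Y axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def quantum_neuron(x, w):
    """Toy 'neuron': encode input x and weight w as successive rotations on
    one qubit; P(measure |1>) = sin^2((x + w)/2) acts like an activation."""
    state = ry(w) @ ry(x) @ np.array([1.0, 0.0])  # start from |0>
    return state[1] ** 2

print(quantum_neuron(np.pi / 2, np.pi / 2))  # rotations sum to pi -> P(|1>) = 1
```

The nonlinearity here comes for free from the measurement statistics, which is one reason measurement-based activations appear in quantum neural network proposals.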
According to reports, in 2018, a research team at the University of Pavia in Italy realized the world's first single-layer neural network on a quantum computer with only four qubits.
(A quantum circuit implementing an artificial neuron with 4 qubits)
This model accurately reproduces the behavior of a single neuron, and a single-layer model like this can recognize simple patterns. It has not yet been extended to deep neural networks composed of multiple layers of neurons, but it is at least a first step toward effectively training quantum neural networks on quantum hardware.
Compared with traditional neural networks, quantum neural networks have many advantages, such as exponential memory capacity, faster learning and processing speed, smaller scale, and higher stability and reliability.
Although progress on quantum hardware is slower, theory can run ahead of it: Google's Quantum AI team has already constructed a theoretical model of a deep neural network that could be trained on a quantum computer.
And in March of this year, Google open-sourced TensorFlow Quantum (TFQ), a machine learning library for training quantum models. TFQ contains the basic building blocks a quantum computation requires, such as qubits, gates, circuits, and measurement operators; user-specified quantum computations can then be executed on simulators or on real hardware.
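TFQ itself depends on TensorFlow and Cirq, so rather than reproduce its API, here is a minimal NumPy sketch of the hybrid quantum-classical idea it embodies: a classical optimizer adjusts the parameter of a (simulated) quantum circuit, with gradients obtained from the parameter-shift rule. All names below are our own:

```python
import numpy as np

def ry(theta):
    """Standard single-qubit rotation about the Y axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expect_z(theta):
    """Forward pass: <Z> after Ry(theta)|0>, here just linear algebra.
    Analytically this equals cos(theta)."""
    state = ry(theta) @ np.array([1.0, 0.0])
    return state[0] ** 2 - state[1] ** 2

def grad(theta):
    """Parameter-shift rule: an exact gradient from two extra circuit runs."""
    return 0.5 * (expect_z(theta + np.pi / 2) - expect_z(theta - np.pi / 2))

# Classical gradient-descent loop driving the circuit to minimize <Z>.
theta = 0.1
for _ in range(100):
    theta -= 0.5 * grad(theta)

print(theta, expect_z(theta))  # theta converges to pi, <Z> to -1
```

The key point is that the "circuit" is only ever queried for expectation values, which is exactly how a hybrid model would use real quantum hardware in place of the simulator.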
At present, the development of quantum machine learning is still in its infancy; among the early applications, quantum neural networks have been used, for example, to generate new instruments that play entirely new sounds.
Future applications of quantum machine learning are even more exciting. For example, quantum neural networks with exponential storage and retrieval capabilities could simulate the human brain, or simulate black holes, helping humans truly explore the deepest nature of the world. This may be the ultimate value of quantum machine learning and of the field of quantum computing.
The growing pains of quantum machine learning
Quantum computing is itself a complex technology, and quantum machine learning, as a cross-disciplinary technology, is naturally even harder to develop. Its maturity depends on simultaneous progress in both software and hardware, and difficulties remain on both sides.
First of all, traditional machine learning code cannot be directly ported to quantum computers; current machine learning data and models must first be encoded into quantum states over qubits before a quantum neural network can be built.
This is the so-called I/O bottleneck of quantum machine learning: most current quantum machine learning algorithms either need to encode large-scale datasets into quantum states, or produce their solutions only as quantum states, so pre-processing at the input stage and post-processing at the information-extraction stage consume a great deal of time, sometimes offsetting the very time the quantum algorithm saves.
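The input side of this bottleneck can be illustrated with amplitude encoding, one common scheme for loading classical data into a quantum state: the vector must be padded to a power-of-two length and normalized, and preparing such a state on real hardware is itself expensive. A hypothetical sketch in plain NumPy:

```python
import numpy as np

def amplitude_encode(x):
    """Map a classical vector onto quantum-state amplitudes.
    The vector is zero-padded to the next power-of-two length (one amplitude
    per basis state of ceil(log2(len)) qubits) and L2-normalized so the
    squared amplitudes form a valid probability distribution."""
    x = np.asarray(x, dtype=float)
    dim = 1 << int(np.ceil(np.log2(len(x))))  # next power of two
    padded = np.zeros(dim)
    padded[:len(x)] = x
    return padded / np.linalg.norm(padded)

state = amplitude_encode([3.0, 4.0])
print(state)              # [0.6, 0.8], a valid single-qubit state
print(np.sum(state**2))   # 1.0
```

Note that normalization discards the overall scale of the data, and reading the answer back out requires repeated measurement; both are part of the I/O cost the text describes.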
(The central structure of IBM's 50-qubit computing system)
Secondly, a truly universal quantum computer has not yet appeared. Present-day quantum computers still have serious problems resisting noise and overcoming decoherence, and therefore fall short of the DiVincenzo criteria. This means the hardware needed to verify quantum machine learning algorithms in practice is still lacking, and most researchers can only run multi-qubit experiments on quantum simulators.
For example, Google's TFQ provides tools for quantum machine learning research targeting noisy intermediate-scale quantum (NISQ) processors of roughly 50 to 100 qubits, used to control or model natural or artificial quantum systems. On this basis, TFQ's quantum machine learning models can handle both quantum data and hybrid quantum-classical models, helping developers improve existing quantum algorithms or discover new ones.
Given these practical difficulties in software and hardware, breakthroughs in quantum machine learning algorithms still have a long way to go.
In addition, for some computational problems it remains doubtful whether quantum machine learning algorithms actually hold a speed advantage over classical ones. In 2018, Ewin Tang, an 18-year-old Chinese-American student, took inspiration from a quantum recommendation algorithm and designed a classical algorithm that solves the recommendation problem at a speed comparable to the quantum one. This gave researchers a lesson: thinking in quantum-algorithmic terms can also advance classical algorithms, which is another way quantum computing research proves its worth.
For now, though, quantum machine learning is neither as impressive as Google's claims last year suggested, nor as widespread in real life as AI. It is, rather, a computational technology for the world of the future.
Decades ago, both practical quantum computing and machine learning with neural networks were considered impossible. Both have since become reality, and now they are meeting at an intersection; for human technology, that alone is an adventure.
Standing at this moment, even an ordinary technology enthusiast can develop and test quantum algorithms on the open quantum computing platforms of Baidu, Google, and others. How rare, and how lucky, that is.