Machine learning and physics
Editorial for the Special Topic: Machine Learning and Physics
DOI: 10.7498/aps.70.140101
Machine learning, especially deep learning, has achieved remarkable success in many areas and is currently one of the hottest and fastest-developing directions in science and technology. Its combination with physics is an emerging interdisciplinary frontier that has attracted broad attention in recent years. On the one hand, machine learning methods can solve complex physical problems that are difficult or impossible to address with traditional approaches; on the other hand, concepts, theories, and methods from physics can be used to study machine learning. The cross-fertilization of the two fields brings new opportunities and challenges and will greatly promote the development of both.
This special topic invites experts active in this emerging field to contribute articles highlighting selected international frontier topics and the latest research progress at the intersection of machine learning and physics. The contents cover adversarial learning in quantum artificial intelligence, quantum generative models, machine learning based on wave and diffusion systems, automatic differentiation, adiabatic quantum algorithm design, encoding and initial-state preparation in quantum machine learning, and experimental progress in quantum machine learning based on spin systems.
We hope this special topic helps readers understand the research content, basic ideas and methods, latest progress, and the challenges and opportunities at the intersection of machine learning and physics. We also hope it stimulates readers' interest and attracts more researchers to this interdisciplinary field.

2021, 70 (14): 144204.
doi: 10.7498/aps.70.20210879
Abstract:
Recently, the application of physics to machine learning and the interdisciplinary convergence of the two fields have attracted wide attention. This paper explores the internal relationship between physical systems and machine learning, with an eye to advancing both machine learning algorithms and their physical implementation. We summarize research on machine learning in wave systems and diffusion systems, and introduce some of the latest results. We first discuss the realization of supervised learning in wave systems, including wave-optics implementations of neural networks, wave realizations of quantum search, recurrent neural networks based on wave systems, and neuromorphic nonlinear wave computation. We then discuss machine learning algorithms inspired by diffusion systems, such as classification based on diffusion dynamics, data mining and information filtering based on thermal diffusion, and optimization searches based on population diffusion. The physical mechanisms of diffusion systems can inspire efficient machine learning algorithms for classification and optimization in complex systems and in physics research, which may open a new vision for physics-inspired algorithms, hardware implementation, and even the integration of software and hardware.
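As an illustrative aside (a toy construction, not drawn from the paper itself), the idea of classification based on diffusion dynamics can be sketched as label propagation on a similarity graph: each class seeds a "heat field" at its labeled nodes, heat diffuses along edges, and each unlabeled node takes the class whose diffused heat is largest. The graph and seeds below are made up for the example.

```python
# Toy sketch of a diffusion-dynamics classifier: labels spread over a
# similarity graph like heat, and each unlabeled node takes the label
# whose diffused "temperature" is highest at that node.

def diffuse_labels(adjacency, labels, steps=50, alpha=0.5):
    """adjacency: dict node -> list of neighbors; labels: dict node -> class
    for the few labeled seed nodes. Returns a predicted class per node."""
    classes = sorted(set(labels.values()))
    # one heat field per class, seeded at the labeled nodes
    heat = {c: {n: 1.0 if labels.get(n) == c else 0.0 for n in adjacency}
            for c in classes}
    for _ in range(steps):
        for c in classes:
            new = {}
            for n, nbrs in adjacency.items():
                avg = sum(heat[c][m] for m in nbrs) / len(nbrs)
                new[n] = (1 - alpha) * heat[c][n] + alpha * avg
            # clamp the seeds so the sources keep injecting heat
            for n, lab in labels.items():
                new[n] = 1.0 if lab == c else 0.0
            heat[c] = new
    return {n: max(classes, key=lambda c: heat[c][n]) for n in adjacency}

# two 3-node clusters joined by a single weak edge; one seed per cluster
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
         3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
pred = diffuse_labels(graph, {0: "A", 5: "B"})
```

Because node 2 sits nearer the "A" seed and node 3 nearer the "B" seed, the diffused heat splits the graph along the weak bridge edge, which is exactly the behavior a diffusion-based classifier exploits.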

2021, 70 (14): 140304.
doi: 10.7498/aps.70.20210930
Abstract:
In recent years, generative machine learning algorithms such as generative adversarial networks, Boltzmann machines, and auto-encoders have been widely used for data generation and probability-distribution simulation. Meanwhile, algorithms combining quantum computation with classical machine learning have been proposed in various forms. In particular, quantum generative models, regarded as a branch of quantum machine learning, have been studied extensively. Quantum generative models are hybrid quantum-classical algorithms: parameterized quantum circuits are used to evaluate the cost function of the task as well as its gradient, and classical optimization algorithms then search for the optimum. Compared with their classical counterparts, quantum generative models map the data to a high-dimensional Hilbert space via parameterized quantum circuits; in this mapped space, data features can be easier to learn, allowing quantum generative models to surpass classical ones on some tasks. Moreover, quantum generative models have the potential to realize a quantum advantage on noisy intermediate-scale quantum devices.
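The hybrid quantum-classical loop described above can be illustrated with a deliberately minimal single-qubit example (an assumption of ours, not any specific model from the paper): an Ry(θ) circuit defines the output distribution p(0) = cos²(θ/2), p(1) = sin²(θ/2), the gradient of the circuit expectation ⟨Z⟩ is obtained exactly with the standard parameter-shift rule, and a classical gradient-descent optimizer fits θ so the circuit reproduces a target distribution.

```python
import math

# Minimal single-qubit generative-model sketch: fit Ry(theta) so that its
# measurement distribution matches a target, using the parameter-shift
# rule for the gradient and classical gradient descent for the update.

def expval_z(theta):
    # <Z> for the state Ry(theta)|0> = cos(theta/2)|0> + sin(theta/2)|1>
    return math.cos(theta)

def probs(theta):
    ez = expval_z(theta)
    return [(1 + ez) / 2, (1 - ez) / 2]   # p(0), p(1)

def loss(theta, target):
    # squared error between the circuit and target distributions
    return sum((p - t) ** 2 for p, t in zip(probs(theta), target))

def grad(theta, target, shift=math.pi / 2):
    # parameter-shift rule: d<Z>/dtheta = (<Z>(t+s) - <Z>(t-s)) / 2, s = pi/2
    dz = (expval_z(theta + shift) - expval_z(theta - shift)) / 2
    p = probs(theta)
    # chain rule through p(0) = (1+<Z>)/2 and p(1) = (1-<Z>)/2
    return (2 * (p[0] - target[0]) * (dz / 2)
            + 2 * (p[1] - target[1]) * (-dz / 2))

target = [0.25, 0.75]          # distribution the circuit should generate
theta = 0.3
for _ in range(200):           # classical optimizer: plain gradient descent
    theta -= 0.5 * grad(theta, target)
```

After training, θ converges near 2π/3, where cos²(θ/2) = 0.25. Real quantum generative models replace the analytic ⟨Z⟩ with expectation values estimated from circuit measurements, but the parameter-shift gradient and the outer classical loop have exactly this structure.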

2021, 70 (14): 140306.
doi: 10.7498/aps.70.20210831
Abstract:
Quantum computing has made dramatic progress in the last decade. Quantum platforms including superconducting qubits, photonic devices, and atomic ensembles have all entered a new era, with unprecedented quantum control capabilities. Quantum computational advantage over classical computers has been reported for certain computation tasks. A promising protocol for harnessing the computational power of these controllable quantum devices is quantum adiabatic computing, in which quantum algorithm design plays an essential role in fully exploiting the quantum advantage. In this paper, we review recent developments in using machine learning to design quantum adiabatic algorithms, and discuss applications to 3-SAT problems and the Grover search problem.
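To see why schedule design matters in adiabatic algorithms, consider the textbook two-level reduction of adiabatic Grover search (a standard derivation, not specific to the works reviewed here): in the subspace spanned by the uniform state and the marked state, the interpolating Hamiltonian H(s) = -(1-s)|ψ⟩⟨ψ| - s|m⟩⟨m| has a gap with the closed form below, minimized at s = 1/2 where it equals 1/√N.

```python
import math

# Spectral gap of the two-level adiabatic Grover Hamiltonian as a
# function of the schedule parameter s in [0, 1] and the database size N.

def gap(s, n_items):
    # instantaneous gap: sqrt(1 - 4 s (1 - s) (1 - 1/N))
    return math.sqrt(1 - 4 * s * (1 - s) * (1 - 1 / n_items))

def min_gap(n_items):
    # the gap is smallest at s = 1/2, where it equals 1/sqrt(N)
    return gap(0.5, n_items)
```

A naive linear schedule s(t) = t/T must run for T ∝ 1/Δ²min = N, giving no speedup, whereas a schedule that slows down only near the minimum-gap region recovers the optimal T ∝ √N. Finding such schedules automatically is precisely the kind of design task the machine learning approaches reviewed in the paper address.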

2021, 70 (14): 149402.
doi: 10.7498/aps.70.20210813
Abstract:
Automatic differentiation is a technique for differentiating a computer program automatically. It is known to many for its use in machine learning over recent decades. Researchers are now becoming increasingly aware of its importance in scientific computing, especially in physics simulation. Differentiating physics simulations can help us solve important problems in chaos theory, electromagnetism, seismology, and oceanography. It is also challenging, because these applications often demand large amounts of computing time and memory. This paper reviews several automatic-differentiation strategies for physics simulation and compares their advantages and disadvantages, covering adjoint state methods, forward-mode automatic differentiation, reverse-mode automatic differentiation, and reversible-programming automatic differentiation.
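Of the strategies listed, forward-mode automatic differentiation is the easiest to demonstrate compactly: each value carries a pair (value, derivative), and arithmetic propagates both parts by the chain rule. The dual-number sketch below is illustrative only; production systems add many more operations and, for the reverse mode, a recorded computation graph.

```python
import math

# Minimal forward-mode automatic differentiation with dual numbers:
# every quantity is (val, dot), and each operation updates both by the
# chain rule, so the derivative emerges alongside the value.

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def dsin(x):
    # sin lifted to dual numbers: (sin v, cos v * v')
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def derivative(f, x):
    # seed dx/dx = 1 and read the derivative off the dual part
    return f(Dual(x, 1.0)).dot

# d/dx [x * sin(x) + 3] at x = 2, i.e. sin(x) + x cos(x)
g = derivative(lambda x: x * dsin(x) + 3.0, 2.0)
```

Forward mode costs one pass per input variable, which is why physics simulations with few parameters and many outputs favor it, while many-parameter problems favor reverse mode or adjoint methods.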

2021, 70 (14): 140307.
doi: 10.7498/aps.70.20210958
Abstract:
The development of traditional classical computers relies on the transistor structure of microchips, which has advanced in accordance with Moore's law. As the spacing between transistors approaches the physical limit of the manufacturing process, the growth of classical computing power will encounter a bottleneck. At the same time, the development of machine learning is driving rapidly growing demand for computing power, making the contradiction between capability and demand increasingly prominent. As a new computing model, quantum computing is significantly faster than classical computing for some specific problems, and is therefore expected to provide sufficient computing power for machine learning. When quantum computing is used for machine learning tasks, the first basic problem is how to represent classical data effectively in a quantum system; this is known as the state preparation problem. In this paper, we review research on state preparation, introduce the various schemes proposed to date, describe how these schemes are realized, and summarize and analyze their complexities. Finally, we present some prospects for future work on state preparation.
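One widely used family of state-preparation schemes is amplitude encoding (our choice of example; the paper surveys several schemes): a length-2ⁿ classical vector is normalized into the amplitudes of an n-qubit state, and the Ry rotation angles of a binary-tree preparation circuit can be read off recursively, each node splitting its probability mass between the |0⟩ and |1⟩ halves of its block.

```python
import math

# Sketch of amplitude encoding: normalize classical data into quantum
# amplitudes, then derive the Ry angles of a binary-tree preparation
# circuit for real, non-negative data.

def amplitude_encode(data):
    norm = math.sqrt(sum(x * x for x in data))
    return [x / norm for x in data]

def tree_angles(amps):
    """Ry angles level by level: each node splits its probability mass
    between the left (|0>) and right (|1>) halves of its block."""
    angles = []
    level = [amps]
    while len(level[0]) > 1:
        next_level, level_angles = [], []
        for block in level:
            half = len(block) // 2
            left, right = block[:half], block[half:]
            p_left = sum(a * a for a in left)
            p_right = sum(a * a for a in right)
            # Ry(theta) puts cos^2(theta/2) of the mass on the left half
            theta = 2 * math.atan2(math.sqrt(p_right), math.sqrt(p_left))
            level_angles.append(theta)
            next_level += [left, right]
        angles.append(level_angles)
        level = next_level
    return angles

amps = amplitude_encode([3.0, 1.0, 1.0, 1.0])   # squared amplitudes sum to 1
angles = tree_angles(amps)                       # one angle, then two
```

This tree construction uses 2ⁿ - 1 rotations (plus their controls), which reflects the generic exponential cost of exact amplitude encoding that the complexity analyses in the paper quantify and that more specialized schemes try to avoid.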

2021, 70 (14): 140305.
doi: 10.7498/aps.70.20210684
Abstract:
Machine learning is widely applied in various areas owing to its advantages in pattern recognition, but it is severely restricted by the computing power of classical computers. In recent years, with the rapid development of quantum technology, quantum machine learning has been verified experimentally in many quantum systems and has exhibited great advantages over classical algorithms for certain specific problems. In this review, we mainly introduce two typical spin systems, nuclear magnetic resonance and nitrogen-vacancy centers in diamond, and survey some representative quantum machine learning experiments carried out in these systems in recent years.

2021, 70 (14): 140302.
doi: 10.7498/aps.70.20210789
Abstract:
Quantum artificial intelligence exploits the interplay between artificial intelligence and quantum physics: on the one hand, a plethora of tools and ideas from artificial intelligence can be adopted to tackle intricate quantum problems; on the other hand, quantum computing could bring unprecedented opportunities to enhance, speed up, or innovate artificial intelligence. Yet quantum learning systems, like classical ones, may also suffer adversarial attacks: adding a tiny, carefully crafted perturbation to legitimate input data can cause a system to make incorrect predictions with notably high confidence. In this paper, we introduce the basic concepts and ideas of classical and quantum adversarial learning, together with some recent advances along this line. Through concrete examples involving classification of phases of the two-dimensional Ising model and of three-dimensional chiral topological insulators, we reveal the vulnerability of classical machine-learning approaches to phases of matter. We also demonstrate the vulnerability of quantum classifiers with the example of classifying hand-written digit images. We theoretically elucidate the celebrated no-free-lunch theorem from both classical and quantum perspectives, and discuss the universality properties of adversarial attacks on quantum classifiers. Finally, we discuss possible defense strategies. The study of adversarial learning in quantum artificial intelligence uncovers notable potential risks for quantum intelligence systems, with far-reaching consequences for the future interactions between the two areas.
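The classical attack underlying this line of work can be shown in a few lines with the fast-gradient-sign method on a hand-built logistic classifier (the weights and inputs below are invented for illustration): perturbing each feature by ε in the direction of the loss gradient's sign flips the prediction even though every individual change is tiny.

```python
import math

# FGSM-style adversarial perturbation on a toy logistic classifier:
# moving the input by eps * sign(dL/dx) flips the predicted class.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, b, x):
    # probability of class 1 under the logistic model
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

def fgsm(w, b, x, y, eps):
    # for logistic loss, dL/dx_i = (p - y) * w_i, so the attack moves
    # each feature by eps in the sign of that gradient component
    p = predict(w, b, x)
    return [xi + eps * math.copysign(1.0, (p - y) * wi)
            for xi, wi in zip(x, w)]

w, b = [2.0, -1.0], 0.0        # made-up classifier weights
x = [0.3, 0.2]                 # legitimate input, predicted class 1
x_adv = fgsm(w, b, x, 1.0, 0.2)  # perturbation of at most 0.2 per feature
```

Quantum adversarial attacks follow the same logic, replacing the input vector by an encoded quantum state and the model gradient by gradients of the quantum classifier's measurement outcomes.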