New Phys.: Sae Mulli 2022; 72: 487-494
Published online July 31, 2022 https://doi.org/10.3938/NPSM.72.487
Copyright © New Physics: Sae Mulli.
Hyejin Kim*, Dongkyu Kim, Dong-Hee Kim†
Department of Physics and Photon Science, Gwangju Institute of Science and Technology, Gwangju 61005, Korea
This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0), which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.
We present a minimal neural network model that learns to classify the metallic and insulating phases from the real-frequency hybridization function computed in the dynamical mean-field theory for the repulsive Hubbard model at half-filling. The resulting neural network discriminates the phases essentially by reading off the presence of the quasiparticle peak. The pattern observed in the weight matrix of the neural connectivity allows us to write down a simple indicator that precisely detects the transition point using only the bath parameters that build the Anderson impurity model. The proposed transition indicator is very sensitive to the emergence of a zero-energy orbital in the quantum bath. We demonstrate the accuracy of the indicator in the discrete-bath description with a few orbitals used by the exact diagonalization solver.
Keywords: Neural network, Machine learning, Metal-insulator transition, Dynamical mean-field theory
Machine learning with a neural network has attracted increasing attention in various fields of science, including condensed matter and statistical physics [1-5]. The neural network usually works as a black-box model in which many hidden variables are trained on large input data samples to produce the desired predictions. The applicability of this data-driven approach has been extensively examined in various subjects, such as classifying phases of matter [6-12], accelerating numerical simulations [13-17], and approximating quantum wave functions [18-21]. On the other hand, dealing with the lack of transparency in the black-box model is another important direction of research. Understanding how the machine interprets the data and what particular information it extracts can potentially help to gain physical insight from such data-driven predictions. For instance, previous works studied the physical justification of machine predictions [22-27], the characterization of the phases of matter [28-31], and the extraction of an order parameter for phase transitions [32-38]. In this paper, we attempt to interpret the machine learning of a metal-insulator transition trained with dynamical mean-field theory data for the repulsive Hubbard model at half-filling.
The dynamical mean-field theory (DMFT) maps a lattice model of the Hubbard Hamiltonian onto the Anderson impurity model (AIM) with a quantum bath determined self-consistently. DMFT provides an exact solution in the limit of infinite dimensions and has successfully described the phase diagram of the metal-insulator transition in the single-band repulsive Hubbard model. Several machine learning schemes have been applied to DMFT and AIM [40-43]. In particular, very high accuracy in classifying the metallic and insulating phases was reported in previous supervised learning of the bath parameters representing the bare hybridization function, where the exact diagonalization was used to solve AIM. The classification was up to 99.6% accurate in that work, even though the exact diagonalization represents the quantum bath with only a few orbitals and thus at limited spectral resolution.
The metal-insulator transition in infinite dimensions has already been well established in DMFT through known indicators, such as the double occupancy and the emergence of the quasiparticle peak in the spectral gap. Performing machine-learning phase classification on such a thoroughly studied phenomenon may not be expected to reveal new physics. From the point of view of machine learning, however, it is still a good example system for studying how the machine understands the phenomenon. Our aim in this study is to open the black box and see how the data-driven predictions mimic the known physics. In particular, we want to see how the neural network extracts a relevant spectral feature to sharply detect the phase transition point, and to propose a simple order-parameter-like quantity based on the observed mechanism of the machine prediction.
Our strategy is to downsize the hidden layer of the feed-forward neural network to find the minimal representation of the machinery [26,27]. As the training dataset, we employ the real-frequency hybridization function computed using the numerical renormalization group (NRG) solver. It turns out that the phase classification does not lose accuracy even when we shrink the hidden layer to the minimum, which eventually becomes equivalent to logistic regression. The pattern observed in the weight matrix of the neural connectivity indicates that the trained neural network mainly detects the existence of the quasiparticle peak in the input spectral data. By analyzing the network output function, we find that a simple transition indicator can be written in terms of the bath parameters, directly applicable to the discrete-orbital formulation of AIM for the exact diagonalization. The transition points identified by this simple indicator agree well with the phase diagram constructed from the conventional quantity of double occupancy. The indicator inspired by machine learning is very sensitive to the presence of a zero-energy orbital in the quantum bath, which explains its ability to discriminate the phases across the metal-insulator transition.
First, let us briefly describe the DMFT phase diagram for the half-filled single-band Hubbard model with repulsive interactions in infinite dimensions [39,44]. The Hamiltonian of the single-band Hubbard model can be written in the standard form
$H = -t \sum_{\langle i,j \rangle, \sigma} ( c^\dagger_{i\sigma} c_{j\sigma} + \mathrm{h.c.} ) + U \sum_i n_{i\uparrow} n_{i\downarrow}$,
where $t$ is the nearest-neighbor hopping amplitude, $U > 0$ is the on-site repulsion, and $n_{i\sigma} = c^\dagger_{i\sigma} c_{i\sigma}$.
Local quantum fluctuations are fully considered in the conventional single-site DMFT by self-consistently mapping a lattice site of the Hubbard model onto a local impurity surrounded by a quantum bath. The resulting single-site AIM can be written in the standard form
$H_{\mathrm{AIM}} = \sum_{l,\sigma} \epsilon_l a^\dagger_{l\sigma} a_{l\sigma} + \sum_{l,\sigma} V_l ( a^\dagger_{l\sigma} c_{\sigma} + \mathrm{h.c.} ) + U n_{\uparrow} n_{\downarrow} - \mu ( n_{\uparrow} + n_{\downarrow} )$,
where $\epsilon_l$ and $V_l$ denote the energy of bath orbital $l$ and its hybridization amplitude with the impurity.
The hybridization function works as the Weiss mean field constructing the quantum bath of AIM, bridging the lattice model and the corresponding impurity model. On the Bethe lattice, the hybridization function reduces to the simple self-consistency relation $\Delta(\omega) = t^2 G(\omega)$, where $G(\omega)$ is the local lattice Green's function.
We prepare the training data set of
While the training data are prepared exclusively in the pure metallic and insulating regions, the test data set covers the coexistence area of the phase diagram. The test data set includes 229 samples representing the metal-to-insulator transition, generated at equally spaced
We perform supervised learning of a feed-forward neural network model to discriminate the metallic and insulating phases from the prepared input dataset. We consider a simple network structure with a single hidden layer, sketched in Fig. 2(a). The neural network receives the hybridization function
In the training process, the uniform and Xavier initializations are employed for the unknown weights and biases. The objective function to be minimized is the binary cross-entropy loss, and L2 regularization is used to prevent overfitting. Full-batch gradient descent is used for the optimization. The learning rate and the regularization coefficient are set to be
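As an illustration, a training setup of this kind can be sketched in a few lines of NumPy on toy stand-in data. All sizes, labels, and hyperparameter values below are hypothetical for the sketch, not those used in the paper:

```python
# Sketch: single-hidden-layer binary classifier trained by full-batch
# gradient descent on a binary cross-entropy loss with L2 regularization.
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the hybridization-function inputs: n_samples spectra
# sampled on n_omega frequency points (the real inputs come from DMFT).
n_samples, n_omega, n_hidden = 200, 64, 4
X = rng.normal(size=(n_samples, n_omega))
y = (X[:, n_omega // 2] > 0).astype(float)  # fake metal/insulator labels

# Xavier (Glorot) initialization of the two weight matrices.
W1 = rng.normal(0, np.sqrt(2.0 / (n_omega + n_hidden)), (n_omega, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, np.sqrt(2.0 / (n_hidden + 1)), (n_hidden, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr, l2 = 0.1, 1e-4  # learning rate and regularization coefficient (assumed)
for _ in range(2000):
    # Forward pass: sigmoid hidden layer, sigmoid output.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2).ravel()
    # Backward pass for binary cross-entropy + L2 penalty
    # (for sigmoid + BCE, the output-layer error is simply p - y).
    dout = (p - y)[:, None] / n_samples
    gW2 = h.T @ dout + l2 * W2
    gb2 = dout.sum(axis=0)
    dh = dout @ W2.T * h * (1 - h)
    gW1 = X.T @ dh + l2 * W1
    gb1 = dh.sum(axis=0)
    # Full-batch gradient-descent update.
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Final forward pass and training accuracy.
h = sigmoid(X @ W1 + b1)
p = sigmoid(h @ W2 + b2).ravel()
accuracy = ((p > 0.5) == (y > 0.5)).mean()
```

The no-hidden-layer limit discussed in the text is obtained by dropping the hidden layer entirely, i.e., taking the output directly as a sigmoid of a single weighted sum of the input.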
To find the simplest possible structure of the neural network, we observe how the prediction accuracy changes as the number of neurons in the hidden layer decreases. Figure 2(b) presents the test output
Because the weight matrix essentially governs the prediction if the presence of the hidden layer is not important, we analyze the feature of the neural connectivity in the weight matrix to understand the basis of the prediction made by the neural network. Figure 3 shows the pattern observed in the elements of the weight matrix with decreasing the size of the hidden layer, which becomes a simple function that may characterize the predicting power in the limit of the complete removal of the hidden layer. Figure 4 plots the weight matrix as a one-dimensional function of the frequency for the case of no hidden layer that gives the logistic regression.
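In this no-hidden-layer limit, the readout is just a sigmoid of an inner product between the learned weight vector and the input spectrum. A toy sketch (with hypothetical weights and spectra, not the trained ones) illustrates how a weight vector peaked at zero frequency acts as a quasiparticle-peak detector:

```python
# Sketch: logistic-regression readout y = sigmoid(w . delta + b).
import numpy as np

def network_output(delta, w, b):
    """Logistic-regression output: sigmoid of the inner product plus bias."""
    return 1.0 / (1.0 + np.exp(-(np.dot(w, delta) + b)))

omega = np.linspace(-1, 1, 101)
w = np.exp(-(omega / 0.05) ** 2)        # weights concentrated near omega = 0
b = -2.0                                 # hypothetical bias

# Spectrum with a zero-frequency (quasiparticle) peak vs. a gapped spectrum.
metal = np.exp(-(omega / 0.1) ** 2)
insulator = np.exp(-((np.abs(omega) - 0.8) / 0.1) ** 2)

print(network_output(metal, w, b) > network_output(insulator, w, b))  # True
```

The peaked weight vector overlaps strongly with the zero-frequency spectral weight of the metallic input and barely at all with the gapped one, pushing the two outputs to opposite sides of the sigmoid.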
In the logistic regression model with the weights in Fig. 4, the network output can be written simply as the inner product of two vectors as
where the simpler notation is introduced as
The input data is given on logarithmic-scale grids in the frequency domain, and thus the density of data points on the linear scale is proportional to 1/ω,
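This scaling of the grid density can be checked numerically; the sketch below assumes an illustrative log-spaced grid, not the actual grid used in the calculations:

```python
# Check that a logarithmically spaced frequency grid has a point density
# proportional to 1/omega on the linear scale.
import numpy as np

omega = np.logspace(-4, 0, 401)   # log-spaced grid over [1e-4, 1]
spacing = np.diff(omega)          # local linear-scale spacing
density = 1.0 / spacing           # points per unit frequency
# density * omega should be (approximately) constant across the grid:
ratio = density * omega[:-1]
print(ratio.max() / ratio.min())  # close to 1
```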
which becomes even simpler with the observed features in
Figure 4 presents
which we know is drastically different between the metallic and insulating phases. It is well known that the presence of the quasiparticle peak at zero frequency distinguishes the metallic phase from the insulating phase. Thus, the
omitting the neural network contribution
In the metallic phase,
Because the proposed indicator
We have investigated the supervised learning of the metal-insulator transition with the DMFT data of the hybridization function computed using the NRG solver. To interpret the data-driven prediction, we have focused on a feed-forward neural network with a single hidden layer and have decreased the size of the hidden layer to find a transparent minimal structure. It turns out that the accuracy of the transition-point identification is not affected by the size of the hidden layer. The minimal structure can be constructed without the hidden layer, becoming equivalent to the logistic regression model in which the weight matrix governs the prediction. By analyzing the observed pattern of the weight matrix, we have found that the neural network mainly reads the presence of the quasiparticle peak, a functionality that can be implemented even without the complex structure of the hidden layer. Keeping the essence of the mathematical structure of the neural network output, we have proposed a simple indicator of the metal-insulator transition as a function of the discrete bath parameters. The accuracy of the proposed indicator is numerically verified in DMFT calculations with the ED solver on various lattices.