New Phys.: Sae Mulli 2020; 70: 885-895
Published online October 30, 2020 https://doi.org/10.3938/NPSM.70.885
Copyright © New Physics: Sae Mulli.
Xue-Mei CUI1, Donghyeon GIM2, Seung Kee HAN3*
1Normal College, Yanbian University, Jilin 133002, China
Recently, artificial neural networks (ANNs) have been applied to a wide range of tasks, including image classification, speech recognition, machine translation, and medical diagnosis. Practical applications employ networks composed of very large numbers of neurons, yet little guidance exists on how to choose the number of hidden layers or the number of neurons in a layer. In this paper, a graphical illustration of neural learning in simple multilayer perceptron (MLP) networks is presented. For XOR-like problems, the learning process corresponds to finding a line or surface that separates a set of states into two groups in a higher-dimensional space. We illustrate graphically how this bipartition problem depends on the number of neurons in a layer, and we also address the meaning of adding a layer to the network. We expect that this intuitive graphical understanding of increasing the number of neurons or layers in simple neural networks will be useful in constructing neural networks for practical problems.
Keywords: Neural networks, Multilayer perceptron, Neural learning, Graphical illustration
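As a minimal sketch of the geometric picture described in the abstract (this is illustrative code, not taken from the paper; the weight values are hand-chosen assumptions), a 2-2-1 perceptron with step activations solves XOR: the hidden layer maps the four inputs into a plane where a single line separates the two output classes.

```python
import numpy as np

def step(z):
    """Heaviside step activation: 1 if z > 0, else 0."""
    return (z > 0).astype(int)

# The four XOR inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

# Hand-chosen weights (illustrative, not learned):
# hidden unit h1 computes OR(x1, x2), h2 computes AND(x1, x2).
W1 = np.array([[1.0, 1.0],
               [1.0, 1.0]])
b1 = np.array([-0.5, -1.5])  # thresholds: 0.5 for OR, 1.5 for AND

# In the (h1, h2) plane the inputs become linearly separable:
# the output line h1 - h2 = 0.5 realizes "OR and not AND", i.e. XOR.
w2 = np.array([1.0, -1.0])
b2 = -0.5

H = step(X @ W1 + b1)      # hidden representation of the four inputs
y = step(H @ w2 + b2)      # single separating line in hidden space
print(y.tolist())          # -> [0, 1, 1, 0], i.e. XOR of the inputs
```

The point mirrors the paper's theme: no single line separates the XOR classes in the input plane, but after the hidden layer remaps the states, one line (one output neuron) suffices for the bipartition.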