School of Mathematical and Statistical Sciences Faculty Publications and Presentations
Document Type
Article
Publication Date
5-10-2023
Abstract
Understanding how deep learning architectures work is a central scientific problem. Recently, a correspondence between neural networks (NNs) and Euclidean quantum field theories has been proposed. This work investigates this correspondence in the framework of p-adic statistical field theories (SFTs) and neural networks. In this case, the fields are real-valued functions defined on an infinite regular rooted tree with valence p, a fixed prime number. This infinite tree provides the topology for a continuous deep Boltzmann machine (DBM), which is identified with a statistical field theory on this infinite tree. In the p-adic framework, there is a natural method to discretize SFTs. Each discrete SFT corresponds to a Boltzmann machine with a tree-like topology. This method allows us to recover the standard DBMs and gives new convolutional DBMs. The new networks use O(N) parameters while the classical ones use O(N²) parameters.
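The parameter-count comparison at the end of the abstract can be illustrated with a minimal sketch. The code below is not the authors' construction; it simply contrasts a fully connected coupling between two layers of N units (an N × N weight matrix, O(N²) parameters) with a translation-invariant coupling whose weights depend only on the index difference, used here as an ordinary cyclic-convolution stand-in for the p-adic convolution in the paper (a single length-N kernel, O(N) parameters). The valence p and truncation level used below are hypothetical choices for illustration.

```python
import numpy as np

def dense_coupling_params(n_visible: int, n_hidden: int) -> int:
    """Weights in a fully connected DBM coupling: one per (visible, hidden) pair."""
    return n_visible * n_hidden

def convolutional_coupling_params(n: int) -> int:
    """Weights in a translation-invariant (convolutional) coupling: one kernel value
    per index difference."""
    return n

def convolutional_weight_matrix(kernel: np.ndarray) -> np.ndarray:
    """Expand a length-N kernel into the full N x N coupling matrix it induces,
    with entry (i, j) given by kernel[(i - j) mod N]."""
    n = len(kernel)
    return np.array([[kernel[(i - j) % n] for j in range(n)] for i in range(n)])

if __name__ == "__main__":
    p, level = 3, 4            # hypothetical tree valence and discretization level
    n = p ** level             # number of discretization points
    print(dense_coupling_params(n, n))        # 6561, grows like N^2
    print(convolutional_coupling_params(n))   # 81, grows like N
    kernel = np.random.default_rng(0).normal(size=n)
    W = convolutional_weight_matrix(kernel)
    assert W.shape == (n, n)                  # full coupling recovered from N numbers
```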
Recommended Citation
Zúñiga-Galindo, Wilson A., Cuiyu He, and B. A. Zambrano-Luna. "p-Adic statistical field theory and convolutional deep Boltzmann machines." Progress of Theoretical and Experimental Physics 2023, no. 6 (2023): 063A01. https://doi.org/10.1093/ptep/ptad061
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.
Publication Title
Progress of Theoretical and Experimental Physics
DOI
https://doi.org/10.1093/ptep/ptad061
Comments
© The Author(s) 2023.