Abstract
Harnessing modern computational abilities for many-body wave-function
representations is a prominent avenue in contemporary condensed matter
physics. Of particular interest are highly expressive computational schemes
that can efficiently represent the entanglement properties of many-particle
systems. In the seemingly unrelated field of
machine learning, deep network architectures have exhibited an unprecedented
ability to tractably encompass the dependencies characterizing hard learning
tasks such as image classification. However, key questions regarding deep
learning architecture design still have no adequate theoretical answers. In
this paper, we establish a Tensor Network (TN) based common language between
the two disciplines, which allows us to offer bidirectional contributions. By
showing that many-body wave-functions are structurally equivalent to mappings
of Convolutional Arithmetic Circuits (ConvACs) and Recurrent Arithmetic
Circuits (RACs), we construct their TN equivalents, and suggest quantum
entanglement measures as natural quantifiers of dependencies in such networks.
Accordingly, we propose a novel entanglement-based deep learning design scheme.
In the other direction, we identify that an inherent re-use of information in
state-of-the-art deep learning architectures is a key trait that distinguishes
them from standard TNs. Therefore, we employ a TN manifestation of information
re-use and construct TNs corresponding to powerful architectures such as deep
recurrent and overlapping convolutional networks. This allows us to demonstrate
that the entanglement scaling supported by state-of-the-art deep learning
architectures matches that of the Multiscale Entanglement Renormalization
Ansatz (MERA) TN in 1D, and that they support volume-law entanglement in 2D
polynomially more efficiently than Restricted Boltzmann Machines (RBMs). We
thus provide
theoretical motivation to shift trending neural-network-based wave-function
representations closer to state-of-the-art deep learning architectures.