The world as a neural network

Vitaly Vanchurin

Research output: Contribution to journal › Article › peer-review

Abstract

We discuss a possibility that the entire universe on its most fundamental level is a neural network. We identify two different types of dynamical degrees of freedom: ‘trainable’ variables (e.g., bias vector or weight matrix) and ‘hidden’ variables (e.g., state vector of neurons). We first consider stochastic evolution of the trainable variables to argue that near equilibrium their dynamics is well approximated by Madelung equations (with free energy representing the phase) and further away from equilibrium by Hamilton–Jacobi equations (with free energy representing Hamilton’s principal function). This shows that the trainable variables can indeed exhibit classical and quantum behaviors, with the state vector of neurons representing the hidden variables. We then study stochastic evolution of the hidden variables by considering D non-interacting subsystems with average state vectors x̄1, …, x̄D and an overall average state vector x̄0. In the limit when the weight matrix is a permutation matrix, the dynamics of x̄m can be described in terms of relativistic strings in an emergent D + 1 dimensional Minkowski space-time. If the subsystems are minimally interacting, with interactions described by a metric tensor, then the emergent space-time becomes curved. We argue that the entropy production in such a system is a local function of the metric tensor, which should be determined by the symmetries of the Onsager tensor. It turns out that a very simple and highly symmetric Onsager tensor leads to entropy production described by the Einstein–Hilbert term. This shows that the learning dynamics of a neural network can indeed exhibit approximate behaviors described by both quantum mechanics and general relativity. We also discuss a possibility that the two descriptions are holographic duals of each other.
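To make the abstract’s central distinction concrete, the following is a minimal toy sketch (not the paper’s actual model) of a network with the two kinds of dynamical degrees of freedom it names: ‘trainable’ variables (weight matrix and bias vector) undergoing a stochastic drift, and ‘hidden’ variables (the state vector of neurons) evolving under the activation dynamics. All names and update rules here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 4                        # number of neurons (arbitrary toy size)
W = rng.normal(size=(N, N))  # 'trainable' variable: weight matrix
b = rng.normal(size=N)       # 'trainable' variable: bias vector
x = rng.normal(size=N)       # 'hidden' variable: state vector of neurons

def step_hidden(x, W, b):
    """Deterministic activation dynamics of the hidden variables."""
    return np.tanh(W @ x + b)

def step_trainable(W, b, noise=1e-3):
    """Pure-noise drift standing in for the stochastic learning
    dynamics of the trainable variables discussed in the abstract."""
    W = W + noise * rng.normal(size=W.shape)
    b = b + noise * rng.normal(size=b.shape)
    return W, b

# Alternate the two kinds of evolution.
for _ in range(10):
    x = step_hidden(x, W, b)
    W, b = step_trainable(W, b)

print(x.shape)  # (4,)
```

The separation of time scales implied here (hidden variables equilibrate quickly, trainable variables drift slowly and stochastically) mirrors the setup the abstract analyzes, but the specific update rules are placeholders.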

Original language: English (US)
Article number: 1210
Pages (from-to): 1–20
Number of pages: 20
Journal: Entropy
Volume: 22
Issue number: 11
DOIs
State: Published - Nov 2020

Bibliographical note

Funding Information:
Funding: This research received no external funding.
Acknowledgments: This work was supported in part by the Foundational Questions Institute (FQXi).
Conflicts of Interest: The author declares no conflict of interest.

Publisher Copyright:
© 2020 by the authors. Licensee MDPI, Basel, Switzerland.

Keywords

  • General relativity
  • Machine learning
  • Quantum mechanics
  • Thermodynamics of learning

PubMed: MeSH publication types

  • Journal Article

