Statistical Field Theory for Neural Networks

Helias, Moritz, Dahmen, David

Product Description

I. Introduction
II. Probabilities, moments, cumulants
   A. Probabilities, observables, and moments
   B. Transformation of random variables
   C. Cumulants
   D. Connection between moments and cumulants
III. Gaussian distribution and Wick's theorem
   A. Gaussian distribution
   B. Moment and cumulant generating function of a Gaussian
   C. Wick's theorem
   D. Graphical representation: Feynman diagrams
   E. Appendix: Self-adjoint operators
   F. Appendix: Normalization of a Gaussian
IV. Perturbation expansion
   A. General case
   B. Special case of a Gaussian solvable theory
   C. Example: "phi 3 + phi 4" theory
   D. External sources
   E. Cancellation of vacuum diagrams
   F. Equivalence of graphical rules for n-point correlation and n-th moment
   G. Example: "phi 3 + phi 4" theory
V. Linked cluster theorem
   A. General proof of the linked cluster theorem
   B. Dependence on j - external sources - two complementary views
   C. Example: Connected diagrams of the "phi 3 + phi 4" theory
VI. Functional preliminaries
   A. Functional derivative
      1. Product rule
      2. Chain rule
      3. Special case of the chain rule: Fourier transform
   B. Functional Taylor series
VII. Functional formulation of stochastic differential equations
   A. Onsager-Machlup path integral*
   B. Martin-Siggia-Rose-De Dominicis-Janssen (MSRDJ) path integral
   C. Moment generating functional
   D. Response function in the MSRDJ formalism
VIII. Ornstein-Uhlenbeck process: The free Gaussian theory
   A. Definition
   B. Propagators in time domain
   C. Propagators in Fourier domain
IX. Perturbation theory for stochastic differential equations
   A. Vanishing moments of response fields
   B. Vanishing response loops
   C. Feynman rules for SDEs in time domain and frequency domain
   D. Diagrams with more than a single external leg
   E. Appendix: Unitary Fourier transform
X. Dynamic mean-field theory for random networks
   A. Definition of the model and generating functional
   B. Property of self-averaging
   C. Average over the quenched disorder
   D. Stationary statistics: Self-consistent autocorrelation as motion of a particle in a potential
   E. Transition to chaos
   F. Assessing chaos by a pair of identical systems
   G. Schrödinger equation for the maximum Lyapunov exponent
   H. Condition for transition to chaos
XI. Vertex generating function
   A. Motivating example for the expansion around a non-vanishing mean value
   B. Legendre transform and definition of the vertex generating function Gamma
   C. Perturbation expansion of Gamma
   D. Generalized one-line irreducibility
   E. Example
   F. Vertex functions in the Gaussian case
   G. Example: Vertex functions of the "phi 3 + phi 4" theory
   H. Appendix: Explicit cancellation until second order
   I. Appendix: Convexity of W
   J. Appendix: Legendre transform of a Gaussian
XII. Application: TAP approximation
   Inverse problem
XIII. Expansion of cumulants into tree diagrams of vertex functions
   A. Self-energy or mass operator Sigma
XIV. Loopwise expansion of the effective action - Tree level
   A. Counting the number of loops
   B. Loopwise expansion of the effective action - Higher numbers of loops
   C. Example: "phi 3 + phi 4" theory
   D. Appendix: Equivalence of loopwise expansion and
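As a flavor of the material, Chapter VIII treats the Ornstein-Uhlenbeck process as the exactly solvable "free" Gaussian theory. A minimal sketch (not taken from the book; parameter names are illustrative) simulates the process dx = -x/tau dt + sqrt(2D) dW by Euler-Maruyama and checks its stationary variance against the known value D*tau:

```python
import numpy as np

# Illustrative sketch, not the book's code: Euler-Maruyama integration of
# the Ornstein-Uhlenbeck process dx = -x/tau dt + sqrt(2*D) dW.
rng = np.random.default_rng(0)
tau, D = 1.0, 0.5            # relaxation time and noise intensity (assumed values)
dt, n_steps = 1e-3, 500_000  # time step and number of steps

x = np.empty(n_steps)
x[0] = 0.0
for t in range(n_steps - 1):
    # drift pulls x back to zero; diffusion adds Gaussian noise of variance 2*D*dt
    x[t + 1] = x[t] - x[t] / tau * dt + np.sqrt(2 * D * dt) * rng.standard_normal()

# After a burn-in, the empirical variance should approach the stationary
# value D * tau = 0.5 of the OU process.
var_est = x[n_steps // 2:].var()
print(var_est)
```

The printed variance fluctuates around D*tau = 0.5; the agreement improves with longer simulations since successive samples decorrelate on the timescale tau.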

About the Authors

Moritz Helias is a group leader at the Jülich Research Centre and assistant professor in the Department of Physics of RWTH Aachen University, Germany. He obtained his diploma in theoretical solid-state physics at the University of Hamburg and his PhD in computational neuroscience at the University of Freiburg, Germany. Post-doctoral positions at RIKEN in Wako-shi, Japan, and at the Jülich Research Centre followed. His main research interests are the dynamics and function of neuronal networks, and their quantitative analysis with tools from statistical physics and field theory.

David Dahmen is a post-doctoral researcher at the Institute of Neuroscience and Medicine at the Jülich Research Centre, Germany. He obtained his Master's degree in physics from RWTH Aachen University, Germany, working on effective field theory approaches to particle physics. He then moved to the field of computational neuroscience, where he received his PhD in 2017. His research comprises the modeling, analysis, and simulation of recurrent neuronal networks, with a special focus on the development and knowledge transfer of mathematical tools and simulation concepts. His main interests are field-theoretic methods for random neural networks, correlations in recurrent networks, and modeling of the local field potential.