A Study of Field Theories Via Neural Networks /

Record type: Bibliographic – electronic resource : Monograph/item
Title / Author: A Study of Field Theories Via Neural Networks / Anindita Maiti.
Author: Maiti, Anindita
Extent: 1 electronic resource (229 pages)
Note: Source: Dissertations Abstracts International, Volume: 84-10, Section: B.
Contained by: Dissertations Abstracts International, 84-10B.
Subject: Theoretical physics.
Electronic resource: https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30419826
ISBN: 9798379417994
We propose a theoretical understanding of neural networks in terms of Wilsonian effective field theory. The correspondence relies on the fact that many asymptotic neural networks are drawn from Gaussian processes (GPs), the analog of non-interacting field theories. Moving away from the asymptotic limit yields a non-Gaussian process and corresponds to turning on particle interactions, allowing for the computation of correlation functions of neural network outputs with Feynman diagrams. Minimal non-Gaussian process likelihoods are determined by the most relevant non-Gaussian terms, according to the flow in their coefficients induced by the Wilsonian renormalization group. This yields a direct connection between overparameterization and simplicity of neural network likelihoods. Whether the coefficients are constants or functions may be understood in terms of GP limit symmetries, as expected from 't Hooft's technical naturalness. General theoretical calculations are matched to neural network experiments in the simplest class of models allowing the correspondence. Our formalism is valid for any of the many architectures that become a GP in an asymptotic limit, a property preserved under certain types of training.

Parameter space and function space provide two different duality frames in which to study neural networks. We demonstrate that symmetries of network densities may be determined via dual computations of network correlation functions, even when the density is unknown and the network is not equivariant. Symmetry-via-duality relies on invariance properties of the correlation functions, which stem from the choice of network parameter distributions. Input and output symmetries of neural network densities are determined, recovering known Gaussian process results in the infinite-width limit. The mechanism may also be utilized to determine symmetries during training, when parameters are correlated, as well as symmetries of the Neural Tangent Kernel. We demonstrate that the amount of symmetry in the initialization density affects the accuracy of networks trained on Fashion-MNIST, and that symmetry breaking helps only when it is in the direction of ground truth.

We study the origin of non-Gaussianities in neural network field densities and demonstrate two distinct methods to constrain them systematically. As examples, we engineer a few nonperturbative neural network field distributions. Lastly, we demonstrate a measure for the locality of neural network actions via cluster decomposition of connected correlation functions of network output ensembles.
Language: English
ISBN: 9798379417994
Subjects--Topical Terms: Theoretical physics.
Subjects--Index Terms: Field theory
LDR  03949nmm a22004093i 4500
001  2400425
005  20250522084123.5
006  m o d
007  cr|nu||||||||
008  251215s2023 miu||||||m |||||||eng d
020  $a 9798379417994
035  $a (MiAaPQD)AAI30419826
035  $a AAI30419826
040  $a MiAaPQD $b eng $c MiAaPQD $e rda
100 1  $a Maiti, Anindita, $e author. $3 3770411
245 12 $a A Study of Field Theories Via Neural Networks / $c Anindita Maiti.
264  1 $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2023
300  $a 1 electronic resource (229 pages)
336  $a text $b txt $2 rdacontent
337  $a computer $b c $2 rdamedia
338  $a online resource $b cr $2 rdacarrier
500  $a Source: Dissertations Abstracts International, Volume: 84-10, Section: B.
500  $a Advisors: Halverson, James. Committee members: Nelson, Brent; Ruehle, Fabian; Bao, Ning.
502  $b Ph.D. $c Northeastern University $d 2023.
520  $a [Abstract as given above.]
546  $a English
590  $a School code: 0160
650  4 $a Theoretical physics. $3 2144760
653  $a Field theory
653  $a Machine learning theory
653  $a Neural networks
653  $a Wilsonian field theory
690  $a 0753
690  $a 0800
710 2  $a Northeastern University. $b Physics. $e degree granting institution. $3 3770412
720 1  $a Halverson, James $e degree supervisor.
773 0  $t Dissertations Abstracts International $g 84-10B.
790  $a 0160
791  $a Ph.D.
792  $a 2023
856 40 $u https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30419826
Holdings: 1 item (page 1 of 1)

Barcode: W9508745
Location: Electronic resources
Circulation category: 11. Online reading
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Hold requests: 0