Information theory and learning: A physical approach.
Record Type:
Electronic resources : Monograph/item
Title/Author:
Information theory and learning: A physical approach.
Author:
Nemenman, Ilya Mark.
Description:
131 p.
Notes:
Source: Dissertation Abstracts International, Volume: 61-08, Section: B, page: 4202.
Contained By:
Dissertation Abstracts International, 61-08B.
Subject:
Physics, General.
Online resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=9981559
ISBN:
0599880384
Dissertation Note:
Thesis (Ph.D.)--Princeton University, 2000.
Abstract:
We try to establish a unified information theoretic approach to learning and to explore some of its applications. First, we define predictive information as the mutual information between the past and the future of a time series, discuss its behavior as a function of the length of the series, and explain how other quantities of interest studied previously in learning theory---as well as in dynamical systems and statistical mechanics---emerge from this universally definable concept. We then prove that predictive information provides the unique measure for the complexity of dynamics underlying the time series and show that there are classes of models characterized by power-law growth of the predictive information that are qualitatively more complex than any of the systems that have been investigated before. Further, we investigate numerically the learning of a nonparametric probability density, which is an example of a problem with power-law complexity, and show that the proper Bayesian formulation of this problem provides for the 'Occam' factors that punish overly complex models and thus allow one to learn not only a solution within a specific model class, but also the class itself using the data only and with very few a priori assumptions. We study a possible information theoretic method that regularizes the learning of an undersampled discrete variable, and show that learning in such a setup goes through stages of very different complexities. Finally, we discuss how all of these ideas may be useful in various problems in physics, statistics, and, most importantly, biology.
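The central quantity in this abstract, predictive information, is the mutual information between a stretch of the past and the future of the time series. As a sketch in our own notation (not quoted from the dissertation), for a past window of duration T:

    I_{\mathrm{pred}}(T)
      = I\!\left(x_{\mathrm{past}}(T);\, x_{\mathrm{future}}\right)
      = \left\langle \log_2 \frac{P(x_{\mathrm{past}},\, x_{\mathrm{future}})}
                                 {P(x_{\mathrm{past}})\, P(x_{\mathrm{future}})} \right\rangle

The "power-law growth" mentioned above then refers to model classes for which I_pred(T) keeps growing as a power of T, rather than saturating or growing only logarithmically.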
MARC Record:
LDR  02461nmm 2200277 4500
001  1857414
005  20041123145129.5
008  130614s2000 eng d
020    $a 0599880384
035    $a (UnM)AAI9981559
035    $a AAI9981559
040    $a UnM $c UnM
100 1  $a Nemenman, Ilya Mark. $3 1945133
245 10 $a Information theory and learning: A physical approach.
300    $a 131 p.
500    $a Source: Dissertation Abstracts International, Volume: 61-08, Section: B, page: 4202.
500    $a Adviser: William Bialek.
502    $a Thesis (Ph.D.)--Princeton University, 2000.
520    $a We try to establish a unified information theoretic approach to learning and to explore some of its applications. First, we define predictive information as the mutual information between the past and the future of a time series, discuss its behavior as a function of the length of the series, and explain how other quantities of interest studied previously in learning theory---as well as in dynamical systems and statistical mechanics---emerge from this universally definable concept. We then prove that predictive information provides the unique measure for the complexity of dynamics underlying the time series and show that there are classes of models characterized by power-law growth of the predictive information that are qualitatively more complex than any of the systems that have been investigated before. Further, we investigate numerically the learning of a nonparametric probability density, which is an example of a problem with power-law complexity, and show that the proper Bayesian formulation of this problem provides for the 'Occam' factors that punish overly complex models and thus allow one to learn not only a solution within a specific model class, but also the class itself using the data only and with very few a priori assumptions. We study a possible information theoretic method that regularizes the learning of an undersampled discrete variable, and show that learning in such a setup goes through stages of very different complexities. Finally, we discuss how all of these ideas may be useful in various problems in physics, statistics, and, most importantly, biology.
590    $a School code: 0181.
650  4 $a Physics, General. $3 1018488
650  4 $a Statistics. $3 517247
690    $a 0605
690    $a 0463
710 20 $a Princeton University. $3 645579
773 0  $t Dissertation Abstracts International $g 61-08B.
790 10 $a Bialek, William, $e advisor
790    $a 0181
791    $a Ph.D.
792    $a 2000
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=9981559
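For readers unfamiliar with the MARC display above: each non-control line consists of a three-digit tag, up to two indicator characters, and a list of $-prefixed subfields. A minimal, self-contained sketch (ours, not part of the catalogue software) of how such a display line decomposes:

import re  # not strictly needed; shown for clarity that parsing is plain string work

def parse_marc_line(line: str):
    """Split a display line like '650  4 $a Physics, General. $3 1018488'
    into (tag, indicators, [(code, value), ...]).

    For control fields (LDR, 001-008) there are no subfields, and the
    remainder of the line is returned in place of the indicators."""
    head, sep, rest = line.partition('$')
    tag, _, indicators = head.strip().partition(' ')
    subfields = []
    if sep:  # only data fields carry $-prefixed subfields
        for chunk in (sep + rest).split('$')[1:]:
            code, value = chunk[0], chunk[1:].strip()
            subfields.append((code, value))
    return tag, indicators.strip(), subfields

print(parse_marc_line('650  4 $a Physics, General. $3 1018488'))
# -> ('650', '4', [('a', 'Physics, General.'), ('3', '1018488')])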
Items
1 record • Page 1
Inventory Number: W9176114
Location Name: Electronic resources (電子資源)
Item Class: 11. Online reading (11.線上閱覽_V)
Material type: E-book (電子書)
Call number: EB
Usage Class: Normal (一般使用)
Loan Status: On shelf (在架)
No. of reservations: 0
Opac note: (none)
Attachments: (none)