Predictive alternatives in Bayesian model selection.
Record Type: Language materials, printed : Monograph/item
Title/Author: Predictive alternatives in Bayesian model selection.
Author: Womack, Andrew.
Description: 93 p.
Notes: Source: Dissertation Abstracts International, Volume: 72-07, Section: B, page: .
Contained By: Dissertation Abstracts International, 72-07B.
Subject: Statistics.
Online resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3450874
ISBN: 9781124602745
Thesis (Ph.D.)--Washington University in St. Louis, 2011.
Model comparison and hypothesis testing are integral parts of all data analyses. In this thesis, I present two new families of information criteria that can be used to perform model comparison. In Chapter 1, I review the necessary background to motivate the thesis. Of particular interest is the role of priors for estimation and model comparison, as well as the role that information theory can play in the latter. As we will see, many existing forms of model comparison can be viewed in an information-theoretic manner, which motivates defining new families of criteria. In Chapter 2, I present the two new criteria and discuss their properties. The first criterion is based purely on posterior predictive densities and Kullback-Leibler divergences and decomposes into terms that describe the fit and complexity of the model. In this manner, it behaves similarly to popular criteria such as the AIC or the DIC. I then present the second family of criteria, which modify the marginal distribution by an appropriate Rényi divergence. This modification of the marginal allows the investigator to use priors that reflect vague prior knowledge without suffering the paradoxes that can arise from such priors. One particularly nice aspect of this family of criteria is that it subsumes the Bayes factor as a special case and produces an infinite family of criteria that are asymptotically equivalent to the Bayes factor. In this manner, the criteria can be modified to achieve certain goals in small samples while maintaining asymptotic consistency. I conclude the thesis with a short discussion of the computational difficulties that arise when using the criteria and explore possible ways to overcome them.
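The thesis's new criteria are not reproduced in this record, but the standard quantities the abstract compares against can be sketched. The snippet below computes the AIC, the BIC, and the usual BIC-based large-sample approximation to the log Bayes factor for two nested Gaussian models; the data and model choices are hypothetical illustrations, not taken from the thesis.

```python
import math

def gaussian_loglik(residuals):
    """Maximized log-likelihood of i.i.d. N(0, sigma^2) residuals at the MLE of sigma^2."""
    n = len(residuals)
    sigma2 = sum(r * r for r in residuals) / n  # MLE of the variance
    return -0.5 * n * (math.log(2 * math.pi * sigma2) + 1)

# Toy data with a mild linear trend (hypothetical numbers).
x = [0, 1, 2, 3, 4, 5, 6, 7]
y = [0.1, 0.9, 2.2, 2.8, 4.1, 5.2, 5.8, 7.1]
n = len(y)

# Model 0: constant mean.  Parameters: mean, variance -> k = 2.
mean_y = sum(y) / n
res0 = [yi - mean_y for yi in y]

# Model 1: simple linear regression.  Parameters: intercept, slope, variance -> k = 3.
mean_x = sum(x) / n
sxx = sum((xi - mean_x) ** 2 for xi in x)
sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
slope = sxy / sxx
intercept = mean_y - slope * mean_x
res1 = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]

def aic(loglik, k):
    # AIC = 2k - 2 log L: a fit term penalized by model complexity.
    return 2 * k - 2 * loglik

def bic(loglik, k):
    # BIC = k log n - 2 log L: Schwarz's large-sample approximation
    # to -2 times the log marginal likelihood.
    return k * math.log(n) - 2 * loglik

ll0, ll1 = gaussian_loglik(res0), gaussian_loglik(res1)
print("AIC:", aic(ll0, 2), aic(ll1, 3))
print("BIC:", bic(ll0, 2), bic(ll1, 3))
# Half the BIC difference approximates the log Bayes factor for model 1 over model 0.
print("approx log BF (model 1 vs 0):", 0.5 * (bic(ll0, 2) - bic(ll1, 3)))
```

With the trending toy data above, the regression model wins on both criteria, and the approximate log Bayes factor is positive in its favour.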
ISBN: 9781124602745
Subjects--Topical Terms: Statistics.
LDR      02830nam 2200325 4500
001      1402522
005      20111102140021.5
008      130515s2011 ||||||||||||||||| ||eng d
020      $a 9781124602745
035      $a (UMI)AAI3450874
035      $a AAI3450874
040      $a UMI $c UMI
100 1    $a Womack, Andrew. $3 1681716
245 1 0  $a Predictive alternatives in Bayesian model selection.
300      $a 93 p.
500      $a Source: Dissertation Abstracts International, Volume: 72-07, Section: B, page: .
500      $a Adviser: Jefferson M. Gill.
502      $a Thesis (Ph.D.)--Washington University in St. Louis, 2011.
520      $a Model comparison and hypothesis testing are integral parts of all data analyses. In this thesis, I present two new families of information criteria that can be used to perform model comparison. In Chapter 1, I review the necessary background to motivate the thesis. Of particular interest is the role of priors for estimation and model comparison, as well as the role that information theory can play in the latter. As we will see, many existing forms of model comparison can be viewed in an information-theoretic manner, which motivates defining new families of criteria. In Chapter 2, I present the two new criteria and discuss their properties. The first criterion is based purely on posterior predictive densities and Kullback-Leibler divergences and decomposes into terms that describe the fit and complexity of the model. In this manner, it behaves similarly to popular criteria such as the AIC or the DIC. I then present the second family of criteria, which modify the marginal distribution by an appropriate Rényi divergence. This modification of the marginal allows the investigator to use priors that reflect vague prior knowledge without suffering the paradoxes that can arise from such priors. One particularly nice aspect of this family of criteria is that it subsumes the Bayes factor as a special case and produces an infinite family of criteria that are asymptotically equivalent to the Bayes factor. In this manner, the criteria can be modified to achieve certain goals in small samples while maintaining asymptotic consistency. I conclude the thesis with a short discussion of the computational difficulties that arise when using the criteria and explore possible ways to overcome them.
590      $a School code: 0252.
650  4   $a Statistics. $3 517247
690      $a 0463
710 2    $a Washington University in St. Louis. $b Mathematics. $3 1681717
773 0    $t Dissertation Abstracts International $g 72-07B.
790 1 0  $a Gill, Jefferson M., $e advisor
790 1 0  $a Chib, Siddhartha $e committee member
790 1 0  $a Greenberg, Edward $e committee member
790 1 0  $a Lin, Nan $e committee member
790 1 0  $a Spitznagel, Edward $e committee member
790 1 0  $a Wickerhauser, Victor $e committee member
790      $a 0252
791      $a Ph.D.
792      $a 2011
856 4 0  $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3450874
Items (1 record):
Inventory Number: W9165661
Location Name: 電子資源 (Electronic Resources)
Item Class: 11.線上閱覽_V (Online Reading)
Material Type: 電子書 (E-book)
Call Number: EB
Usage Class: 一般使用 (Normal)
Loan Status: On shelf
No. of Reservations: 0