Some perspectives of sparse statistical modeling.
Record Type: Electronic resources : Monograph/item
Title/Author: Some perspectives of sparse statistical modeling.
Author: Zou, Hui.
Description: 101 p.
Notes: Source: Dissertation Abstracts International, Volume: 66-08, Section: B, page: 4310.
Contained By: Dissertation Abstracts International, 66-08B.
Subject: Statistics.
Online resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3186437
ISBN: 9780542287398
LDR    03016nmm 2200289 4500
001    1827998
005    20061228142248.5
008    130610s2005 eng d
020    $a 9780542287398
035    $a (UnM)AAI3186437
035    $a AAI3186437
040    $a UnM $c UnM
100 1  $a Zou, Hui. $3 1916910
245 10 $a Some perspectives of sparse statistical modeling.
300    $a 101 p.
500    $a Source: Dissertation Abstracts International, Volume: 66-08, Section: B, page: 4310.
500    $a Adviser: Trevor Hastie.
502    $a Thesis (Ph.D.)--Stanford University, 2005.
520    $a In this thesis we develop some new sparse modeling techniques and related theory. We first point out the fundamental drawbacks of the lasso in some scenarios: (1) the number of predictors (greatly) exceeds the number of observations; (2) the predictors are highly correlated and form "groups". A typical example where these scenarios naturally occur is the gene selection problem in microarray analysis. We then propose the elastic net, a new regularization and variable selection method, to improve upon the lasso. In this domain we show that the elastic net often outperforms the lasso, while enjoying a similar sparsity of representation. In addition, the elastic net encourages a grouping effect, where strongly correlated predictors tend to be in or out of the model together. The elastic net is particularly useful when the number of predictors is much bigger than the number of samples. We also propose an algorithm called LARS-EN for efficiently computing the entire elastic-net regularization path, much like the LARS algorithm does for the lasso.
520    $a In the second part of the thesis, we propose a principled approach called SPCA for modifying PCA based on a novel sparse PCA criterion, in which an elastic net constraint is used to produce sparse loadings. To solve the optimization problem in SPCA, we consider an alternating algorithm which iterates between the elastic net and the reduced-rank Procrustes rotation. SPCA allows flexible control of the sparse structure of the resulting loadings and has the ability to identify important variables.
520    $a In the third part of the thesis, we study the degrees of freedom of the lasso in the framework of SURE theory. We prove that the number of non-zero coefficients is an unbiased estimate for the degrees of freedom of the lasso---a conclusion requiring no special assumption on the predictors. Our analysis also provides mathematical support for a related conjecture by Efron et al. (2004). As an application, various model selection criteria---Cp, AIC and BIC---are defined, which, along with the LARS algorithm, provide a principled and efficient approach to obtaining the optimal lasso fit.
590    $a School code: 0212.
650  4 $a Statistics. $3 517247
690    $a 0463
710 20 $a Stanford University. $3 754827
773 0  $t Dissertation Abstracts International $g 66-08B.
790 10 $a Hastie, Trevor, $e advisor
790    $a 0212
791    $a Ph.D.
792    $a 2005
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3186437
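The abstracts above describe the elastic net penalty (a lasso term plus a ridge term that induces the grouping effect) and the result that the number of non-zero coefficients is an unbiased estimate of the lasso's degrees of freedom. As an illustration only — this is a naive NumPy sketch, not code from the thesis, and the toy data and all names are invented — a minimal coordinate-descent solver for the elastic net criterion (1/2n)||y − Xb||² + λ₁||b||₁ + (λ₂/2)||b||²:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def elastic_net_cd(X, y, lam1, lam2, n_iter=200):
    """Naive coordinate descent for (1/2n)||y - Xb||^2 + lam1*||b||_1 + (lam2/2)*||b||^2.
    lam1 drives sparsity (the lasso part); lam2 > 0 adds the ridge part
    responsible for the grouping effect described in the abstract."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]        # partial residual without predictor j
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam1) / (col_sq[j] + lam2)
    return b

# Toy data: the first three predictors form a highly correlated "group".
rng = np.random.default_rng(0)
n, p = 100, 10
z = rng.normal(size=(n, 1))
X = np.hstack([z + 0.01 * rng.normal(size=(n, 3)),
               rng.normal(size=(n, p - 3))])
y = X[:, :3].sum(axis=1) + 0.1 * rng.normal(size=n)

b_lasso = elastic_net_cd(X, y, lam1=0.1, lam2=0.0)   # lam2 = 0 recovers the lasso
b_enet = elastic_net_cd(X, y, lam1=0.1, lam2=1.0)    # elastic net

# Per the third abstract, the number of non-zero lasso coefficients is an
# unbiased estimate of the lasso's degrees of freedom.
print("lasso nonzeros (df estimate):", np.count_nonzero(b_lasso))
print("elastic net nonzeros:", np.count_nonzero(b_enet))
```

With λ₂ > 0 the three correlated columns tend to receive similar coefficients (the grouping effect), whereas the pure lasso may split the group's weight arbitrarily among near-identical predictors. The thesis's LARS-EN algorithm computes the whole regularization path far more efficiently than rerunning a solver like this over a grid of penalties.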
Items (1 record):
Inventory Number: W9218861
Location: 電子資源 (Electronic resources)
Item Class: 11.線上閱覽_V (Online reading)
Material Type: 電子書 (E-book)
Call Number: EB
Usage Class: 一般使用 (Normal)
Loan Status: On shelf
No. of Reservations: 0