Stacked graphical learning for text mining.
Record type: Bibliographic, language material, printed : Monograph/item
Title/Author: Stacked graphical learning for text mining. / Kou, Zhenzhen.
Author: Kou, Zhenzhen.
Extent: 130 p.
Note: Adviser: William W. Cohen.
Contained by: Dissertation Abstracts International, 69-02B.
Subject: Artificial Intelligence.
Electronic resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3303001
ISBN: 9780549491903
LDR
:04029nam 2200313 a 45
001
959023
005
20110704
008
110704s2008 ||||||||||||||||| ||eng d
020
$a
9780549491903
035
$a
(UMI)AAI3303001
035
$a
AAI3303001
040
$a
UMI
$c
UMI
100
1
$a
Kou, Zhenzhen.
$3
1282491
245
1 0
$a
Stacked graphical learning for text mining.
300
$a
130 p.
500
$a
Adviser: William W. Cohen.
500
$a
Source: Dissertation Abstracts International, Volume: 69-02, Section: B, page: 1091.
502
$a
Thesis (Ph.D.)--Carnegie Mellon University, 2008.
520
$a
In reality, there are many relational datasets in which both features of instances and the relationships among the instances are recorded, such as hyperlinked web pages, scientific literature with citations, and social networks. Collective classification has been widely used to classify a group of related instances simultaneously. Recently there have been several studies on statistical relational learning for collective classification, including relational dependency networks, relational Markov networks, and Markov logic networks. In statistical relational learning models, collective classification is usually formulated as an inference problem over graphical models. Hence the existing collective classification methods are expensive due to the iterative inference procedure required for general graphical models. Procedures that learn collective classifiers are also expensive, especially if they are based on iterative optimization of an expensive iterative inference procedure. Our goal is to develop an efficient model for collective classification for relational datasets.
520
$a
In my thesis, I have studied a learning scheme called stacked graphical learning. In stacked graphical learning, a base learner is augmented by providing the predicted labels of relevant instances. That is, first, a base learner is applied to the training data to make predictions using a cross-validation-like technique. Then we expand the features by adding the predictions on relevant examples to the feature vector. Finally, the base learner is applied to the expanded feature set to make the final predictions. The intuition behind stacked graphical learning is that combining the predictions on the neighbors with local features can capture the dependencies among examples; hence we can rely on the base learner to classify the instances using the expanded feature set.
520
$a
We have applied stacked graphical learning to many real problems, including collective classification, sequential partitioning, information extraction, and multi-task problems in an information extraction system. Stacked graphical learning has been demonstrated to achieve performance competitive with state-of-the-art relational graphical models with much less inference time.
520
$a
In addition to exploring many applications of stacked graphical learning to real problems, we formally analyze an idealized version of the algorithm, which can be formulated as an inhomogeneous Gibbs sampling process with parameters learned in a greedy manner. We prove convergence of this idealized version of stacking and discuss the conditions under which the stacked graphical learning algorithm is nearly identical to the idealized version.
520
$a
We also studied an online version of stacked graphical learning, which integrates a single-pass online learning algorithm, Modified Balanced Winnow, with stacked learning. Online stacked graphical learning can save training time and can handle large streaming datasets with minimal memory overhead. We analyze the time and memory cost of online stacked graphical learning and apply it to several real problems.
590
$a
School code: 0041.
650
4
$a
Artificial Intelligence.
$3
769149
690
$a
0800
710
2
$a
Carnegie Mellon University.
$3
1018096
773
0
$t
Dissertation Abstracts International
$g
69-02B.
790
$a
0041
790
1 0
$a
Cohen, William W.,
$e
advisor
791
$a
Ph.D.
792
$a
2008
856
4 0
$u
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3303001
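The stacking procedure described in the 520 abstract fields (base-learner predictions via a cross-validation-like step, feature expansion with neighbors' predicted labels, retraining on the expanded features) can be sketched in code. This is a minimal illustrative sketch assuming scikit-learn is available; the helper names (`expand_features`, `fit_stacked`) and the mean-of-neighbor-predictions aggregation are assumptions for illustration, not the thesis's actual implementation:

```python
# Sketch of stacked graphical learning: augment a base learner with the
# predicted labels of each instance's relational neighbors.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

def expand_features(X, neighbors, preds):
    """Append the mean predicted label of each instance's neighbors.

    `neighbors[i]` is a list of indices of instances related to instance i
    (e.g. linked pages or cited papers); `preds` holds predicted labels.
    """
    agg = np.array([preds[nbrs].mean() if len(nbrs) else 0.0
                    for nbrs in neighbors])
    return np.hstack([X, agg[:, None]])

def fit_stacked(X, y, neighbors, base=LogisticRegression):
    # Step 1: cross-validation-like predictions from the base learner,
    # so the stacked features are not contaminated by training labels.
    f0 = base(max_iter=1000).fit(X, y)
    preds = cross_val_predict(base(max_iter=1000), X, y, cv=5)
    # Step 2: expand the feature vectors with neighbors' predicted labels.
    X_ext = expand_features(X, neighbors, preds.astype(float))
    # Step 3: train the base learner again on the expanded feature set.
    f1 = base(max_iter=1000).fit(X_ext, y)
    return f0, f1

def predict_stacked(f0, f1, X, neighbors):
    # Inference is a single feed-forward pass, not iterative: predict with
    # the first model, expand features, predict with the stacked model.
    preds = f0.predict(X).astype(float)
    return f1.predict(expand_features(X, neighbors, preds))
```

Inference here costs two passes of the base learner rather than iterative inference over a graphical model, which is the efficiency argument the abstract makes.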
Holdings:
Barcode: W9122488
Location: Electronic Resources (電子資源)
Circulation category: 11.線上閱覽_V (online reading)
Material type: E-book
Call number: EB W9122488
Use type: Normal
Loan status: On shelf
Holds: 0