Architecture optimization, training convergence and network estimation robustness of a fully connected recurrent neural network.
Record Type:
Language materials, printed : Monograph/item
Title/Author:
Architecture optimization, training convergence and network estimation robustness of a fully connected recurrent neural network. / Wang, Xiaoyu.
Author:
Wang, Xiaoyu.
Description:
199 p.
Notes:
Source: Dissertation Abstracts International, Volume: 71-05, Section: B, page: 3131.
Contained By:
Dissertation Abstracts International, 71-05B.
Subject:
Engineering, Mechanical.
Online resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3402564
ISBN:
9781109748215
LDR     03543nam 2200373 4500
001     1392738
005     20110218131343.5
008     130515s2010 ||||||||||||||||| ||eng d
020     $a 9781109748215
035     $a (UMI)AAI3402564
035     $a AAI3402564
040     $a UMI $c UMI
100 1   $a Wang, Xiaoyu. $3 1280728
245 10  $a Architecture optimization, training convergence and network estimation robustness of a fully connected recurrent neural network.
300     $a 199 p.
500     $a Source: Dissertation Abstracts International, Volume: 71-05, Section: B, page: 3131.
500     $a Adviser: Yong Huang.
502     $a Thesis (Ph.D.)--Clemson University, 2010.
520     $a Recurrent neural networks (RNN) have developed rapidly in recent years. Applications of RNN can be found in system identification, optimization, image processing, pattern recognition, classification, clustering, memory association, etc.
520     $a In this study, an optimized RNN is proposed to model nonlinear dynamical systems. A fully connected RNN is developed first; it is derived from a fully forward connected neural network (FFCNN) by adding recurrent connections among its hidden neurons. In addition, a destructive structure optimization algorithm is applied, and the extended Kalman filter (EKF) is adopted as the network's training algorithm. These two algorithms work together seamlessly to generate the optimized RNN. The improved modeling performance of the optimized network comes from three parts: (1) its prototype, the FFCNN, has advantages over the multilayer perceptron network (MLP), the most widely used network, in terms of modeling accuracy and generalization ability; (2) the recurrency in the RNN makes it more capable of modeling nonlinear dynamical systems; and (3) the structure optimization algorithm further improves the RNN's generalization ability and robustness.
520     $a Performance studies of the proposed network focus on training convergence and robustness. For the training convergence study, the Lyapunov method is used to adapt some training parameters to guarantee convergence, while the maximum likelihood method is used to estimate other parameters to accelerate training. In addition, a robustness analysis is conducted to develop a robustness measure that accounts for uncertainty propagation through the RNN via the unscented transform.
520     $a Two case studies, the modeling of a benchmark nonlinear dynamical system and of tool wear progression in hard turning, are carried out to validate the developments in this dissertation.
520     $a The work detailed in this dissertation focuses on the creation of: (1) a new method to prove and guarantee the training convergence of RNN, and (2) a new method to quantify the robustness of RNN using uncertainty propagation analysis. With the proposed study, the RNN and related algorithms are developed to model nonlinear dynamical systems, which can benefit future modeling applications, such as condition monitoring studies, in terms of robustness and accuracy.
590     $a School code: 0050.
650  4  $a Engineering, Mechanical. $3 783786
650  4  $a Artificial Intelligence. $3 769149
650  4  $a Computer Science. $3 626642
690     $a 0548
690     $a 0800
690     $a 0984
710 2   $a Clemson University. $b Mechanical Engineering. $3 1023734
773 0   $t Dissertation Abstracts International $g 71-05B.
790 10  $a Huang, Yong, $e advisor
790 10  $a Gowdy, John $e committee member
790 10  $a Jalili, Nader $e committee member
790 10  $a Vahidi, Ardalan $e committee member
790     $a 0050
791     $a Ph.D.
792     $a 2010
856 40  $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3402564
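The abstract pairs a fully connected RNN with extended Kalman filter (EKF) training. The record gives none of the dissertation's actual settings, so the sketch below only illustrates the general technique: the network weights act as the EKF state, the network output as the nonlinear measurement, and the Jacobian is taken by finite differences for clarity (analytic derivatives are usual in practice). The network size, noise covariances, and toy data are all invented for the example.

```python
import numpy as np

def rnn_forward(w, u_seq, n_h):
    """Run a small fully connected RNN and return its scalar outputs.

    Illustrative weight layout: W_h (n_h x n_h) recurrent matrix,
    w_in (n_h) input weights, w_out (n_h) linear readout."""
    W_h = w[:n_h * n_h].reshape(n_h, n_h)
    w_in = w[n_h * n_h:n_h * n_h + n_h]
    w_out = w[n_h * n_h + n_h:]
    h = np.zeros(n_h)
    ys = []
    for u in u_seq:
        h = np.tanh(W_h @ h + w_in * u)   # recurrent hidden update
        ys.append(w_out @ h)              # linear readout
    return np.array(ys)

def ekf_train(u_seq, y_seq, n_h=4, epochs=30, q=1e-5, r=0.1, seed=0):
    """EKF weight estimation: the weights are the filter state and the
    network outputs are the nonlinear measurements."""
    rng = np.random.default_rng(seed)
    n_w = n_h * n_h + 2 * n_h
    w = 0.1 * rng.standard_normal(n_w)
    P = 0.1 * np.eye(n_w)                 # weight covariance
    for _ in range(epochs):
        y_pred = rnn_forward(w, u_seq, n_h)
        e = y_seq - y_pred                # innovation over the sequence
        eps = 1e-6                        # finite-difference Jacobian H
        H = np.zeros((len(u_seq), n_w))
        for j in range(n_w):
            w_p = w.copy()
            w_p[j] += eps
            H[:, j] = (rnn_forward(w_p, u_seq, n_h) - y_pred) / eps
        S = H @ P @ H.T + r * np.eye(len(u_seq))   # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
        w = w + K @ e
        P = P - K @ H @ P + q * np.eye(n_w)        # covariance update
    return w

# Fit a short sequence from a simple nonlinear map (toy data only).
u_seq = np.linspace(-1.0, 1.0, 30)
y_seq = np.sin(2.0 * u_seq) + 0.1 * np.roll(u_seq, 1)
w0 = 0.1 * np.random.default_rng(0).standard_normal(4 * 4 + 2 * 4)
err_before = np.mean((y_seq - rnn_forward(w0, u_seq, 4)) ** 2)
w_fit = ekf_train(u_seq, y_seq)
err_after = np.mean((y_seq - rnn_forward(w_fit, u_seq, 4)) ** 2)
```

The batch update per epoch is one of several common EKF training arrangements; sample-by-sample updates are equally standard.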
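The abstract also says the Lyapunov method adapts some training parameters to guarantee convergence. The dissertation's actual derivation is not in this record, so the sketch below shows the underlying idea on the simplest possible case, a linear-in-parameters model: choosing the step size from the Lyapunov condition on V = e^2 makes the error provably contract at every update. This is the classic normalized-LMS argument, used here purely as an analogy, not as the dissertation's proof.

```python
import numpy as np

def lyapunov_step_fit(X, d, mu=1.0, eps=1e-12):
    """Sequential training of a linear-in-parameters model y = w @ x.

    Take V = e^2 as the Lyapunov candidate for each update. With the
    step w <- w + eta * e * x and eta = mu / ||x||^2, the post-update
    error on the same sample is e * (1 - mu), so any 0 < mu < 2 forces
    V to decrease across every update: the learning rate is adapted
    per sample from the Lyapunov condition instead of fixed in advance."""
    w = np.zeros(X.shape[1])
    pre, post = [], []
    for x, target in zip(X, d):
        e = target - w @ x                  # error before the update
        eta = mu / (x @ x + eps)            # Lyapunov-derived step size
        w = w + eta * e * x
        pre.append(abs(e))
        post.append(abs(target - w @ x))    # error after the update
    return w, pre, post

# Noise-free toy targets so the contraction is exact.
rng = np.random.default_rng(2)
w_true = np.array([0.7, -1.2, 0.4])
X = rng.standard_normal((200, 3))
d = X @ w_true
w_hat, pre, post = lyapunov_step_fit(X, d, mu=1.0)
```

For 0 < mu < 2 the deviation from the true weights is also non-increasing, which is the trajectory-level version of the same Lyapunov statement.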
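Finally, the robustness measure in the abstract rests on propagating uncertainty through the network with the unscented transform. As a minimal sketch, the code below pushes a Gaussian input through a single tanh neuron standing in for the network (the weights, input moments, and UT parameters are invented for the example) and checks the sigma-point estimate against a Monte Carlo reference.

```python
import numpy as np

def unscented_propagate(f, mean, cov, alpha=1.0, beta=2.0, kappa=1.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear map f via
    the unscented transform: deterministic sigma points are pushed
    through f and re-averaged to estimate the output mean/variance."""
    n = mean.shape[0]
    lam = alpha ** 2 * (n + kappa) - n
    # Sigma points: the mean plus/minus scaled square-root columns.
    sqrt_cov = np.linalg.cholesky((n + lam) * cov)
    sigma = np.vstack([mean, mean + sqrt_cov.T, mean - sqrt_cov.T])
    # Standard UT weights for the mean and covariance estimates.
    w_m = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    w_c = w_m.copy()
    w_m[0] = lam / (n + lam)
    w_c[0] = lam / (n + lam) + (1.0 - alpha ** 2 + beta)
    ys = np.array([f(s) for s in sigma])
    y_mean = w_m @ ys
    y_var = w_c @ (ys - y_mean) ** 2
    return y_mean, y_var

# A single tanh neuron with fixed (invented) weights stands in for
# "input uncertainty propagated through the network".
w = np.array([0.8, -0.5])
f = lambda x: np.tanh(w @ x)

mean = np.array([0.3, -0.2])
cov = np.diag([0.05, 0.02])
ut_mean, ut_var = unscented_propagate(f, mean, cov)

# Monte Carlo reference for comparison.
rng = np.random.default_rng(1)
samples = rng.multivariate_normal(mean, cov, size=200_000)
mc = np.tanh(samples @ w)
```

The output variance estimated this way is the kind of quantity a robustness measure can be built on: it reports how much input uncertainty survives the nonlinearity.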
Items (1 record):
Inventory Number: W9155877
Location Name: 電子資源 (Electronic Resources)
Item Class: 11.線上閱覽_V (online reading)
Material Type: 電子書 (e-book)
Call Number: EB
Usage Class: 一般使用 (Normal)
Loan Status: On shelf
No. of reservations: 0