Multi-step Prediction With Neural Networks.
Record type: Bibliographic - electronic resource : Monograph/item
Title / Author: Multi-step Prediction With Neural Networks. / Kelley, Joseph.
Author: Kelley, Joseph.
Publisher: Ann Arbor : ProQuest Dissertations & Theses, 2024
Extent: 146 p.
Note: Source: Dissertations Abstracts International, Volume: 86-01, Section: B.
Contained by: Dissertations Abstracts International, 86-01B.
Subject: Electrical engineering.
Electronic resource: https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=31142080
ISBN: 9798383206300
Thesis (Ph.D.)--Oklahoma State University, 2024.
This paper demonstrates methods for modeling nonlinear dynamic systems to perform multi-step prediction. These methods are suitable for the design of model predictive controllers (MPC), forecasting power plant loads, and forecasting products sold in stores. There are numerous techniques for modeling nonlinear systems, but in this paper we concentrate on the iterative method [1] [2] [3] for multi-step prediction, which we refer to as the TMS-Iterative method. In the iterative method, a single model is trained to minimize one-step prediction errors, and the trained model is then used for multi-step prediction. In the TMS-Iterative method, by contrast, a single model is trained to minimize the multi-step prediction errors directly. In [4] and [5] it was suggested that the TMS-Iterative method produces more robust long-term predictions; we found this to be especially true when modeling nonlinear dynamic systems.

The paper comprises four sections. Each section presents a different recurrent neural network (RNN) architecture for multi-step prediction using the TMS-Iterative method: (1) nonlinear input-output models, (2) nonlinear state-space models, (3) sequence-to-sequence (Seq2Seq) models, and (4) seasonal nonlinear models. In the first section we provide an extensive comparison of the Nonlinear AutoRegressive model with eXogenous inputs (NARX) and the Nonlinear AutoRegressive Moving Average model with eXogenous inputs (NARMAX) for multi-step prediction. We discuss in detail how NARMAX models provide significantly improved forecasts over shorter prediction horizons, while the predictions converge as the horizon increases. In the second section we introduce the Nonlinear Innovation Form (NIF) and the Nonlinear Output Form (NOF), which are nonlinear state-space models. Both the NIF and the NOF are based on the Kalman filter, which estimates the hidden states in an RNN model. By estimating the hidden states we can improve both short-term and long-term forecasting accuracy.

In the third section we demonstrate that Seq2Seq models are fixed-weight adaptive and can learn time-varying dynamics. Seq2Seq models were originally introduced for natural language processing; in this paper we extend them to the modeling of time-varying nonlinear systems. In the fourth section the Periodic Nonlinear AutoRegressive (PNAR) model is introduced. The PNAR model is trained to learn seasonal variations in nonlinear dynamic systems, that is, patterns that recur at consistent time intervals in a predictable fashion. In each section, the RNN models are compared using simulated and experimental data.
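The contrast the abstract draws between the iterative (one-step) and TMS-Iterative (multi-step) training objectives can be illustrated with a toy sketch. This is not the dissertation's code: the scalar tanh model, the simulated data, and the brute-force grid fit are all invented here purely for illustration.

```python
import numpy as np

# Toy nonlinear AR system: y[t] = tanh(w_true * y[t-1]).
# "Iterative": fit w by minimizing one-step errors, then iterate forward.
# "TMS-Iterative": fit w by minimizing the k-step prediction error directly.

def simulate(w_true, y0, n):
    """Generate n samples from the toy system."""
    y = [y0]
    for _ in range(n - 1):
        y.append(np.tanh(w_true * y[-1]))
    return np.array(y)

def multistep(w, y0, k):
    """Free-run the model k steps ahead from initial condition y0."""
    y = y0
    for _ in range(k):
        y = np.tanh(w * y)
    return y

def one_step_loss(w, y):
    """Mean squared one-step prediction error (iterative objective)."""
    pred = np.tanh(w * y[:-1])
    return np.mean((y[1:] - pred) ** 2)

def k_step_loss(w, y, k):
    """Mean squared k-step prediction error (TMS-Iterative objective)."""
    errs = [(y[t + k] - multistep(w, y[t], k)) ** 2
            for t in range(len(y) - k)]
    return np.mean(errs)

def fit(loss, w_grid):
    """Brute-force 1-D fit: pick the grid weight with minimum loss."""
    return w_grid[np.argmin([loss(w) for w in w_grid])]

y = simulate(w_true=1.5, y0=0.3, n=200)
grid = np.linspace(0.5, 2.5, 401)
w_one = fit(lambda w: one_step_loss(w, y), grid)        # one-step objective
w_tms = fit(lambda w: k_step_loss(w, y, k=5), grid)     # 5-step objective
print(w_one, w_tms)  # noiseless data: both should land near w_true = 1.5
```

On this noiseless toy system both objectives recover the true weight; the dissertation's point is that with noise and model mismatch the two objectives generally select different parameters, with the multi-step objective favoring long-horizon accuracy.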
Subjects--Topical Terms: Electrical engineering.
Subjects--Index Terms: Multi-step prediction
MARC record:
LDR    03818nmm a2200421 4500
001    2400113
005    20240924101907.5
006    m o d
007    cr#unu||||||||
008    251215s2024 ||||||||||||||||| ||eng d
020    $a 9798383206300
035    $a (MiAaPQ)AAI31142080
035    $a AAI31142080
040    $a MiAaPQ $c MiAaPQ
100 1  $a Kelley, Joseph. $3 3770082
245 10 $a Multi-step Prediction With Neural Networks.
260  1 $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2024
300    $a 146 p.
500    $a Source: Dissertations Abstracts International, Volume: 86-01, Section: B.
500    $a Advisor: Hagan, Martin.
502    $a Thesis (Ph.D.)--Oklahoma State University, 2024.
590    $a School code: 0664.
650  4 $a Electrical engineering. $3 649834
650  4 $a Computer engineering. $3 621879
650  4 $a Computer science. $3 523869
650  4 $a Systems science. $3 3168411
650  4 $a Information technology. $3 532993
653    $a Multi-step prediction
653    $a NARMAX models
653    $a NARX
653    $a Recurrent neural network
653    $a Seq2Seq models
653    $a Time-series modeling
690    $a 0544
690    $a 0489
690    $a 0984
690    $a 0464
690    $a 0790
710 2  $a Oklahoma State University. $b Electrical Engineering. $3 1030588
773 0  $t Dissertations Abstracts International $g 86-01B.
790    $a 0664
791    $a Ph.D.
792    $a 2024
793    $a English
856 40 $u https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=31142080
Holdings (1 item):
Barcode: W9508433
Location: Electronic resources
Circulation category: 11. Online reading (線上閱覽_V)
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Holds: 0