Enhancing Transformer-Based Architectures With Temporal Hierarchies for Time-Series Forecasting.
Record Type:
Electronic resources : Monograph/item
Title/Author:
Enhancing Transformer-Based Architectures With Temporal Hierarchies for Time-Series Forecasting.
Author:
Bachyrycz, Emma.
Published:
Ann Arbor : ProQuest Dissertations & Theses, 2024.
Description:
108 p.
Notes:
Source: Masters Abstracts International, Volume: 85-12.
Contained By:
Masters Abstracts International, 85-12.
Subject:
Systems science.
Online resource:
https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=31244375
ISBN:
9798383099575
Bachyrycz, Emma.
Enhancing Transformer-Based Architectures With Temporal Hierarchies for Time-Series Forecasting.
- Ann Arbor : ProQuest Dissertations & Theses, 2024 - 108 p.
Source: Masters Abstracts International, Volume: 85-12.
Thesis (M.S.)--State University of New York at Binghamton, 2024.
This research proposes an enhanced transformer model for time-series forecasting that aligns the various levels of organizational decision-making. Transformer models were initially proposed in 2017 for natural language processing and have gained popularity since, notably with the release of ChatGPT. The architecture's distinguishing feature is its parallel multi-head attention mechanism, which addresses the shortcomings of recurrent neural networks, a popular deep learning model. Transformer mechanisms can handle large datasets without experiencing the vanishing gradient problem, aligning with the rise of the Industry 4.0 revolution and Big Data. In this research we explore the opportunities of transformer models in time-series forecasting, offering enhancements using temporal hierarchies. This involves aggregating the data at different temporal scales, such as daily, monthly, and yearly, and incorporating the hierarchical levels into the model's framework. By training the model on recurrent structures at different frequencies, our preliminary results support accurate, dynamic, and robust forecasting that outperforms alternatives. The various temporal hierarchies represent the levels of decision-making within an organization. The monthly dataset aligns with operational decisions frequently made by lower management. The quarterly dataset corresponds to the tactical level, which incorporates the medium-term goals of a company and is handled by middle management. Finally, the yearly dataset serves the strategic-level decisions made by executives. The results show that the model combining the monthly, quarterly, and yearly temporal hierarchies performed most accurately, exceeding the benchmark models. With three attention heads and a forecasting horizon of two years, the model achieved an MSE of 1.741, an sMAPE of 0.296%, and an RMSE of 1.319.
Under the same forecasting parameters, the second-highest-performing model is the forecast combining the operational and tactical levels, i.e., monthly and quarterly data. This model achieved an MSE of 1.885, an sMAPE of 0.329%, and an RMSE of 1.373.
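The abstract's temporal-hierarchy construction and its reported error metrics (MSE, sMAPE, RMSE) can be sketched roughly as below. This is a minimal illustration, not the thesis's implementation: the dataset is random stand-in data, aggregation by non-overlapping sums is an assumption, and sMAPE has several definitions in the literature, so the scaling shown may differ from the one used in the thesis.

```python
import numpy as np

# Hypothetical monthly series (8 years); the actual dataset is not
# identified in this record, so random data stands in for it.
rng = np.random.default_rng(0)
monthly = rng.normal(100.0, 5.0, 96)

# Build the temporal hierarchy by non-overlapping aggregation (sums assumed):
# monthly -> operational, quarterly -> tactical, yearly -> strategic.
quarterly = monthly.reshape(-1, 3).sum(axis=1)   # 32 quarters
yearly = monthly.reshape(-1, 12).sum(axis=1)     # 8 years

def mse(y, yhat):
    """Mean squared error."""
    return float(np.mean((y - yhat) ** 2))

def rmse(y, yhat):
    """Root mean squared error."""
    return float(np.sqrt(mse(y, yhat)))

def smape(y, yhat):
    """One common symmetric MAPE definition, in percent; the thesis may
    use a different scaling."""
    return float(100.0 * np.mean(2.0 * np.abs(yhat - y)
                                 / (np.abs(y) + np.abs(yhat))))
```

Each aggregated series sums the same underlying observations, so the quarterly and yearly totals match the monthly total; a hierarchy-aware model can exploit that coherence constraint across levels.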
ISBN: 9798383099575
Subjects--Topical Terms:
3168411
Systems science.
Subjects--Index Terms:
Decision-making
LDR    03235nmm a2200385 4500
001    2401938
005    20241022111611.5
006    m o d
007    cr#unu||||||||
008    251215s2024 ||||||||||||||||| ||eng d
020    $a 9798383099575
035    $a (MiAaPQ)AAI31244375
035    $a AAI31244375
035    $a 2401938
040    $a MiAaPQ $c MiAaPQ
100 1  $a Bachyrycz, Emma. $3 3772156
245 10 $a Enhancing Transformer-Based Architectures With Temporal Hierarchies for Time-Series Forecasting.
260  1 $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2024
300    $a 108 p.
500    $a Source: Masters Abstracts International, Volume: 85-12.
500    $a Advisor: Wang, Yong.
502    $a Thesis (M.S.)--State University of New York at Binghamton, 2024.
520    $a This research proposes an enhanced transformer model for time-series forecasting that aligns the various levels of organizational decision-making. Transformer models were initially proposed in 2017 for natural language processing and have gained popularity since, notably with the release of ChatGPT. The architecture's distinguishing feature is its parallel multi-head attention mechanism, which addresses the shortcomings of recurrent neural networks, a popular deep learning model. Transformer mechanisms can handle large datasets without experiencing the vanishing gradient problem, aligning with the rise of the Industry 4.0 revolution and Big Data. In this research we explore the opportunities of transformer models in time-series forecasting, offering enhancements using temporal hierarchies. This involves aggregating the data at different temporal scales, such as daily, monthly, and yearly, and incorporating the hierarchical levels into the model's framework. By training the model on recurrent structures at different frequencies, our preliminary results support accurate, dynamic, and robust forecasting that outperforms alternatives. The various temporal hierarchies represent the levels of decision-making within an organization. The monthly dataset aligns with operational decisions frequently made by lower management. The quarterly dataset corresponds to the tactical level, which incorporates the medium-term goals of a company and is handled by middle management. Finally, the yearly dataset serves the strategic-level decisions made by executives. The results show that the model combining the monthly, quarterly, and yearly temporal hierarchies performed most accurately, exceeding the benchmark models. With three attention heads and a forecasting horizon of two years, the model achieved an MSE of 1.741, an sMAPE of 0.296%, and an RMSE of 1.319. Under the same forecasting parameters, the second-highest-performing model is the forecast combining the operational and tactical levels, i.e., monthly and quarterly data. This model achieved an MSE of 1.885, an sMAPE of 0.329%, and an RMSE of 1.373.
590    $a School code: 0792.
650  4 $a Systems science. $3 3168411
650  4 $a Industrial engineering. $3 526216
653    $a Decision-making
653    $a Temporal hierarchies
653    $a Time-series
653    $a Transformer model
690    $a 0790
690    $a 0800
690    $a 0546
710 2  $a State University of New York at Binghamton. $b Systems Science Industrial Engineering. $3 2104041
773 0  $t Masters Abstracts International $g 85-12.
790    $a 0792
791    $a M.S.
792    $a 2024
793    $a English
856 40 $u https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=31244375
Location:
ALL (Electronic resources)

Items:
Inventory Number: W9510258
Location Name: Electronic resources
Item Class: 11. Online Reading_V
Material Type: E-book
Call Number: EB
Usage Class: Normal
Loan Status: On shelf
No. of Reservations: 0