Dynamic Sparsity for Efficient Machine Learning.
Record Type:
Electronic resources : Monograph/item
Title/Author:
Dynamic Sparsity for Efficient Machine Learning. / Liu, Zichang.
Author:
Liu, Zichang.
Published:
Ann Arbor : ProQuest Dissertations & Theses, 2024.
Description:
145 p.
Notes:
Source: Dissertations Abstracts International, Volume: 86-02, Section: B.
Contained By:
Dissertations Abstracts International, 86-02B.
Subject:
Computer science.
Online resource:
https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=31532859
ISBN:
9798383417386
LDR  02889nmm a2200397 4500
001  2402282
005  20241028051517.5
006  m o d
007  cr#unu||||||||
008  251215s2024 ||||||||||||||||| ||eng d
020    $a 9798383417386
035    $a (MiAaPQ)AAI31532859
035    $a (MiAaPQ)0187rice5035Liu
035    $a AAI31532859
035    $a 2402282
040    $a MiAaPQ $c MiAaPQ
100 1  $a Liu, Zichang. $3 3772504
245 10 $a Dynamic Sparsity for Efficient Machine Learning.
260  1 $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2024
300    $a 145 p.
500    $a Source: Dissertations Abstracts International, Volume: 86-02, Section: B.
500    $a Advisor: Shrivastava, Anshumali.
502    $a Thesis (Ph.D.)--Rice University, 2024.
520    $a Over the past decades, machine learning (ML) models have delivered remarkable accomplishments in various applications. For example, large language models have ushered in a new wave of excitement in artificial intelligence. Interestingly, these accomplishments also unveil a scaling law in machine learning: larger models, equipped with more parameters and trained on more extensive datasets, often significantly outperform their smaller counterparts. However, the trend of increasing model size inevitably introduces unprecedented computational resource requirements, creating substantial challenges in model training and deployment. This thesis aims to improve the efficiency of ML models through algorithmic advancements. Specifically, we exploit the dynamic sparsity pattern inside ML models to achieve efficiency goals. Dynamic sparsity refers to the subset of parameters or activations that are important for a given input, and different inputs may have different dynamic sparsity patterns. We advocate identifying the dynamic sparsity pattern for each input and focusing computation and memory resources on it. The first part of this thesis centers on the inference stage. We verify the existence of dynamic sparsity in trained ML models, namely within the classification layer, attention mechanism, and transformer layers of trained models. Further, we demonstrate that such dynamic sparsity can be cheaply predicted and leveraged for each input to improve inference efficiency. The subsequent part of the dissertation shifts its focus to the training stage, where dynamic sparsity emerges as a tool to mitigate catastrophic forgetting or data heterogeneity in federated learning and improve training efficiency.
590    $a School code: 0187.
650  4 $a Computer science. $3 523869
650  4 $a Systems science. $3 3168411
653    $a Machine learning
653    $a Large language model
653    $a Sparsity
653    $a ML models
690    $a 0984
690    $a 0800
690    $a 0790
710 2  $a Rice University. $b Computer Science. $3 3345731
773 0  $t Dissertations Abstracts International $g 86-02B.
790    $a 0187
791    $a Ph.D.
792    $a 2024
793    $a English
856 40 $u https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=31532859
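The abstract in the 520 field above describes identifying, per input, the small subset of parameters or activations that actually matter (for example within the attention mechanism) and spending computation only on that subset. As a purely illustrative sketch, not taken from the dissertation, the Python snippet below shows one simple instance of that idea for a single attention query: score every key, keep only the top-k highest-scoring keys for this particular input, and run the softmax and value mixing over that input-dependent subset. The function name topk_sparse_attention and the budget parameter k are hypothetical.

import numpy as np

def topk_sparse_attention(q, K, V, k=8):
    """Attend from one query vector q to only the k keys it scores highest.

    q: (d,) query vector; K, V: (n, d) key and value matrices;
    k: per-input sparsity budget (hypothetical parameter).
    """
    scores = K @ q / np.sqrt(q.shape[0])   # similarity of this query to every key, shape (n,)
    keep = np.argsort(scores)[-k:]         # indices of the top-k keys for *this* input
    kept = scores[keep]
    weights = np.exp(kept - kept.max())
    weights /= weights.sum()               # softmax restricted to the kept keys
    return weights @ V[keep], keep         # output mixes only k of the n value rows

rng = np.random.default_rng(0)
d, n = 16, 128
q = rng.standard_normal(d)
K = rng.standard_normal((n, d))
V = rng.standard_normal((n, d))
out, kept = topk_sparse_attention(q, K, V, k=8)
print(out.shape, sorted(int(i) for i in kept))   # (16,) and the 8 key positions used for this query

With k much smaller than n, the value mixing reads k rows instead of n; the dissertation's inference-stage contributions concern predicting such subsets cheaply, which this toy example does not attempt.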
Items (1 record)
Inventory Number: W9510602
Location Name: 電子資源 (Electronic resources)
Item Class: 11.線上閱覽_V (11. Online reading_V)
Material type: 電子書 (E-book)
Call number: EB
Usage Class: 一般使用 (Normal)
Loan Status: On shelf
No. of reservations: 0
Opac note:
Attachments: