Exploring the Scalability of Deep Learning on GPU Clusters.
Record Type:
Electronic resources : Monograph/item
Title/Author:
Exploring the Scalability of Deep Learning on GPU Clusters.
Author:
Williams, Taylor.
Published:
Ann Arbor : ProQuest Dissertations & Theses, 2019
Description:
132 p.
Notes:
Source: Masters Abstracts International, Volume: 80-07.
Contained By:
Masters Abstracts International, 80-07.
Subject:
Computer science.
Online resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=13426089
ISBN:
9780438802285
Thesis (M.S.)--Trent University (Canada), 2019.
This item must not be added to any third party search indexes.
In recent years, we have observed an unprecedented rise in popularity of AI-powered systems. They have become ubiquitous in modern life, being used by countless people every day. Many of these AI systems are powered, entirely or partially, by deep learning models. From language translation to image recognition, deep learning models are being used to build systems with unprecedented accuracy. The primary downside is the significant time required to train the models. Fortunately, the time needed for training the models is reduced through the use of GPUs rather than CPUs. However, with model complexity ever increasing, training times even with GPUs are on the rise. One possible solution to ever-increasing training times is to use parallelization to enable the distributed training of models on GPU clusters. This thesis investigates how to utilise clusters of GPU-accelerated nodes to achieve the best scalability possible, thus minimising model training times.
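The abstract describes distributed training on GPU clusters only in general terms; the record does not say which framework or parallelization strategy the author used. Purely as an illustrative sketch of the kind of data-parallel training the abstract alludes to, the snippet below uses PyTorch's DistributedDataParallel with a toy model and synthetic data. Every model, dataset, and hyperparameter choice here is an assumption for demonstration, not a detail taken from the thesis.

# Illustrative sketch only: synchronous data-parallel training across GPU nodes.
# The thesis does not name a framework; PyTorch DistributedDataParallel is
# assumed here purely for demonstration. Launch with, for example:
#   torchrun --nnodes=2 --nproc_per_node=4 train_ddp.py
import os

import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler


def main():
    # torchrun sets RANK, LOCAL_RANK, and WORLD_SIZE for every process.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Toy model and synthetic data; stand-ins for a real network and dataset.
    model = nn.Sequential(nn.Linear(1024, 512), nn.ReLU(), nn.Linear(512, 10)).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    data = TensorDataset(torch.randn(8192, 1024), torch.randint(0, 10, (8192,)))
    sampler = DistributedSampler(data)  # each rank sees a disjoint shard of the data
    loader = DataLoader(data, batch_size=64, sampler=sampler)

    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(2):
        sampler.set_epoch(epoch)  # reshuffle shards each epoch
        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            opt.zero_grad()
            loss_fn(model(x), y).backward()  # gradients are all-reduced across ranks
            opt.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()

Run this way, each process trains on its own shard of the data and gradients are synchronized after every backward pass; how efficiently that synchronization scales as GPUs and nodes are added is the question the thesis investigates.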
LDR      02056nmm a2200325 4500
001      2264235
005      20200423112920.5
008      220629s2019 ||||||||||||||||| ||eng d
020      $a 9780438802285
035      $a (MiAaPQ)AAI13426089
035      $a (MiAaPQ)trentu:10633
035      $a AAI13426089
040      $a MiAaPQ $c MiAaPQ
100 1    $a Williams, Taylor. $3 3280780
245 1 0  $a Exploring the Scalability of Deep Learning on GPU Clusters.
260 1    $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2019
300      $a 132 p.
500      $a Source: Masters Abstracts International, Volume: 80-07.
500      $a Publisher info.: Dissertation/Thesis.
500      $a Advisor: McConnell, Sabine.
502      $a Thesis (M.S.)--Trent University (Canada), 2019.
506      $a This item must not be added to any third party search indexes.
506      $a This item must not be sold to any third party vendors.
520      $a In recent years, we have observed an unprecedented rise in popularity of AI-powered systems. They have become ubiquitous in modern life, being used by countless people every day. Many of these AI systems are powered, entirely or partially, by deep learning models. From language translation to image recognition, deep learning models are being used to build systems with unprecedented accuracy. The primary downside is the significant time required to train the models. Fortunately, the time needed for training the models is reduced through the use of GPUs rather than CPUs. However, with model complexity ever increasing, training times even with GPUs are on the rise. One possible solution to ever-increasing training times is to use parallelization to enable the distributed training of models on GPU clusters. This thesis investigates how to utilise clusters of GPU-accelerated nodes to achieve the best scalability possible, thus minimising model training times.
590      $a School code: 0513.
650  4   $a Computer science. $3 523869
690      $a 0984
710 2    $a Trent University (Canada). $b Applied Modeling and Quantitative Methods. $3 2101153
773 0    $t Masters Abstracts International $g 80-07.
790      $a 0513
791      $a M.S.
792      $a 2019
793      $a English
856 4 0  $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=13426089
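For readers who want to work with the MARC record above programmatically rather than reading the flat display, a minimal sketch using the pymarc library follows. The file name record.mrc is hypothetical and assumes the record has been exported in binary MARC (ISO 2709) format; the catalogue itself does not specify how its exports are meant to be consumed.

# Minimal sketch: reading the exported record with pymarc (an assumed tool,
# not one named by the catalogue). Expects a hypothetical binary MARC export
# named record.mrc in the working directory.
from pymarc import MARCReader


def first_subfield(record, tag, code):
    # Return the first $code value of the first matching field, or None.
    for field in record.get_fields(tag):
        values = field.get_subfields(code)
        if values:
            return values[0]
    return None


with open("record.mrc", "rb") as fh:
    for record in MARCReader(fh):
        print("Title :", first_subfield(record, "245", "a"))
        print("Author:", first_subfield(record, "100", "a"))
        print("ISBN  :", first_subfield(record, "020", "a"))
        print("Link  :", first_subfield(record, "856", "u"))
        # Repeatable fields, such as the 500 notes, come back as a list.
        for note in record.get_fields("500"):
            print("Note  :", note.get_subfields("a")[0])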
Items
Inventory Number: W9416469
Location Name: Electronic resources
Item Class: 11. Online reading_V
Material type: E-book
Call number: EB
Usage Class: General use (Normal)
Loan Status: On shelf
No. of reservations: 0